
Hi Hugh,
On Thu, Jun 4, 2020 at 02:15, Hugh Fisher <hugo.fisher@gmail.com> wrote:
> If you can provide backward compatibility, the obvious question for a C extension developer is WHY ARE YOU CHANGING THE API IN THE FIRST PLACE?
It's explained in my draft PEP: https://github.com/vstinner/misc/blob/master/cpython/pep-opaque-c-api.rst
I'm working on the PEP to get it accepted :-) Yeah, it would help to be able to mention a PEP number rather than BPO issues.
> Now I happen to follow the Python dev and CAPI mailing lists, so yes I do know why you want to change the API, you don't have to explain it to me. But I can imagine this being interpreted as the all too common "churn" rather than actual improvement, breaking people's code just because some dev wants to change stuff.
I'm not sure where the best place is to communicate the rationale behind these changes.
Currently, the "C API Changes: Porting to Python 3.10" section of What's New in Python 3.10 contains links to the bpo issue which introduced the change: https://github.com/python/cpython/blob/master/Doc/whatsnew/3.10.rst#porting-...
For example, the new Py_SET_xxx() macros are linked to https://bugs.python.org/issue39573; the first message of that issue explains the rationale for this incompatible change.
Note: my expectation is that only a few C extensions use Py_TYPE() or Py_SIZE() as an l-value. If it turns out that these changes break too many C extension modules in the wild during the Python 3.10 dev cycle, we can restrict the change to the limited C API. I did something similar in Python 3.9: we reverted incompatible changes which broke too many third party projects. Sadly, there is no tool yet to estimate how many third party projects are broken by an incompatible change.
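To make the incompatible pattern concrete, here is a minimal C sketch (the fixup_object() helper is made up, only to illustrate the port) of the l-value usage that stops compiling and its Py_SET_xxx() replacement available since Python 3.9:

    #include <Python.h>

    /* fixup_object() is a hypothetical helper, shown only for illustration. */
    static void
    fixup_object(PyVarObject *op, PyTypeObject *new_type, Py_ssize_t new_size)
    {
    #if 0
        /* Old pattern, common in C extensions; it stops compiling once
           Py_TYPE() and Py_SIZE() can no longer be used as l-values: */
        Py_TYPE(op) = new_type;
        Py_SIZE(op) = new_size;
    #endif
        /* Replacement, available since Python 3.9: */
        Py_SET_TYPE(op, new_type);
        Py_SET_SIZE(op, new_size);
    }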
At Red Hat, we rebuild the whole Fedora operating system with newer Python versions, first the alpha and then the beta releases. We discover packages broken by the newer Python, and this feedback helps me make smarter decisions about incompatible changes.
I also have a toy project, https://github.com/vstinner/pythonci, to test a very small number of projects (4 projects: coverage, cython, jinja, numpy).
> My suggestion is that for the first, or first few, versions of Python with the new API, this backwards compatibility header and any functions be shipped as part of the standard library. Extension developers can either update their code to the new API, or add a new line #include <PythonCompat.h> and everything works but they get a nag message about needing to update.
I don't see how it would be possible to emit a deprecation warning when Py_TYPE() is used as an l-value, but no warning when it's used as an r-value: warn on "Py_TYPE(obj) = type;" but stay silent on "type = Py_TYPE(obj);". This change is really a corner case.
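For context: up to Python 3.8, Py_TYPE() is a plain macro expanding to a struct member access, so the read and the write forms expand to the same thing and there is no hook where a warning could fire only for the assignment (the example() function below is hypothetical, and the assignment only compiles while Py_TYPE() is still a macro, i.e. before the change discussed in this thread):

    #include <Python.h>

    /* In Python 3.8 and older, Include/object.h defines:
     *     #define Py_TYPE(ob) (((PyObject *)(ob))->ob_type)
     */
    static void
    example(PyObject *obj, PyTypeObject *type)
    {
        Py_TYPE(obj) = type;    /* write: the usage we would like to flag */
        type = Py_TYPE(obj);    /* read: must stay warning-free */
        (void)type;
    }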
I added Py_SET_xxx() functions in Python 3.9 on purpose and only introduced the backward incompatible change in Python 3.10. So Python 3.9 is the transition release.
The problem is how to get the Py_SET_xxx() functions on Python 3.8 and older.
If Python 3.10 ships a new PythonCompat.h, it will only be available on Python 3.10, not on Python 3.9 and older. Also, I would like to update PythonCompat.h frequently, whereas old Python versions like Python 3.6 only get security fixes. So I would prefer to distribute it separately.
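A separately distributed header could be as small as the sketch below; the file name follows Hugh's PythonCompat.h, the helper names are made up, and the version guard assumes the Py_SET_xxx() functions appeared in Python 3.9.0a4 (PY_VERSION_HEX 0x030900A4):

    /* PythonCompat.h: backfill Py_SET_TYPE() and Py_SET_SIZE() on Python 3.8
       and older, where the structure members can still be set directly. */
    #ifndef PYTHON_COMPAT_H
    #define PYTHON_COMPAT_H

    #include <Python.h>

    #if PY_VERSION_HEX < 0x030900A4 && !defined(Py_SET_TYPE)
    static inline void _PyCompat_SetType(PyObject *ob, PyTypeObject *type)
    { ob->ob_type = type; }
    #  define Py_SET_TYPE(ob, type) _PyCompat_SetType((PyObject*)(ob), (type))
    #endif

    #if PY_VERSION_HEX < 0x030900A4 && !defined(Py_SET_SIZE)
    static inline void _PyCompat_SetSize(PyVarObject *ob, Py_ssize_t size)
    { ob->ob_size = size; }
    #  define Py_SET_SIZE(ob, size) _PyCompat_SetSize((PyVarObject*)(ob), (size))
    #endif

    #endif /* PYTHON_COMPAT_H */

An extension which includes such a header can write Py_SET_TYPE()/Py_SET_SIZE() everywhere: it gets the native functions on Python 3.9 and newer, and the shim on older versions.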
Victor
Night gathers, and now my watch begins. It shall not end until my death.