Py_TYPE() and Py_SIZE() become static inline functions
Hi,
I converted the Py_TYPE() and Py_SIZE() macros to static inline functions for the upcoming Python 3.11. It's a backward-incompatible change: for example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
You can use the upgrade_pythoncapi.py script of my pythoncapi_compat project, which makes these changes for you: you just have to copy pythoncapi_compat.h into your project. This header file provides the new C API functions, like Py_NewRef() and Py_SET_TYPE(), on old Python versions (Python 2.7 to 3.11).
=> https://github.com/pythoncapi/pythoncapi_compat
I already converted the Py_TYPE() and Py_SIZE() macros in Python 3.10, but it broke too many C extensions, so I had to revert the change. In the meantime, I updated many C extensions and created the pythoncapi_compat project. For example, Cython and numpy have been updated to use Py_SET_TYPE() and Py_SET_SIZE(). The Mercurial and immutables projects now use pythoncapi_compat.
I'm interested in feedback on my pythoncapi_compat project ;-)
Tell me if you need help updating your project for the Python 3.11 C API changes: https://docs.python.org/dev/whatsnew/3.11.html#c-api-changes
Victor
Night gathers, and now my watch begins. It shall not end until my death.
Hi Victor,
I believe that such breaking changes need to follow the standard PEP 387 procedure.
While your pythoncapi_compat project is nice, I don't think C extension writers will appreciate breaking changes in new releases without the two minor release announcement and warning phase.
This is especially important with the new shorter release cycle.
For the C API, a compiler warning should probably be issued instead of a Python warning.
Cheers.
capi-sig mailing list -- capi-sig@python.org To unsubscribe send an email to capi-sig-leave@python.org https://mail.python.org/mailman3/lists/capi-sig.python.org/ Member address: mal@egenix.com
-- Marc-Andre Lemburg eGenix.com
Professional Python Services directly from the Experts (#1, Sep 08 2021)
Python Projects, Coaching and Support ... https://www.egenix.com/ Python Product Development ... https://consulting.egenix.com/
::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 https://www.egenix.com/company/contact/ https://www.malemburg.com/
Hi,
On Wed, Sep 8, 2021 at 2:10 PM Marc-Andre Lemburg <mal@egenix.com> wrote:
I believe that such breaking changes need to follow the standard PEP 387 procedure.
While your pythoncapi_compat project is nice, I don't think C extension writers will appreciate breaking changes in new releases without the two minor release announcement and warning phase.
This is especially important with the new shorter release cycle.
Emitting compiler warnings was discussed multiple times, but so far nobody has managed to write a macro which emits a warning only when it's used as an l-value, as in "Py_TYPE(...) = ...".
Py_REFCNT() macro was already converted to a static inline function in Python 3.10. This change went fine. It only broke like 1 or 2 projects.
By the way, pip hides all compiler output and most developers ignore compiler warnings.
Victor
On 08.09.2021 14:40, Victor Stinner wrote:
Emitting compiler warnings was discussed multiple times, but so far nobody has managed to write a macro which emits a warning only when it's used as an l-value, as in "Py_TYPE(...) = ...".
Yes, that's a difficult one, indeed.
Py_REFCNT() macro was already converted to a static inline function in Python 3.10. This change went fine. It only broke like 1 or 2 projects.
As you mentioned yourself, the Py_TYPE() change does break quite a few projects.
FWIW: I also believe that we should stick to PEP 387 and not override it for C API changes. The C extension universe is what makes Python so attractive, so special care must be taken.
By the way, pip hides all compiler output and most developers ignore compiler warnings.
During development the warnings are visible and that's the audience we want to reach. Not the users installing extensions via pip.
There's nothing much we can do if developers ignore those warnings, but at least we have warned them :-)
On 08. 09. 21 14:40, Victor Stinner wrote:
Hi,
I converted the Py_TYPE() and Py_SIZE() macros to static inline functions for the upcoming Python 3.11. It's a backward-incompatible change: for example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
Why is this needed?
On Wed, Sep 8, 2021 at 2:10 PM Marc-Andre Lemburg <mal@egenix.com> wrote:
I believe that such breaking changes need to follow the standard PEP 387 procedure.
While your pythoncapi_compat project is nice, I don't think C extension writers will appreciate breaking changes in new releases without the two minor release announcement and warning phase.
This is especially important with the new shorter release cycle.
Emitting compiler warnings was discussed multiple times, but so far nobody has managed to write a macro which emits a warning only when it's used as an l-value, as in "Py_TYPE(...) = ...".
So, since the steps from PEP 387 can't be followed, this should get an exception from the SC.
Py_REFCNT() macro was already converted to a static inline function in Python 3.10. This change went fine. It only broke like 1 or 2 projects.
By the way, pip hides all compiler output and most developers ignore compiler warnings.
If that's an issue, the instructions in PEP 387 are not adequate and the PEP should be amended.
On Wed, Sep 8, 2021 at 3:49 PM Petr Viktorin <encukou@gmail.com> wrote:
I converted the Py_TYPE() and Py_SIZE() macros to static inline functions for the upcoming Python 3.11. It's a backward-incompatible change: for example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
Why is this needed?
Read the issue for the rationale: https://bugs.python.org/issue39573#msg361513
See also: https://www.python.org/dev/peps/pep-0620/
Victor
On 08. 09. 21 17:21, Victor Stinner wrote:
On Wed, Sep 8, 2021 at 3:49 PM Petr Viktorin <encukou@gmail.com> wrote:
I converted the Py_TYPE() and Py_SIZE() macros to static inline functions for the upcoming Python 3.11. It's a backward-incompatible change: for example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
Why is this needed?
Read the issue for the rationale: https://bugs.python.org/issue39573#msg361513
So, this is still only needed for experiments like tagged pointers or specialized lists.
I don't think it's needed to change API for these experiments. You need to convert any extensions that you use in benchmarking, and build them with an experimental Python. But I think we should only force the change on users if the experiment turns out successful.
It also looks useful for other APIs like HPy, and the situation is similar:
- projects that want to support the other API can change
- proponents of the other API can submit PRs for projects they need
To get those PRs merged, it would help if we officially documented that assigning to Py_TYPE/Py_SIZE is discouraged. I'm all for that. But I don't see how the cost of disallowing it entirely, and forcing all users to adapt *now*, is justified by the benefits.
I see it's been a draft for 2 years; are you still planning to actually submit it?
On Wed, Sep 8, 2021 at 9:02 PM Petr Viktorin <encukou@gmail.com> wrote:
So, this is still only needed for experiments like tagged pointers or specialized lists.
There is also some value to CPython itself in making (especially) the PyType memory structure opaque, which is that it allows CPython itself to evolve that structure.
Following a deprecation process makes sense, but the likely disruption is quite small, the fix for libraries is easy, and the value to CPython itself is real. Waiting to start the deprecation process until some project shows massive benefit just makes it harder for those projects to get off the ground and delays the day when they might be fully adopted.
On 9/9/21 12:32 am, Simon Cross wrote:
There is also some value to CPython itself in making (especially) the PyType memory structure opaque, which is that it allows CPython itself to evolve that structure.
How does this change move the needle on that goal? The macros become static inline functions defined in the C headers. Nothing changes in the C struct itself. The fields still need to be accessible, whether the access is via a macro or via a static inline function: the goal is that both compile to a (ptr + offset). This enables better compiler error messages, but it does not move us any closer to turning field access into the function call that a truly opaque memory structure would require. Changing to a function call would have performance implications that would probably prevent its adoption. Or maybe I misunderstand what you mean by an opaque memory structure?
Matti
On Wed, Sep 8, 2021 at 11:50 PM Matti Picus <matti.picus@gmail.com> wrote:
How does this change move the needle on that goal? The macros become static inline functions defined in the C headers. Nothing changes in the C struct itself.
Agreed. My point is that at the level of the C code itself there *is* a function Py_SET_TYPE(obj, type) that needs to be called and one can't just write "Py_TYPE(obj) = type" which effectively limits how setting a type works to a very particular implementation.
Yes, this change by itself doesn't make PyType completely opaque, but that is the direction that Victor is trying to head in I believe.
Hi Simon and Matti,
Making all C structures of Python opaque is a long term goal and it cannot be done in a single step.
Converting macros to static inline functions prepares the C API for opaque function calls, like: PyTypeObject* Py_TYPE(PyObject *op). In CPython, we can likely keep static inline functions (for best performance).
The advantage of using static inline functions is that other Python implementations have more freedom in how they implement Py_TYPE() and Py_SET_TYPE(). They no longer have to implement Python objects exactly the same way as CPython does.
Currently, it's simply impossible to implement the Python C API without copying CPython's PyObject structure.
For me, the short-term goal is to put abstractions on all ways to access Python objects: no longer access structure members directly, but only go through macros, static inline functions, or functions.
In Python 3.9, I added the PyFrame_GetBack() getter function to abstract access to PyFrameObject.f_back. In Python 3.11, if you don't use this function, your C extension will simply break, since the PyFrameObject.f_back member has been removed. This removal is motivated by optimizing CPython.
We cannot evolve Python (fix bugs, optimize it) without this annoying work of fixing the C API: add more abstractions on top of C structures.
Victor
On 9/9/21 1:29 pm, Victor Stinner wrote:
Converting macros to static inline functions prepares the C API for opaque function calls, like: PyTypeObject* Py_TYPE(PyObject *op). In CPython, we can likely keep static inline functions (for best performance).
The advantage of using static inline functions is that other Python implementations have more freedom on how they implement Py_TYPE() and Py_SET_TYPE(). They don't have to implement Python objects exactly the same way than CPython does anymore.
Currently, it's simply impossible to implement the Python C API without copying CPython PyObject structure.
As long as these are static inline functions in CPython, PyPy will also implement these as static inline functions. Any time we have deviated from CPython without having a carefully designed interface (like HPy provides), we see that sooner or later it causes problems: either compatibility or performance suffers. If there are reasons for CPython to do this kind of refactoring, that is great, but in my opinion doing this neither helps nor hurts (other than more code churn) PyPy.
Matti
On Thu, Sep 9, 2021 at 12:30 PM Victor Stinner <vstinner@python.org> wrote:
We cannot evolve Python (fix bugs, optimize it) without this annoying work of fixing the C API: add more abstractions on top of C structures.
I agree with this and I think that it's the core of the problem. Currently, the C API leaks too many implementation details and makes it almost impossible to change anything in CPython.
The first thing to acknowledge is that CPython is stuck with design decisions which were taken in 1990 and that cannot be changed without breaking (some) backwards compatibility. That's a fact.
Another fact is that there is no single breaking change and/or abstraction which by itself enables big refactorings/experiments/optimizations, because many of them have multiple blockers. So if you take any breaking change on its own, it's never worthwhile because it breaks compatibility for no immediate gain. On the other hand, every such small change decreases a tiny bit of the technical debt which has been accumulated over the years.
IMHO, the correct question that the core CPython developers need to ask themselves is how much technical debt they are willing to carry over forever for the sake of backwards compatibility at all costs. I think that the tone of my email makes it clear what my personal opinion is :), but again this is something which should be decided by CPython developers.
On Thu, Sep 9, 2021 at 12:30 PM Victor Stinner <vstinner@python.org> wrote:
We cannot evolve Python (fix bugs, optimize it) without this annoying work of fixing the C API: add more abstractions on top of C structures.
I agree with that, but this argument works for *any* breaking change. For each change there should be an independent cost/benefit analysis, and this argument isn't helping there. I don't think "evolve Python" is specific enough to be a goal.
On 09. 09. 21 16:18, Antonio Cuni wrote:
I agree with this and I think that it's the core of the problem. Currently, the C API leaks too many implementation details and makes it almost impossible to change anything in CPython.
The first thing to acknowledge is that CPython is stuck with design decisions which were taken in 1990 and that cannot be changed without breaking (some) backwards compatibility. That's a fact.
Another fact is that there is no single breaking change and/or abstraction which by itself enables big refactorings/experiments/optimizations, because many of them have multiple blockers. So if you take any breaking change on its own, it's never worthwhile because it breaks compatibility for no immediate gain. On the other hand, every such small change decreases a tiny bit of the technical debt which has been accumulated over the years.
And on the third hand, every small change is disruptive. Perhaps in ways we don't hear about: people stick to old versions, abandon their projects, or switch to other languages.
IMHO, the correct question that the core CPython developers need to ask themselves is how much technical debt they are willing to carry over forever for the sake of backwards compatibility at all costs. I think that the tone of my email makes it clear what my personal opinion is :), but again this is something which should be decided by CPython developers.
It would also be fair if we could ask the projects that use the API, and see if the proposed improvements are actually worth the churn for them. For this particular change, I haven't actually seen the proposed improvements -- just vague ideas. Why should vague ideas justify the maintenance burden for existing, working projects?
Since this ML is not really on the radar of many extension writers, how about creating a C API change announcement list where breaking changes are posted or, even better:
Analyze the top PyPI packages, identify the ones which would break and send the maintainers an email explaining the change.
(This would be a great SoC project, perhaps for next year.)
The posts could then direct package authors to this list for additional discussion.
On Thu, Sep 9, 2021 at 5:05 PM Marc-Andre Lemburg <mal@egenix.com> wrote:
Since this ML is not really on the radar of many extension writers, how about creating a C API change announcement list where breaking changes are posted or, even better:
Analyze the top PyPI packages, identify the ones which would break and send the maintainers an email explaining the change.
Sure, I did that whenever I could.
First, my conclusion: it's kind of sad that there is no "reliable" and "automated" way to rebuild *all* PyPI projects to count how many projects would be broken by an incompatible change, when working on a change. I do this work manually.
Because this task is manual, I'm more in favor of doing a quick estimate of the risk first, then push the change, and consider reverting if too many problems are discovered during the development phase.
I know that it's not perfect.
I also know that some people consider that the C API is perfect and that it must not be changed :-) But there are *reasons* to "fix" it :-)
--
I copied Inada-san's tool to download the top (5000) PyPI projects: https://github.com/vstinner/misc/blob/main/cpython/download_pypi_top.py
You can then use tools like "rg" to grep regex patterns directly in tarballs and ZIP archives. See Inada-san notes about that: https://github.com/methane/notes/tree/master/2020/wchar-cache
I usually use it as an estimation on how many projects will be impacted outside the PyPI most popular projects.
--
Moreover, at Red Hat, we are rebuilding the whole Fedora operating system (technically, we try to only rebuild Python packages) with newer Python during alpha and then beta releases.
We are usually the first ones to detect Python packages impacted by incompatible changes. That's how we decided to revert a few incompatible changes in Python 3.9 for example: https://lwn.net/Articles/811369/
That's also how we decided to revert the Py_TYPE() change in Python 3.10.
--
There is also the code search on GitHub, but it's not easy to skip the big number of copies of the CPython source code in results.
Victor
On Thu, Sep 9, 2021 at 5:06 PM Marc-Andre Lemburg <mal@egenix.com> wrote:
Analyze the top PyPI packages, identify the ones which would break and send the maintainers an email explaining the change.
For this particular issue, I have a repository which contains the C code of the top 4000 PyPI packages: https://github.com/hpyproject/top4000-pypi-packages
The repo is imperfect: to keep the size small it contains only the C code (thus you can't build the packages from there), I manually removed some duplicated/vendored/autogenerated files, and it's just a snapshot of a certain point in time. However, it is very useful in combination with grep to get some idea of how often a certain API and/or pattern is used in the real world.
On 09. 09. 21 23:36, Antonio Cuni wrote:
Sadly, "top 4000 PyPI packages" is very biased towards widely-used projects and the public projects -- ones least likely to have issues, because the community (of widely-used projects) or the author of the CPython change (for public projects) can submit fixes early.
It would be sad if CPython focused only on successful, public projects; but I fear that's where we'll end up if we use this heuristic when evaluating changes.
On Fri, Sep 10, 2021 at 9:55 AM Petr Viktorin <encukou@gmail.com> wrote:
Sadly, "top 4000 PyPI packages" is very biased towards widely-used projects and the public projects -- ones least likely to have issues, because the community (of widely-used projects) or the author of the CPython change (for public projects) can submit fixes early.
I agree it's biased, but it's the only reasonable heuristic that we can use.
It would be sad if CPython focused only on successful, public projects;
but I fear that's where we'll end up if we use this heuristic when evaluating changes.
We can use an N higher than 4000 to account also for public-but-less-successful ones. For non-public ones, honestly I am not even sure whether we should care: if they are private, it's very likely that they are used in a business context, where they surely can spend the money to fix them. I think that the ability to evolve CPython and generally improve the broader Python ecosystem is more important than allowing some companies to save some bucks, but that's just my own personal opinion, of course.
Antonio
On 10. 09. 21 10:48, Antonio Cuni wrote:
On Fri, Sep 10, 2021 at 9:55 AM Petr Viktorin <encukou@gmail.com> wrote:
Sadly, "top 4000 PyPI packages" is very biased towards widely-used projects and the public projects -- ones least likely to have issues, because the community (of widely-used projects) or the author of the CPython change (for public projects) can submit fixes early.
I agree it's biased, but it's the only reasonable heuristic that we can use.
It would be sad if CPython focused only on successful, public projects; but I fear that's where we'll end up if we use this heuristic when evaluating changes.
We can use an N higher than 4000 to account also for public-but-less-successful ones.
... And I forgot about public projects that aren't on PyPI. I'm biased towards Fedora, so I'll name GIMP plugins, Firefox build system, and (for C API specifically) Samba bindings, as examples.
For non public ones, honestly I am not even sure whether we should care: if they are private it's very likely that they are used in a business context, where they surely can spend the money to fix them.
I think I get where you're coming from, but I really don't think making Python more expensive to maintain is helpful.
- money/time can also be spent on rewriting things in another language, if that's more economic (and that works for a community of volunteers as well as for a company)
- eventually some human will still need to do the change, and question their path as a Python developer
- not all companies have money to spend. I don't know how you feel about cash-strapped startups or Python-friendly skunkworks teams in multinational companies. But anyway, if a company "surely can spend the money", IMO it would be better spent on improving the ecosystem in backwards-compatible ways.
I think that the ability of evolving CPython and generally improve the broader Python ecosystem is more important than allowing some companies to save some bucks, but that's just my own personal opinion, of course.
Sure, let's evolve and improve CPython. But can we make it in steps that are individually helpful?
(I work at Red Hat but the views in this mail are my own.)
On Mon, Sep 13, 2021 at 11:37 AM Petr Viktorin <encukou@gmail.com> wrote:
Sure, let's evolve and improve CPython. But can we make it in steps that are individually helpful?
I think that's the core of the divergence of opinions. I think we have reached a point where there is little improvement left that can be done by "small steps which are individually useful". I might be wrong of course.
One solution would be to:
- introduce Py_GetSizeInline() and Py_GetTypeInline() static inline functions (of course, feel free to prefer other names :-))
- deprecate Py_SIZE() and Py_TYPE() with a message guiding towards Py_GetSizeInline() and Py_GetTypeInline() respectively. To be clear, this is a compile-time deprecation message using pragmas or other compiler-specific means (see e.g. https://stackoverflow.com/questions/2681259/how-to-deprecate-a-c-pre-process...).
- in a couple versions, retire the current definitions of Py_SIZE and Py_TYPE and replace them with:

  #define Py_SIZE Py_GetSizeInline
  #define Py_TYPE Py_GetTypeInline
Regards
Antoine.
Hi Antoine,
*Getting* an object type and an object size using Py_TYPE() and Py_SIZE() remains valid and is still supported. There is no problem with that.
Python 3.11 breaks the few C extensions which use Py_TYPE() and Py_SIZE() to *set* an object type or an object size: Py_SET_TYPE() and Py_SET_SIZE() must be used.
Py_SET_SIZE() and Py_SET_TYPE() have existed for two Python releases (they were added in Python 3.9). The Python 3.10 documentation of Py_TYPE() already requires calling Py_SET_TYPE() to set an object type (same for Py_SIZE()): https://docs.python.org/3.10/c-api/structures.html#c.Py_TYPE
--
Deprecating Py_TYPE/Py_SIZE would affect way more C extensions, no? Requiring C extensions to replace Py_TYPE() with Py_GetTypeInline() and replacing Py_SIZE() with Py_GetSizeInline() would require more work and has no benefit, no? I'm not sure that I get your point. Is it about keeping Py_TYPE/Py_SIZE for a few more Python releases?
Victor
Le 13/09/2021 à 17:39, Victor Stinner a écrit :
Deprecating Py_TYPE/Py_SIZE would affect way more C extensions, no?
What do you mean "affect"? A deprecation warning does not break anything, but it signals the developer that something should perhaps be done.
Requiring C extensions to replace Py_TYPE() with Py_GetTypeInline() and replacing Py_SIZE() with Py_GetSizeInline() would require more work and has no benefit, no?
Please read what I proposed. There is no *requirement* to replace them explicitly, as the replacement will be done implicitly in a few Python releases. There is an explicit *warning* telling users that their current usage may be wrong if they use Py_TYPE or Py_SIZE with assignment.
Regards
Antoine.
I'm not sure that I get your point. Is it about keeping Py_TYPE/Py_SIZE for a few more Python releases?
Victor
On Mon, Sep 13, 2021 at 5:23 PM Antoine Pitrou <antoine@python.org> wrote:
One solution would be to:
introduce Py_GetSizeInline(), Py_GetTypeInline() static inline functions (of course, feel free to prefer other names :-))
deprecate Py_SIZE() and Py_TYPE() with a message guiding towards Py_GetSizeInline() and Py_GetTypeInline() respectively. To be clear, this is a compile-time deprecation message using pragmas or other compiler-specific means (see e.g. https://stackoverflow.com/questions/2681259/how-to-deprecate-a-c-pre-process...).
in a couple versions, retire the current definitions of Py_SIZE and Py_TYPE and replace them with:
#define Py_SIZE Py_GetSizeInline
#define Py_TYPE Py_GetTypeInline
Regards
Antoine.
Le 13/09/2021 à 16:52, Antonio Cuni a écrit :
On Mon, Sep 13, 2021 at 11:37 AM Petr Viktorin <encukou@gmail.com> wrote:
Sure, let's evolve and improve CPython. But can we make it in steps that are individually helpful?
I think that's the core of the divergences of opinions. I think we have reached a point where the amount of improvements which can be done by "small steps which are individually useful" is little. I might be wrong of course.
capi-sig mailing list -- capi-sig@python.org To unsubscribe send an email to capi-sig-leave@python.org https://mail.python.org/mailman3/lists/capi-sig.python.org/
On 09.09.2021 16:18, Antonio Cuni wrote:
On Thu, Sep 9, 2021 at 12:30 PM Victor Stinner <vstinner@python.org> wrote:
We cannot evolve Python (fix bugs, optimize it) without this annoying work of fixing the C API: adding more abstractions on top of C structures.
I agree with this and I think that it's the core of the problem. Currently, the C API leaks too many implementation details and makes it almost impossible to change anything in CPython.
The first thing to acknowledge is that CPython is stuck with design decisions which were taken in 1990 and that cannot be changed without breaking (some) backwards compatibility. That's a fact.
Another fact is that there is no single breaking change and/or abstraction which by itself enables big refactorings/experiments/optimizations, because many of them have multiple blockers. So if you take any breaking change on its own, it's never worthwhile because it breaks compatibility for no immediate gain. On the other hand, every such small change decreases a tiny bit of the technical debt which has been accumulated over the years.
IMHO, the correct question that the core CPython developers need to ask themselves is how much technical debt they are willing to carry over forever for the sake of backwards compatibility at all costs. I think that the tone of my email makes it clear what my personal opinion is :), but again this is something which should be decided by CPython developers.
I think there's overall agreement that the C API needs to be tweaked to make it easier to add optimizations to the internals. I also think there's little push back against the particular change which triggered the discussion - getters/setters do make things easier.
Where I see need for discussion is the strategy on how this should be applied. Some possibilities:
a) Many small changes throughout the next few years, each release with a certain smaller number of them
b) One larger change every few releases
c) One huge change for Python 4.0
The problem with C API changes is that extensions need to be adapted. If that has to be done for every single Python release, this causes a lot of churn. It may also cause issues for packages to support multiple Python versions (e.g. PyData packages typically support 3.7 - 3.9). Victor's package can help with the latter, but it doesn't solve the former issue.
This is something the SC should provide some guidance on, since the current approach (which is close to a)) makes life hard for the community.
I'm also missing an overarching strategy of where the C API should be heading from the SC in the coming years, so that everyone is on the same page.
I put the SC on CC to get this on their radar.
Related to all this, since it's come up often in these change discussions:
If we want to officially push for Cython as a way to provide a layer between the Python C API and Python C extensions, the PSF needs to coordinate some serious and sustained funding for that project. Otherwise, we end up with a single point of failure in our extension stack, which isn't even under SC control.
-- Marc-Andre Lemburg eGenix.com
Professional Python Services directly from the Experts (#1, Sep 09 2021)
Python Projects, Coaching and Support ... https://www.egenix.com/ Python Product Development ... https://consulting.egenix.com/
::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 https://www.egenix.com/company/contact/ https://www.malemburg.com/
On Thu, Sep 9, 2021 at 4:59 PM Marc-Andre Lemburg <mal@egenix.com> wrote:
The problem with C API changes is that extensions need to be adapted. If that has to be done for every single Python release, this causes a lot of churn.
Minor C API incompatible changes have already been made in Python 3.7, 3.8 and 3.9. I'm keeping an eye on the Python bug tracker, capi-sig and python-dev lists, and I only saw very few complaints. Usually, it's just one project, and I help them to fix their issue.
I reverted one or two C API changes (like Py_TYPE, reverted in 3.10) which caused too many issues. I always volunteer to help projects adapt to C API changes.
Only a *minority* of C extensions are impacted by these C API changes.
So far, only a few C extensions had to be adapted, and this work has been done. But maybe you are exaggerating the number of impacted projects?
It may also cause issues for packages to support multiple Python versions (e.g. PyData packages typically support 3.7 - 3.9). Victor's package can help with the latter, but it doesn't solve the former issue.
All C extensions that I saw support a large number of Python versions, usually Python 3.6-3.9, up to 2.7-3.11.
I'm not sure which technical issue you're talking about. If you want to get new C API functions on old Python versions, use the pythoncapi_compat.h that I talked about.
Cython, numpy and pybind11 chose to have their own compatibility layer. They have been doing that for many years; it's nothing new. Obviously, if you use Cython or pybind11, you don't have to handle differences between the Python versions yourself.
If we want to officially push for Cython as a way to provide a layer between the Python C API and Python C extensions, the PSF needs to coordinate some serious and sustained funding for that project. Otherwise, we end up with a single point of failure in our extension stack, which isn't even under SC control.
Cython is already recommended for many years directly in the official C API documentation, aside with "cffi, SWIG and Numba": https://docs.python.org/dev/extending/index.html?highlight=cython#recommende...
Maybe it should be advertised in more C API pages.
I agree that Cython lacks funding, but that's a different topic, no?
At least, I'm trying to prepare Cython for incompatible C API changes *before* pushing them into Python. Contributing such fixes is one way to help Cython. My team at Red Hat has helped fix multiple compatibility issues in Cython caused by changes in new Python versions.
Victor
Night gathers, and now my watch begins. It shall not end until my death.
On 08. 09. 21 17:21, Victor Stinner wrote:
On Wed, Sep 8, 2021 at 3:49 PM Petr Viktorin <encukou@gmail.com> wrote:
I converted Py_TYPE() and Py_SIZE() macros to static inline functions in the future Python 3.11. It's a backward incompatible change. For example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
Why is this needed?
Read the issue for the rationale: https://bugs.python.org/issue39573#msg361513
The reason I asked is that this formally needs SC exception to the backwards compatibility policy. I can apply for the exception if you don't want to, but I'd like to give some good arguments for it -- and while I agree with most of what's in these documents, I don't think making this particular change now is good.
To summarize how I feel:
- CPython and third-party projects can port to Py_SET_TYPE now.
- Experiments (like tagged pointers) can be tried with only the ported third-party projects.
- This is exactly the kind of mandatory code churn that makes Python look bad for users that wrote an extension and want it to work.
- I can see no downside to changing this later, when it brings actual benefit for users.
But, FWIW, I'd support making the change in the 3.11 limited API: if a third-party project changes the Py_LIMITED_API #define, they should be able to modernize their code as well. Same for Py_BUILD_CORE -- we can be stricter in code we control.
On Tue, Oct 5, 2021 at 12:40 PM Petr Viktorin <encukou@gmail.com> wrote:
The reason I asked is that this formally needs SC exception to the backwards compatibility policy. I can apply for the exception if you don't want to, but I'd like to give some good arguments for it -- and while I agree with most of what's in these documents, I don't think making this particular change now is good.
To summarize how I feel:
- CPython and third-party projects can port to Py_SET_TYPE now.
- Experiments (like tagged pointers) can be tried with only the ported third-party projects.
It's not about experimental things. It's about *preparing* the C API to completely abstract access to Python objects, so that Python implementations other than CPython no longer have to emulate the whole CPython implementation, as PyPy's cpyext does. Emulating the CPython implementation is inefficient.
CPython should also benefit from these changes. Today, CPython is stuck with design choices made 30 years ago. If the C API fully abstracts access to Python objects, as the HPy project does, it becomes possible to make significant optimizations in CPython (without losing support for 3rd party C extensions).
In short, this change is required to optimize CPython.
I gave more details about these problems in two PEPs:
https://www.python.org/dev/peps/pep-0620/ and https://mail.python.org/archives/list/python-dev@python.org/thread/RA7Q4JAUE...
Victor
On 5 Oct 2021, at 12:40, Petr Viktorin <encukou@gmail.com> wrote:
On 08. 09. 21 17:21, Victor Stinner wrote:
On Wed, Sep 8, 2021 at 3:49 PM Petr Viktorin <encukou@gmail.com> wrote:
I converted Py_TYPE() and Py_SIZE() macros to static inline functions in the future Python 3.11. It's a backward incompatible change. For example, "Py_TYPE(obj) = type;" must be replaced with "Py_SET_TYPE(obj, type);".
Why is this needed?
Read the issue for the rationale: https://bugs.python.org/issue39573#msg361513 See also: https://www.python.org/dev/peps/pep-0620/
The reason I asked is that this formally needs SC exception to the backwards compatibility policy. I can apply for the exception if you don't want to, but I'd like to give some good arguments for it -- and while I agree with most of what's in these documents, I don't think making this particular change now is good.
To summarize how I feel:
- CPython and third-party projects can port to Py_SET_TYPE now.
- Experiments (like tagged pointers) can be tried with only the ported third-party projects.
A problem with Py_SET_TYPE is that it has no way to report errors, which means it is not compatible with tagged pointers.
IMHO a future-proof API should always have a way to report errors (for example through a return value), even if the current CPython implementation will always succeed.
Btw. I don’t particularly mind minor changes like the introduction of Py_SET_TYPE, because adopting these can almost be done mechanically.
Larger API changes, like the (needed) deprecation and later removal of the old buffer API, can be more problematic because they can require significant work.
Ronald —
Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/
On Tue, Oct 5, 2021 at 2:26 PM Ronald Oussoren <ronaldoussoren@mac.com> wrote:
A problem with Py_SET_TYPE is that it has no way to report errors, which means it is not compatible with tagged pointers.
IMHO a future-proof API should always have a way to report errors (for example through a return value), even if the current CPython implementation will always succeed.
Btw. I don’t particularly mind minor changes like the introduction of Py_SET_TYPE, because adopting these can almost be done mechanically.
Py_SET_TYPE() is a weird workaround for limited C compilers (Windows MSC) which don't support referencing a base type using the "&PyLong_Type" syntax when defining a type statically in C (".tp_base = &PyLong_Type,").
If you only define heap types, it's not needed.
Py_SET_REFCNT() and Py_SET_TYPE() should not be part of the limited C API. I chose to add them anyway because it was possible to directly access PyObject.ob_refcnt and PyObject.ob_type in Python 3.9 and older. I'm not sure about Py_SET_SIZE() and the limited C API; again, I added it since it was already possible to access PyVarObject.ob_size before.
Victor
On 5 Oct 2021, at 17:07, Victor Stinner <vstinner@python.org> wrote:
On Tue, Oct 5, 2021 at 2:26 PM Ronald Oussoren <ronaldoussoren@mac.com> wrote:
A problem with Py_SET_TYPE is that it has no way to report errors, which means it is not compatible with tagged pointers.
IMHO a future-proof API should always have a way to report errors (for example through a return value), even if the current CPython implementation will always succeed.
Btw. I don’t particularly mind minor changes like the introduction of Py_SET_TYPE, because adopting these can almost be done mechanically.
Py_SET_TYPE() is weird workaround for limited C compilers (Windows MSC) which don't support referencing a base type using "&PyLong_Type" syntax when defining a type statically in C (".tp_base = &PyLong_Type,").
If you only define heap types, it's not needed.
Not quite ;-). I have some calls in PyObjC that dynamically change the type of an object. One changes the type of a Python proxy for an Objective-C object when the type of that object changes (which can happen in ObjC).
Another one looks a lot more dodgy as it changes the type of an instance of a builtin type, I’ve made a note to look into that. I guess that’s a clear argument in favour of the struct-hiding work you’ve been doing. You never know what strange things users do when they can poke directly into implementation details.
Py_SET_REFCNT() and Py_SET_TYPE() should not be part of the limited C API. I chose to add them anyway because it was possible to access directly PyObject.ob_refcnt and PyObject.ob_type in Python 3.9 and older. I'm not sure about Py_SET_SIZE() and the limited C API. Again, I added it since it was already possible to access PyVarObject.ob_size before.
I agree with adding those macros; I never particularly liked “Py_TYPE(obj) = value”. The new API looks cleaner to me, and adding APIs that lead to cleaner code, or are harder to misuse, is likely a net positive.
Ronald
Using Py_SET_TYPE() to change an object's class is not correct. You should set the __class__ attribute instead, using PyObject_SetAttr(). The implementation is way more complex!
Victor
On 5 Oct 2021, at 18:39, Victor Stinner <vstinner@python.org> wrote:
Using Py_SET_TYPE() to change a type base class is not correct. You should set the __class__ attribute instead using PyObject_SetAttr(). The implementation is way more complex!
I agree that in general you should call PyObject_SetAttr().
That said, I think I’m fine here, other than a change in 3.9 where instances of heap types now own a reference to their type. The scenario where I do this are very specific, and because of that none of the guards in object_set_class would trigger. I won’t bore you with the details.
This is code that has worked without problems for way over a decade. I might change PyObjC anyway, the code in question should not be performance critical and I’m slowly cleaning up code that’s too magical for its own good.
The other instance of Py_SET_TYPE in PyObjC is more problematic and depends too much on implementation details of CPython. Annoyingly that’s also a use where calling PyObject_SetAttr won’t work. Coming up with a fix for that will be interesting :-).
One of these days I should write up where PyObjC peeks and pokes into non-public APIs, and what functionality would help to avoid that. I’d love to end up in a future where I could use the stable ABI for PyObjC, although that won’t be easy.
Ronald
Hi,
I wrote an article about the C API changes related to the PyObject structure: https://vstinner.github.io/c-api-abstract-pyobject.html
It elaborates on the rationale for making these changes, including the incompatible C API changes. In short, the PyObject structure prevents optimizing Python.
Victor
FYI the Steering Council gave a PEP 387 exception for these changes in Python 3.11: https://github.com/python/steering-council/issues/79
Victor
On 29. 11. 21 10:12, Victor Stinner wrote:
FYI the Steering Council gave a PEP 387 exception for these changes in Python 3.11: https://github.com/python/steering-council/issues/79
Indeed, the Py_TYPE change is approved. I assume the same applies to Py_SIZE and Py_REFCNT.
The SC says that "in general the deprecation notice needs to stay for at least two releases in the documentation." I don't think there currently is one in the docs, so I wonder if that means that one should be added now and stay until 3.13, or if it should have been added in 3.9. Or does the "in general" mean this note is only for other changes like this? I'll ask for clarification if we don't find an answer on this list.
I also doubt there can be disruption in the community before the first beta, since all projects that test with alphas already tested this in the 3.10 alphas (and there was enough pushback to get it reverted then). But perhaps with better communication this point will be relevant.
participants (9)
- Antoine Pitrou
- Antonio Cuni
- johngillis984@gmail.com
- Marc-Andre Lemburg
- Matti Picus
- Petr Viktorin
- Ronald Oussoren
- Simon Cross
- Victor Stinner