Update on the limited C API

Hi,
I pushed a few changes to hide even more implementation details from the limited C API of CPython. See "Changes in the limited C API" documentation at:
https://docs.python.org/dev/whatsnew/3.9.html#build-and-c-api-changes
I'm interested in getting your feedback on these changes.
I know that the PyQt project uses the limited C API. Apart from that, I don't know of other C extensions using it.
One idea would be to use it directly in CPython for a bunch of C extensions. That would help check whether the API is complete enough for a non-trivial extension, and it should also help detect issues earlier. Currently, the only part of CPython that uses the limited C API is "xxlimited", which is basically a test module written to show how this API can be used.
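[Editor's note: for reference, a minimal sketch of what a limited-API extension looks like; the module name "spam" is just an example.]

#define Py_LIMITED_API 0x03060000   /* target the 3.6+ stable ABI */
#include <Python.h>

static PyObject *
spam_hello(PyObject *self, PyObject *args)
{
    return PyUnicode_FromString("hello from the stable ABI");
}

static PyMethodDef spam_methods[] = {
    {"hello", spam_hello, METH_NOARGS, "Return a greeting."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef spam_module = {
    PyModuleDef_HEAD_INIT, "spam", NULL, -1, spam_methods
};

PyMODINIT_FUNC
PyInit_spam(void)
{
    return PyModule_Create(&spam_module);
}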
By the way, I just opened the issue "Make PyObject an opaque structure in the limited C API":
https://bugs.python.org/issue39573
It's a follow-up to Neil Schemenauer's proof of concept of tagged pointers in CPython, which he wrote in September 2018.
Victor

On 2020-02-07 02:37, Victor Stinner wrote:
The limited API is the PEP 384 stable ABI. Breaking it breaks the explicit promise that its functions will be available in any later Python 3.x.
For example, one of your reasons says "these macros didn’t work with the limited API which cannot access PyThreadState.recursion_depth field." That's false: the macro was part of the limited API, so it needs to continue to work. Making PyThreadState fully opaque would be great, yes, but it can't be done in the limited API which included a public macro that touches the internals.
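[Editor's note: for context, before the change the macro in the 3.8 headers expanded to roughly the following (simplified), which is why it could not work with an opaque PyThreadState.]

#define Py_EnterRecursiveCall(where) \
    (_Py_MakeRecCheck(PyThreadState_GET()->recursion_depth) && \
     _Py_CheckRecursiveCall(where))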
Maybe it is time to start working on the 4.x stable ABI, and do breaking changes there? IMO it would make more sense to start early than to design the whole thing inside the 4.0 development period, anyway.
+1
Again, this is completely against the concept of a stable ABI. PEP 384 explicitly says PyObject's fields (ob_refcnt, ob_type) shall be accessible.

On 2020-02-11 10:39, Petr Viktorin wrote:
This sounds like a good idea to me. The current stable ABI was designed "in the dark" and exercised only very late, meaning there are probably lots of tiny decisions that now appear undesirable. Designing a new "stable ABI" would allow learning from those mistakes.
Regards
Antoine.

On Tue, Feb 11, 2020 at 3:16 AM Antoine Pitrou <antoine@python.org> wrote:
I agree that it's probably time to start looking at stable ABI 4. Maybe that's HPy? Either way, we seem to consistently have momentum on improving the stable ABI, so now might be the time to figure out what ABI4 should be and get people to start moving/prepping for it by, e.g., avoiding poking into structs in ABI3.

On Feb 27, 2020, at 10:35, Brett Cannon <brett@python.org> wrote:
Doubling down, maybe ABI 4 should be focused on reducing the need for extension authors to even write C code. IOW, a new API that would provide the services a Cython-like tool would consume. The vision would be that most extension authors could just write some type-hinted flavor of standard Python, and the tool would generate the C for the extension module.
-Barry

On 27Feb2020 1940, Barry Warsaw wrote:
Most extension authors also need to write C code to interact with C modules - speedups are the minority and native lib wrappers are the majority, last time I counted. So that "standard" Python would also have to support native calls, and ideally in a less complicated way than via ctypes :)
I would vote for being very anti-optimisation in any new stable ABI, though. Import, __getattr__ and __call__ are the only Python operations you need to use any Python object, or to provide to be usable by any Python object. If we focus on those as the actual "channel" between runtime and extension, we can move a lot of boilerplate statically into the extension, where it is then safe from future changes in the runtime.
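[Editor's note: a sketch of that "channel" using only existing public calls - import, attribute access, call; "obj" stands for some existing PyObject pointer and error handling is elided.]

/* import */
PyObject *mod = PyImport_ImportModule("json");
/* __getattr__ */
PyObject *dumps = PyObject_GetAttrString(mod, "dumps");
/* __call__ */
PyObject *result = PyObject_CallFunctionObjArgs(dumps, obj, NULL);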
I also think there's a possibility to reintroduce optimisations through optional native interfaces, similar to how QueryInterface works in COM (e.g. you could ask an object for its Mapping interface, and if it gives you one, use it directly - otherwise you __getattr__ for "__getitem__" and go from there -- and all of this can bundle up inside the static boilerplate that gets compiled into the extension).
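[Editor's note: a very rough sketch of that idea; PyInterface_Query, PyMappingInterface and PY_MAPPING_INTERFACE_V1 are hypothetical names, not existing CPython API.]

/* Hypothetical: ask the runtime for an optional fast-path interface. */
PyMappingInterface *map_if = PyInterface_Query(obj, PY_MAPPING_INTERFACE_V1);
PyObject *item;
if (map_if != NULL) {
    item = map_if->getitem(obj, key);                  /* direct, optimised path */
}
else {
    PyObject *getitem = PyObject_GetAttrString(obj, "__getitem__");
    item = PyObject_CallFunctionObjArgs(getitem, key, NULL);  /* generic path */
    Py_XDECREF(getitem);
}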
So you'd periodically want to recompile extensions to take advantage of any updates to the helper library, but fundamentally we've only got primitive-value-to-Python conversions, import, getattr and calls to worry about (not) breaking.
And you wouldn't get an automatic speed improvement in interop code just from compiling it, but I'm okay with that. If you can't get enough of a speedup in your actual native code to overcome the cost of interop, then you should stick with your already-working and far more portable Python code :)
Cheers, Steve

On Thu, Feb 27, 2020 at 11:59 AM Steve Dower <steve.dower@python.org> wrote:
I agree. There's a layering here of providing a C FFI that Cython and any such tool would compile down to. Plus, I think embedding scenarios mean we can't get away from providing some C FFI.
> I would vote for being very anti-optimisation in any new stable ABI, though.
I do think that we have all learned that simpler is better when it comes to a stable ABI and we would naturally trend towards that anyway. :)
But I think we all need to agree we want to work on a new stable ABI before we start trying to design it. ;)
-Brett

By removing the C API in the way you suggest, you lose the ability to extend Python itself natively.
The C API is not just about providing wrappers to other C libs, it's also about being able to add new functionality to Python itself, e.g. new data types, fast implementations of algorithms working directly on Python types, etc.
For wrappers there are a lot of different tools available, and as long as the interface is high-level, they are mostly fine. However, even when it comes to wrappers, there are low-level cases where the data transfer becomes the bottleneck, and Python would lose big time if we removed the more direct C interaction.
pyOpenSSL, for example, got more than two times slower when it switched to the cffi interface instead of going directly via the Python C API to the OpenSSL C API. And this is typical: FFI interfaces require a lot of unneeded copying of data in both directions, whereas the Python C API can work directly on the raw data.
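[Editor's note: to illustrate the zero-copy point, here is how a C extension can operate on a bytes-like object's raw memory via the buffer protocol, without copying; the function "checksum" is illustrative.]

static PyObject *
checksum(PyObject *self, PyObject *arg)
{
    Py_buffer view;
    if (PyObject_GetBuffer(arg, &view, PyBUF_SIMPLE) < 0)
        return NULL;
    /* Work directly on the raw data: no copy into FFI-land and back. */
    const unsigned char *data = (const unsigned char *)view.buf;
    unsigned char sum = 0;
    for (Py_ssize_t i = 0; i < view.len; i++)
        sum ^= data[i];
    PyBuffer_Release(&view);
    return PyLong_FromUnsignedLong(sum);
}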
For a new ABI design, it would be better to stay at the C level but make the APIs a little more consistent again, keeping in mind that some details may be better hidden away.
BTW: People seem to forget that Python does have a few levels of abstraction in the C API:
- high: https://docs.python.org/3/c-api/veryhigh.html
- medium: https://docs.python.org/3/c-api/abstract.html
- low: all the rest
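[Editor's note: for example, roughly analogous operations at each level; "items" stands for an existing Python list, and the low-level form trades checks for speed and is CPython-specific.]

/* very high level: run Python source */
PyRun_SimpleString("print(items[0])");

/* abstract level: works on any sequence, with full checks */
PyObject *first = PySequence_GetItem(items, 0);

/* low level: concrete list API, no checks, borrowed reference */
PyObject *first2 = PyList_GET_ITEM(items, 0);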
There are more dimensions to the Python API and ABI; the latter is mostly about linking and distributing binaries. I would argue that recompiling for a new Python version is not really all that hard anymore (it was in the days when Martin proposed the stable ABI, mostly because Windows compilers were not readily available), so the concept of a stable ABI is a bit outdated.
I wouldn't really put much effort into maintaining binary compatibility anymore. Portability and consistency of the API are much more important for maintenance.
-- Marc-Andre Lemburg, eGenix.com

On Thu, Feb 27, 2020 at 1:18 PM M.-A. Lemburg <mal@egenix.com> wrote:
Except I hear from people who find it burdensome to have to rebuild their wheels for every version of Python that they support, especially with a shift to an annual release cadence for Python. So while consistency and portability are definitely important, I don't think we should ignore binary compatibility either.

On 27.02.2020 22:38, Brett Cannon wrote:
I'm not suggesting we ignore them, but today they are much less of a problem - with all the CI/CD going on and MS compilers readily available, a lot of this can be automated.
I think it would be better to put up a service similar to https://www.piwheels.org/ for PyPI packages and a set of standard platforms to address concerns about having to recompile packages for new releases.
The PSF would certainly be able to fund a project like this. Ben pulled this off with the help of Dave and they did a pretty amazing job at working out many of the details for piwheels.
-- Marc-Andre Lemburg, eGenix.com

On 27Feb2020 2149, M.-A. Lemburg wrote:
The problem is that you also need to monitor, debug, and fix any packages that don't build properly. And it turns out, the best people to do this are the project maintainers. And they already have access to huge amounts of free CI time on GitHub, Azure Pipelines, Travis, etc. The better way is probably to offer to pay the projects directly if they have up-to-date and working wheels for "all" platforms.
I'm still very much in favour of having an easy-to-use and reliable stable ABI that saves projects from the trouble and also enables us to innovate on the runtime.
I'm very against anything that forces end-users to install a compiler. Based on the feedback I get about Windows today, this is far too big a burden.
Cheers, Steve

On 27/2/20 11:49 pm, M.-A. Lemburg wrote:
While this is off-topic, I cannot let the misconception that piwheels has solved all the problems go unanswered. There are endless problems caused by subtle changes in C interfaces, third-party libraries, and new Python versions. One example: the numerous issues numpy has received because the piwheels people did not package an OpenBLAS implementation but relied on the distribution one, which was not always there: https://github.com/numpy/numpy/issues?utf8=%E2%9C%93&q=is%3Aissue+raspberrypi. Automating the packaging story is definitely not a solved problem.
Matti

On 27.02.20 22:38, Brett Cannon wrote:
My experience (both personal and from what I hear) is that it's far more burdensome to add a new target platform (macOS, Windows, …) than to add a new Python version to an existing build.
Once a platform is supported, adding a new Python version is just adding a new version entry to some kind of build script, with a couple of fixes every couple of years when deprecated features (Python or C-API) change and/or go away.
But to add a new platform, you often have to learn and create a completely new build setup, and sometimes even add a new build farm to your release process, which can take anything from days to weeks to months to get running nicely. Plus additional maintenance from time to time that involves remote debugging issues that you cannot reproduce on your local development platform. And that multiplies if you have non-trivial library dependencies that need building as well.
As long as PyPI is happy to load itself with lots of different wheels for each package release, and as long as there is a need (or use case) for binary wheels at all, I personally consider a stable ABI mostly a nice goodie for some, but nothing that solves the release problem for the bulk of the extension maintainers.
Stefan

On Thu, Feb 27, 2020 at 2:18 PM M.-A. Lemburg <mal@egenix.com> wrote:
That's fair. This really needs to be a data-driven process if we want the best outcome. How can we get enough data for that? Anecdotes like this are quite helpful, but still insufficient.
That's certainly the root of this discussion. :) We need to reach an informed consensus about how limited we can make the public C-API for extension authors.
It would probably make sense to have a stronger distinction for a public API for embedders (Include/Python-embedded.h?).
FWIW, I'm still in favor of the guidance Steve suggested about "rings and layers", which has some relationship to these abstraction layers.
+1
It seems like a lot of this discussion is already about a new "public" API anyway. :)
-eric

On 27Feb2020 2201, Eric Snow wrote:
No, it's not fair. FFI interfaces are terrible; specifically, they're terribly slow at marshalling primitive values back and forth. That's not the same as distinguishing between a highly stable API and a version-by-version API.
Right. For that, we need actual designs and scenarios. Performance isn't the highest concern when trying to draw the line around certain stability guarantees we want to make.
> It would probably make sense to have a stronger distinction for a public API for embedders (Include/Python-embedded.h?).
It's a side topic (that I have spent *many* hours thinking about), but I'd say embedders don't need as stable an API - they need easier ways to bundle the CPython build into their app so they can carry the version they want with them.
> It seems like a lot of this discussion is already about a new "public" API anyway. :)
Only because we assume we'll get to design the new limited C API from the ground up. I don't think we've necessarily assumed we'll get to redesign the entire API, but we may be able to offer one that is supported across versions and doesn't look like the current one (a supported PySIP library, if you like).
Cheers, Steve

I guess the 80/20 rule applies here: 80% of C extensions could probably use the stable ABI or an FFI interface, and the rest can continue to use the version-specific API. That certainly applies to most extensions I've written; most of them are not performance-sensitive and just expose preexisting native libraries.
Ronald

On 11.02.20 12:15, Antoine Pitrou wrote:
Just since you mentioned exercising the API: there's currently work being done by Eddie Elizondo et al. in Cython's master (pre-3.0) branch to enable C compile-time support for the limited API.
https://github.com/cython/cython/issues/2542
You basically just pass "-DCYTHON_LIMITED_API" in your CFLAGS and the generated C code will restrict itself to that.
There are still some open issues, and we'll see what else we stumble into along the way. But most features are already supported and overall it seems to be getting there, slowly.
Apart from that, I'm happy to see ideas for a new and different C-API taking shape. We should definitely try to get broader testing in the wild for those at an early stage. That's the only way to realistically assess their usability, safety, performance, … aspects before making them an official part of a Python runtime (and hopefully more than one).
Stefan

On Tue, Feb 11, 2020 at 10:39 AM Petr Viktorin <encukou@gmail.com> wrote:
I don't get your point. Ever since the limited C API has existed, PyThreadState has been opaque. Here is an extract of Python 3.6's pystate.h:
#ifdef Py_LIMITED_API
typedef struct _ts PyThreadState;
#else
typedef struct _ts {
    struct _ts *prev;
    struct _ts *next;
    PyInterpreterState *interp;
    ...
} PyThreadState;
#endif
In Python 3.9, I fixed Py_EnterRecursiveCall() and Py_LeaveRecursiveCall() so they can be used in the limited C API. I tested: they didn't work before my change. It's not a recent regression.
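[Editor's note: with that fix, limited-API code can guard against runaway recursion in the usual way; "do_callback" and "obj" below are placeholders.]

if (Py_EnterRecursiveCall(" in my_callback")) {
    return NULL;   /* recursion limit hit; exception is already set */
}
PyObject *res = do_callback(obj);
Py_LeaveRecursiveCall();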
My goal is to make the limited C API more complete to make it usable by more C extensions. The long term goal would be to only use it :-)
In my experience, the borders of the "stable ABI" remain blurry. For example, is PyGC_Head part of the stable ABI?
See "ABI breakage between Python 3.7.4 and 3.7.5: change in PyGC_Head structure": https://bugs.python.org/issue39599
For https://bugs.python.org/issue39573: that's clearly a more ambitious goal than the initial design of the limited C API. It's a deliberate backward-incompatible change. See the issue for the rationale.
The question is whether we want this change, how it should be driven (proposing a migration path), and how many projects using the limited C API would be broken if PyObject became opaque.
My early work on this issue is only to prepare this transition, without pushing any incompatible change, before a consensus is reached.
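[Editor's note: concretely, preparing for an opaque PyObject mostly means replacing direct field access with the existing accessor macros; a small sketch, where "obj" stands for any PyObject pointer.]

PyTypeObject *tp;
Py_ssize_t refs;

tp   = obj->ob_type;     /* direct field access: breaks once PyObject is opaque */
refs = obj->ob_refcnt;

tp   = Py_TYPE(obj);     /* accessor macros: keep working */
refs = Py_REFCNT(obj);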
Victor

On 2020-02-11 12:54, Victor Stinner wrote:
Ah! I didn't realize code with the macros didn't compile! I misread the changelog entry. If it's not just me, would it be better to say the macros "failed to compile" instead of "didn’t work"/"never worked"?
Will old extensions compiled with PyObject_INIT() and PyObject_INIT_VAR() macros also continue to work?
Structs are not part of the stable ABI by default: https://www.python.org/dev/peps/pep-0384/#structures
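[Editor's note: PEP 384's replacement for defining a static PyTypeObject, for instance, is to build the type from a spec; a minimal sketch, with "mymod.Example" as a made-up name.]

static PyType_Slot example_slots[] = {
    {0, NULL}               /* real types add tp_new, tp_methods, ... here */
};

static PyType_Spec example_spec = {
    "mymod.Example",        /* name */
    sizeof(PyObject),       /* basicsize: real types embed their own struct */
    0,                      /* itemsize */
    Py_TPFLAGS_DEFAULT,     /* flags */
    example_slots
};

/* in the module init function: */
PyObject *tp = PyType_FromSpec(&example_spec);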
That was a regular ABI issue, not a stable ABI issue. Between 3.7.4 and 3.7.5, we needed to avoid breaking the *entire* ABI, because extension module tags only include "37", not the micro version. A change like that should only be made in an x.y.0 release.
IMO it must not be done in the 3.x stable ABI, but it's a great candidate for the 4.0 one. That's my suggested migration path :)

On Tue, Feb 11, 2020 at 1:53 PM Petr Viktorin <encukou@gmail.com> wrote:
I proposed https://github.com/python/cpython/pull/18461 to rephrase the What's New entry.
> Will old extensions compiled with PyObject_INIT() and PyObject_INIT_VAR() macros also continue to work?
Yes.
Victor
Participants (11): Antoine Pitrou, Barry Warsaw, Brett Cannon, Eric Snow, M.-A. Lemburg, Matti Picus, Petr Viktorin, Ronald Oussoren, Stefan Behnel, Steve Dower, Victor Stinner