Worried about Python release schedule and lack of stable C-API

I’m getting increasingly worried about the future of Python, and those worries have been exacerbated by the new yearly release rhythm and the lack of a stable C-API that is full enough to entice extension writers to use it.

PyPI packages and wheels are targeted to specific Python versions, which means that any project that depends on some of the larger extension packages (of which there are many, and many of which are must-haves for many projects) now starts lagging Python versions by years, because somewhere deep down in the dependency graph there is something that is still stuck at Python 3.8 (for example). I fully understand that 3.8 is good enough for the developers of that package, and that they have more pressing things to do than porting to 3.9 or 3.10, but it now keeps any project or package that depends on their software on 3.8 as well. And I also fully understand that some other developer who creates a new package that is essential to my application only targets the current Python release, or maybe one release back, but now if I need both the new package and an older one I’m up the well-known creek without a paddle.

Building packages from source has become pretty much impossible nowadays, especially if your project is multi-platform, needs to interface to specific hardware, and you want to do the right thing with CI/CD builds and all that. On Linux/macOS you have a chance when you try to specify all the dependencies for third-party libraries and whatnot, but on Windows you’re dead in the water. And that is assuming you have the time and are smart enough to backport the new package to the old Python release, or the old package to the new Python release (and for the latter there’s probably a good reason why the developers haven’t done so already). 
Before you know it you have to install a couple of graphics card APIs for some obscure AI feature used by something you’ve never heard of, Cython for something else, and obscure vendor libraries for something else again. I think we really need to come up with some scheme whereby extension packages become more long-lived than a single Python release... -- Jack Jansen, <Jack.Jansen@cwi.nl>, http://www.cwi.nl/~jack If I can't dance I don't want to be part of your revolution -- Emma Goldman

And Steven D'Aprano also mentioned the stable ABI. The problem with the stable ABI is that very few developers are targeting it. I’m not sure why not: whether it has to do with incompleteness of the ABI, or with the difficulty of targeting it in your builds and then having pip/PyPI do the right thing with wheels and all that. I’ve been on the capi-sig mailing list since its inception in 2007, but the discussions are really going over my head. I don’t understand what the problems are that keep people from targeting the stable ABI (or the various other attempts at standardising extensions across Python versions).
Yes, very much so. Wheels. Before we had wheels there were very few packages that were distributed in binary form, the NumPy family and the various GUI toolkits are the only ones that come to mind, and they had very active developer communities that tracked Python releases. Wheels are absolutely wonderful, but the downside is that everyone has come to depend on them. Before wheels, extension modules were often optional, in that many packages would provide their basic functionality in pure Python, and then have some performance-enhancing or functionality-extending optional extension modules. Wheels have obviated the need for that. So now everything depends on extension modules (and on external packages that depend on extension modules, and so on). -- Jack Jansen, <Jack.Jansen@cwi.nl>, http://www.cwi.nl/~jack If I can't dance I don't want to be part of your revolution -- Emma Goldman

On 26/09/2021 13.07, jack.jansen@cwi.nl wrote:
It takes some effort to port old extensions to the stable ABI. Several old APIs are not supported in stable-ABI extensions; for example, developers have to port static type definitions to heap types. It's not complicated, but it takes some effort. The other issue is Cython: stable releases of Cython do not support the stable ABI yet. It's an experimental feature in Cython 3.0.0 alpha. Christian

The stable ABI is also not complete, although it should be complete enough for a lot of projects. A fairly esoteric issue I ran into is that it is currently not possible to define a class with a non-default metaclass using the type-spec API (AFAIK), see #15870. And as you write, “it takes some effort”; that alone likely reduces the number of projects that migrate to the stable ABI, especially projects that already have a CI/CD setup that creates binary wheels for them (for example using cibuildwheel). Ronald — Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/

On 27/09/2021 16.32, Ronald Oussoren via Python-Dev wrote:
Indeed, the stable ABI is not complete. I just figured out that limited API < 3.9 cannot define objects with weakref support or dict offset. The __weaklistoffset__ and __dictoffset__ PyMemberDefs were added in Python 3.9. Christian

On Sat, Sep 25, 2021 at 5:40 PM <jack.jansen@cwi.nl> wrote:
PyPI packages and wheels are targeted to specific Python versions, which means that any project that depends on some of the larger extension packages (of which there are many, and many of which are must-have for many projects) now start lagging Python versions by years, because somewhere deep down in the dependency graph there is something that is still stuck at Python 3.8 (for example).
Can you give some examples of the packages you're thinking of, that are prominent/must-have and stuck on years-old Pythons? -n -- Nathaniel J. Smith -- https://vorpus.org

Open3D is an example. Will finally move to Python 3.9 some time the coming month. Its dependency graph contains about 70 other packages. In this specific case, the underlying problem was that TensorFlow was stuck at 3.8. The TensorFlow codebase got ported in November 2020, then released early 2021. Then Open3D included the new Tensorflow (plus whatever else needed to be adapted) in their codebase in May. They’re now going through their release schedule, and their 0.14 release should be up on PyPI soon. -- Jack Jansen, <Jack.Jansen@cwi.nl>, http://www.cwi.nl/~jack If I can't dance I don't want to be part of your revolution -- Emma Goldman

On Sun, Sep 26, 2021 at 3:38 AM <jack.jansen@cwi.nl> wrote:
Open3D is an example. Will finally move to Python 3.9 some time the coming month. Its dependency graph contains about 70 other packages.
In this specific case, the underlying problem was that TensorFlow was stuck at 3.8. The TensorFlow codebase got ported in November 2020, then released early 2021. Then Open3D included the new Tensorflow (plus whatever else needed to be adapted) in their codebase in May. They’re now going through their release schedule, and their 0.14 release should be up on PyPI soon.
I took a minute to look up the release dates to fill in this timeline:

- Python 3.9 released: October 2020
- Tensorflow adds 3.9 support: November 2020
- Tensorflow v2.5.0 released with the new 3.9 support: May 2021
- Open3d adds 3.9 support: May 2021
- First Open3d release to include the new 3.9 support: ~October 2021

So it seems like in this case at least, the year-long delay consists of ~1 month of porting work, and ~11 months of projects letting the finished code sit in their repos without shipping to users. It seems like the core problem here is that these projects don't consider it important to keep up with the latest Python release. I'm not sure what CPython upstream can do about that. Maybe you could lobby these projects to ship releases more promptly? By contrast, to pick a random library that uses the unstable C API extensively, NumPy is already shipping wheels for 3.10 -- and 3.10 isn't even out yet. So it's certainly possible to do, even for projects with a tiny fraction of Tensorflow's engineering budget. -n -- Nathaniel J. Smith -- https://vorpus.org

What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.) Numpy isn’t random, it’s at the bottom of the food chain for a large ecosystem or two — if it doesn’t support a new Python release, none of its dependent packages can even start porting. (I guess only Cython is even lower, but it’s a build-time tool. And indeed it has supported 3.10 for a long time.) —Guido On Mon, Sep 27, 2021 at 23:01 Nathaniel Smith <njs@pobox.com> wrote:
-- --Guido (mobile)

On Tue, Sep 28, 2021 at 12:40 AM Guido van Rossum <guido@python.org> wrote:
What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.)
Numpy isn’t random, it’s at the bottom of the food chain for a large ecosystem or two — if it doesn’t support a new Python release, none of its dependent packages can even start porting. (I guess only Cython is even lower, but it’s a build-time tool. And indeed it has supported 3.10 for a long time.)
Well, no, it wasn't entirely random :-). Being on the bottom of the food chain is important, but I don't think it's the full story -- Tensorflow is also at the bottom of a huge ecosystem. I think it's also related to NumPy being mostly volunteer-run, which means they're sensitive to feedback from individual enthusiasts, and enthusiasts are the most aggressive early adopters. OTOH Tensorflow is a huge commercial collaboration, and companies *hate* upgrading. Either way though, it doesn't seem to be anything to do with CPython's ABI stability or release cadence. -n -- Nathaniel J. Smith -- https://vorpus.org

What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.)
Just a quick note: Until black supports py3.10 fully, I'm sticking with py3.9 for dear life. Kind Regards, Abdur-Rahmaan Janhangeer about | blog github Mauritius

I think that's unfortunate; you can still use black to format the subset of Python 3.10 syntax that it supports. You already do this on Python 3.9: for example, black doesn't support parenthesized with statements.

Thanks for the info; that's becoming a black spot in the tool, it seems. It's more for production code. Now I can give it a try! Kind Regards, Abdur-Rahmaan Janhangeer about <https://compileralchemy.github.io/> | blog <https://www.pythonkitchen.com> github <https://github.com/Abdur-RahmaanJ> Mauritius

On Sun, Sep 26, 2021 at 01:14:18AM +0200, jack.jansen@cwi.nl wrote:
I’m getting increasingly worried about the future of Python,
That Python will become even more popular?

- TIOBE: second place, 0.16% below C. https://www.tiobe.com/tiobe-index/
- PYPL: first place, 12.3% above Java. https://pypl.github.io/PYPL.html
- RedMonk: equal second with Java. https://redmonk.com/sogrady/2021/08/05/language-rankings-6-21/

We must be doing something right, and maybe the problems that we see are not as big as we fear. Language popularity is not in and of itself important. It doesn't really matter whether Python is 1st or 10th in language popularity. But the CPython part (at least) of the Python language and ecosystem is clearly thriving. Do you have a reason to think that it is in danger in some way? Some factor that didn't apply equally in 2001 and 2011 as it does in 2021?
Admittedly, the yearly release schedule *is* a new policy. I don't remember the rationale for it or who it is supposed to benefit, and I don't know what we are doing to objectively decide whether the change made things better or worse.

As for the C-API... Python is 30 years old. Has it ever had a stable C-API before now? Hasn't it *always* been the case that C packages have targeted a single version and need to be rebuilt from source on every release? These are not rhetorical questions, I genuinely do not know. I *think* that there was an attempt to make a stable C API back in 3.2 days: https://www.python.org/dev/peps/pep-0384/ but I don't know what difference it has made to extension writers in practice. From your description, it sounds like perhaps not as big a difference as we would have liked. Maybe extension writers are not using the stable C API? Is that even possible? Please excuse my ignorance. -- Steve

On 26/09/2021 05:21, Steven D'Aprano wrote: [snip]
No.
PyQt has used the stable ABI for many years. The main reason for using it is to reduce the number of wheels: the PyQt ecosystem currently contains 15 PyPI projects across 4 platforms supporting 5 Python versions (including v3.10). Without the stable ABI a new release would require 300 wheels; with the stable ABI it is a more manageable 60 wheels. However the stable ABI is still a second-class citizen, as it is still not possible (AFAIK) to specify a wheel name that doesn't need to explicitly include each supported Python version (rather than a minimum stable ABI version). In other words it doesn't solve the OP's concern about unmaintained older packages being able to be installed in newer versions of Python (even though those packages had been explicitly designed to do so). Phil
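Phil's wheel counts are straightforward arithmetic; a quick sketch of the saving (the numbers are his, the variable names are mine):

```python
# Wheel-count arithmetic for the PyQt ecosystem numbers quoted above.
projects, platforms, py_versions = 15, 4, 5

# Version-specific wheels: one per project, platform and Python version.
without_stable_abi = projects * platforms * py_versions
# Stable-ABI (abi3) wheels: one per project and platform covers all versions.
with_stable_abi = projects * platforms

print(without_stable_abi)  # 300
print(with_stable_abi)     # 60
```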

On Sun, Sep 26, 2021 at 3:51 AM Phil Thompson via Python-Dev < python-dev@python.org> wrote:
On 26/09/2021 05:21, Steven D'Aprano wrote:
[snip]
Actually you can do this. The list of compatible wheels for a platform starts at CPython 3.2 when the stable ABI was introduced and goes forward to the version of Python you are running. So you can build a wheel file that targets the oldest version of CPython that you are targeting and its version of the stable ABI and it is considered forward compatible. See `python -m pip debug --verbose` for the complete list of wheel tags that are supported for an interpreter.
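Brett's forward-compatibility rule can be sketched in a few lines (the helper name is hypothetical; the real resolution is done by pip, cf. `python -m pip debug --verbose`):

```python
def compatible_abi3_tags(minor):
    """Hypothetical helper: abi3 tag prefixes a CPython 3.<minor>
    interpreter will accept.

    Per the rule above, an interpreter accepts abi3 wheels built for
    any CPython from 3.2 up to and including its own version.
    """
    return [f"cp3{m}-abi3" for m in range(2, minor + 1)]

# A CPython 3.10 interpreter accepts abi3 wheels tagged cp32-abi3
# through cp310-abi3:
print(compatible_abi3_tags(10))
```

So a wheel built against the 3.6 limited API and tagged `cp36-abi3` installs unchanged on 3.6, 3.7, 3.8, and so on.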

On Tue, 28 Sep 2021, 6:55 am Brett Cannon, <brett@python.org> wrote:
I think Phil's point is a build side one: as far as I know, the process for getting one of those more generic file names is still to build a wheel with an overly precise name for the stable ABI declarations used, and then rename it. The correspondence between "I used these stable ABI declarations in my module build" and "I can use this more broadly accepted wheel name" is currently obscure enough that I couldn't tell you off the top of my head how to do it, and I contributed to the design of both sides of the equation. Actually improving the build ergonomics would be hard (and outside CPython's own scope), but offering a table in the stable ABI docs giving suggested wheel tags for different stable ABI declarations should be feasible, and would be useful to both folks renaming already built wheels and anyone working on improving the build automation tools. Cheers, Nick.

On 05. 10. 21 8:59, Nick Coghlan wrote:
Indeed, thinking about proper wheel tags, and adding support for them in both pip/installer and setuptools/build/poetry/etc., would be a very helpful way to contribute to the stable ABI. I don't think I will be able to get to this any time soon. The current `abi3` tag doesn't encode the minimum ABI version. AFAIK that info should go in the "Requires-Python" wheel metadata, but there's no automation or clear guidance for that. Putting it in the wheel tag might be a good idea. There are vague ideas floating around about removing old stable ABI features (hopefully only after they've been deprecated for 5-10 years); if there's a new wheel tag scheme, it should be designed with that possibility in mind.

On 05/10/2021 07:59, Nick Coghlan wrote:
Actually I was able to do what I wanted without renaming wheels. Specify 'py_limited_api=True' as an argument to Extension() (using setuptools v57.0.0), and specify the following in setup.cfg (using wheel v0.34.2):

[bdist_wheel]
py_limited_api = cp36

The resulting wheel has a Python tag of 'cp36' and an ABI tag of 'abi3' for all platforms, which is interpreted by the current version of pip exactly as I want. I'm not sure if this is documented anywhere. Phil
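Putting Phil's two settings together, a minimal build configuration might look like this (a sketch only, not a verified recipe: the project name, module name, and source file are illustrative, and `Py_LIMITED_API` is set to match the cp36 tag):

```python
# setup.py -- sketch of building a single abi3 wheel per platform.
from setuptools import Extension, setup

setup(
    name="example",                  # hypothetical project
    version="1.0",
    ext_modules=[
        Extension(
            "_example",              # hypothetical extension module
            sources=["_example.c"],
            # Restrict the build to the limited API of Python 3.6:
            define_macros=[("Py_LIMITED_API", "0x03060000")],
            py_limited_api=True,     # tag the built extension as abi3
        )
    ],
    # Equivalent to "[bdist_wheel] py_limited_api = cp36" in setup.cfg:
    options={"bdist_wheel": {"py_limited_api": "cp36"}},
)
```

With this in place, building a wheel should produce a file tagged `cp36-abi3-<platform>`, which pip accepts on Python 3.6 and later.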

jack.jansen@cwi.nl writes:
I’m getting increasingly worried about the future of Python,
You mean about the fact that Python has a plethora (rapidly growing!) of powerful, efficient extension packages and you want to use them all (perhaps recursively)? :-) I really don't think this bodes ill for the future of Python. Perhaps it bodes poorly for your development practices (and I'll try to tease out some ideas where we can improve the infrastructure for you), but there are plenty of applications where this plethora works really well, too.
I don't think "full API" is the problem (cf. Phil Thompson's post[1], where the stable API is good enough for PyQt). I think the problem is more likely that in initial development, they simply targeted the .h files in front of them.[2] Since then they haven't needed to update, and *neither has anyone else with resources or pull with that package's devs*.

Suggestion: Perhaps the documentation can be improved to make the stable API easier to conform to from the get-go. (I think the stable API is already distinguished in the .h files.)

If "full API" *is* the problem, I doubt it's a solvable problem (except by extension users lobbying extension developers to use the stable API), because the stable API is a deliberate compromise between efficient communication between CPython and extensions, and flexibility for CPython to improve its internal implementation. If extension writers *deliberately* eschew the stable API, presumably they need to access private APIs, and if those change, too bad.
This is where developers in your position can have the most impact, I think. When there's a new gotta-have extension, tell the extension's developers you need support for the stable API or ABI, or you can't use their package.
Most frequently, as you pointed out, the reason is that they're busy with other priorities, and it's just not their itch to scratch.
I would guess that a lot of extension packages are quite long-lived. First, as Phil Thompson points out for PyQt[1], the content of the wheel may very well work fine, but they have to actually make the wheels with the appropriate name (or maybe other metadata?), or they won't install in the new Python version's site-packages. I think this is something that Python (distutils-sig) could do something about, either by adding a way to make multi-version wheels, or providing a utility to retarget an existing wheel (maybe mv(1) is enough?!)

Second, I'm sure there are plenty of extensions that can be ported to either new CPython versions or (preferably) to the stable API or ABI with minimal effort. Perhaps there could be a SIG or a channel in the tracker where people could request these ports. I'm suggesting this because of the "better things to do" problem. Obviously, if the extension itself is unmaintained, its maintainer will do nothing. Even if it is maintained, it seems that its dev community does not prioritize these ports. But new developers looking for something to do might see it in "core" channels. Probably there are plenty of people around who can help them build wheels, and I believe a significant fraction of extensions without current-version wheels will require little or no C changes. (OK, this doesn't involve a merge request to CPython, so it doesn't immediately set them on the path to core dev status, but we can award gold stars.)

Footnotes:
[1] <https://mail.python.org/archives/list/python-dev@python.org/message/RPDUNMG6...>
[2] Maybe that's just me? OK, I'm busted!

And Steven D”Aprano also mentioned the stable ABI. The problem with the stable ABI is that very few developers are targeting it. I’m not sure why not, whether it has to do with incompleteness of the ABI, or with issues targeting it easily and your builds and then having pip/PyPI do the right things with wheels and all that. I’ve been on the capi-sig mailing list since its inception in 2007, but the discussions are really going over my head. I don’t understand what the problems are that keep people from targeting the stable ABI (or the various other attempts at standardising extensions over Python versions).
Yes, very much so. Wheels. Before we had wheels there were very few packages that were distributed in binary form, the NumPy family and the various GUI toolkits are the only ones that come to mind, and they had very active developer communities that tracked Python releases. Wheels are absolutely wonderful, but the downside is that everyone has come to depend on them. Before wheels, extension modules were often optional, in that many packages would provide their basic functionality in pure Python, and then have some performance-enhancing or functionality-extending optional extension modules. Wheels have obviated the need for that. So now everything depends on extension modules (and on external packages that depend on extension modules, and so on). -- Jack Jansen, <Jack.Jansen@cwi.nl>, http://www.cwi.nl/~jack If I can't dance I don't want to be part of your revolution -- Emma Goldman

On 26/09/2021 13.07, jack.jansen@cwi.nl wrote:
It takes some effort to port old extensions to stable ABI. Several old APIs are not supported in stable ABI extensions. For example developers have to port static type definitions to heap types. It's not complicated, but it takes some effort. The other issue is Cython. Stable releases of Cython do not support stable ABI yet. It's an experimental feature in Cython 3.0.0 alpha. Christian

The stable ABI is also not complete, although it should be complete enough for a lot of projects. A, fairly esoteric, issue I ran into is that it is currently not possible to define a class with a non-default meta class using the type-spec API (AFAIK), see #15870. And as you write “it takes some effort”, that alone likely reduces the amount of projects that migrate to the stable ABI esp. for projects that already have a CI/CD setup that creates binary wheels for you (for example using cibuildwheel). Ronald — Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/

On 27/09/2021 16.32, Ronald Oussoren via Python-Dev wrote:
Indeed, the stable ABI is not complete. I just figured out that limited API < 3.9 cannot define objects with weakref support or dict offset. The __weaklistoffset__ and __dictoffset__ PyMemberDefs were added in Python 3.9. Christian

On Sat, Sep 25, 2021 at 5:40 PM <jack.jansen@cwi.nl> wrote:
PyPI packages and wheels are targeted to specific Python versions, which means that any project that depends on some of the larger extension packages (of which there are many, and many of which are must-have for many projects) now start lagging Python versions by years, because somewhere deep down in the dependency graph there is something that is still stuck at Python 3.8 (for example).
Can you give some examples of the packages you're thinking of, that are prominent/must-have and stuck on years-old Pythons? -n -- Nathaniel J. Smith -- https://vorpus.org

Open3D is an example. Will finally move to Python 3.9 some time the coming month. Its dependency graph contains about 70 other packages. In this specific case, the underlying problem was that TensorFlow was stuck at 3.8. The TensorFlow codebase got ported in November 2020, then released early 2021. Then Open3D included the new Tensorflow (plus whatever else needed to be adapted) in their codebase in May. They’re now going through their release schedule, and their 0.14 release should be up on PyPI soon. -- Jack Jansen, <Jack.Jansen@cwi.nl>, http://www.cwi.nl/~jack If I can't dance I don't want to be part of your revolution -- Emma Goldman

On Sun, Sep 26, 2021 at 3:38 AM <jack.jansen@cwi.nl> wrote:
Open3D is an example. Will finally move to Python 3.9 some time the coming month. Its dependency graph contains about 70 other packages.
In this specific case, the underlying problem was that TensorFlow was stuck at 3.8. The TensorFlow codebase got ported in November 2020, then released early 2021. Then Open3D included the new Tensorflow (plus whatever else needed to be adapted) in their codebase in May. They’re now going through their release schedule, and their 0.14 release should be up on PyPI soon.
I took a minute to look up the release dates to fill in this timeline: Python 3.9 released: October 2020 Tensorflow adds 3.9 support: November 2020 Tensorflow v2.5.0 released with the new 3.9 support: May 2021 Open3d adds 3.9 support: May 2021 First Open3d release to include the new 3.9 support: ~October 2021 So it seems like in this case at least, the year long delay consists of ~1 month of porting work, and ~11 months of projects letting the finished code sit in their repos without shipping to users. It seems like the core problem here is that these projects don't consider it important to keep up with the latest Python release. I'm not sure what CPython upstream can do about that. Maybe you could lobby these projects to ship releases more promptly? By contrast, to pick a random library that uses the unstable C API extensively, NumPy is already shipping wheels for 3.10 -- and 3.10 isn't even out yet. So it's certainly possible to do, even for projects with a tiny fraction of Tensorflow's engineering budget. -n -- Nathaniel J. Smith -- https://vorpus.org

What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.) Numpy isn’t random, it’s at the bottom of the food chain for a large ecosystem or two — if it doesn’t support a new Python release, none of its dependent packages can even start porting. (I guess only Cython is even lower, but it’s a build-time tool. And indeed it has supported 3.10 for a long time.) —Guido On Mon, Sep 27, 2021 at 23:01 Nathaniel Smith <njs@pobox.com> wrote:
-- --Guido (mobile)

On Tue, Sep 28, 2021 at 12:40 AM Guido van Rossum <guido@python.org> wrote:
What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.)
Numpy isn’t random, it’s at the bottom of the food chain for a large ecosystem or two — if it doesn’t support a new Python release, none of its dependent packages can even start porting. (I guess only Cython is even lower, but it’s a build-time tool. And indeed it has supported 3.10 for a long time.)
Well, no, it wasn't entirely random :-). Being on the bottom of the food chain is important, but I don't think it's the full story -- Tensorflow is also at the bottom of a huge ecosystem. I think it's also related to NumPy being mostly volunteer-run, which means they're sensitive to feedback from individual enthusiasts, and enthusiasts are the most aggressive early adopters. OTOH Tensorflow is a huge commercial collaboration, and companies *hate* upgrading. Either way though, it doesn't seem to be anything to do with CPython's ABI stability or release cadence. -n -- Nathaniel J. Smith -- https://vorpus.org

What I have heard repeatedly, from people who are paid to know, is that most users don’t care about the latest features, and would rather stick to a release until it becomes unsupported. (Extreme example: Python 2.)
Just a quick note: Until black supports py3.10 fully, I'm sticking with py3.9 for dear life. Kind Regards, Abdur-Rahmaan Janhangeer about | blog github Mauritius

I think that's unfortunate, you can still use and format the subset of Python 3.10 syntax with black. You already do this on python 3.9 for example black doesn't support parenthesized with

Thanks for info, that's becoming a black spot in the tool it seems. It's more for production codes, now I can give it a try! Kind Regards, Abdur-Rahmaan Janhangeer about <https://compileralchemy.github.io/> | blog <https://www.pythonkitchen.com> github <https://github.com/Abdur-RahmaanJ> Mauritius

On Sun, Sep 26, 2021 at 01:14:18AM +0200, jack.jansen@cwi.nl wrote:
I’m getting increasingly worried about the future of Python,
That Python will become even more popular? TIOBE: second place, 0.16% below C. PYPL: first place, 12.3% above Java. RedMonk: equal second with Java. https://www.tiobe.com/tiobe-index/ https://pypl.github.io/PYPL.html https://redmonk.com/sogrady/2021/08/05/language-rankings-6-21/ We must be doing something right and maybe the problems that we see are not as big as we fear. Language popularity is not in and of itself important. It doesn't really matter whether Python is 1st or 10th in language popularity. But the CPython part (at least) of the Python language and ecosystem is clearly thriving. Do you have a reason to think that it is in danger in some way? Some factor that didn't apply equally in 2001 and 2011 as it does in 2021?
Admittedly, the yearly release schedule *is* a new policy. I don't remember the rationale for it or who it is supposed to benefit, and I don't know what we are doing to objectively decide whether the change made things better or worse. As for the C-API... Python is 30 years old. Has it ever had a stable C-API before now? Hasn't it *always* been the case that C packages have targetted a single version and need to be rebuilt from source on every release? These are not rhetorical questions, I genuinely do not know. I *think* that there was an attempt to make a stable C API back in 3.2 days: https://www.python.org/dev/peps/pep-0384/ but I don't know what difference it has made to extension writers in practice. From your description, it sounds like perhaps not as big a difference as we would have liked. Maybe extension writers are not using the stable C API? Is that even possible? Please excuse my ignorance. -- Steve

On 26/09/2021 05:21, Steven D'Aprano wrote: [snip]
No.
PyQt has used the stable ABI for many years. The main reason for using it is to reduce the number of wheels. The PyQt ecosystem currently contains 15 PyPI projects across 4 platforms supporting 5 Python versions (including v3.10). Without the stable ABI a new release would require 300 wheels. With the stable ABI it is a more manageable 60 wheels. However the stable ABI is still a second class citizen as it is still not possible (AFAIK) to specify a wheel name that doesn't need to explicitly include each supported Python version (rather than a minimum stable ABI version). In other words it doesn't solve the OP's concern about unmaintained older packages being able to be installed in newer versions of Python (even though those packages had been explicitly designed to do so). Phil

On Sun, Sep 26, 2021 at 3:51 AM Phil Thompson via Python-Dev < python-dev@python.org> wrote:
On 26/09/2021 05:21, Steven D'Aprano wrote:
[snip]
Actually you can do this. The list of compatible wheels for a platform starts at CPython 3.2 when the stable ABI was introduced and goes forward to the version of Python you are running. So you can build a wheel file that targets the oldest version of CPython that you are targeting and its version of the stable ABI and it is considered forward compatible. See `python -m pip debug --verbose` for the complete list of wheel tags that are supported for an interpreter.

On Tue, 28 Sep 2021, 6:55 am Brett Cannon, <brett@python.org> wrote:
I think Phil's point is a build side one: as far as I know, the process for getting one of those more generic file names is still to build a wheel with an overly precise name for the stable ABI declarations used, and then rename it. The correspondence between "I used these stable ABI declarations in my module build" and "I can use this more broadly accepted wheel name" is currently obscure enough that I couldn't tell you off the top of my head how to do it, and I contributed to the design of both sides of the equation. Actually improving the build ergonomics would be hard (and outside CPython's own scope), but offering a table in the stable ABI docs giving suggested wheel tags for different stable ABI declarations should be feasible, and would be useful to both folks renaming already built wheels and anyone working on improving the build automation tools. Cheers, Nick. _______________________________________________

On 05. 10. 21 8:59, Nick Coghlan wrote:
Indeed, thinking about proper wheel tags, and adding support for them in both pip/installer and setuptools/build/poetry/etc., would be a very helpful way to contribute to the stable ABI. I don't think I will be able to get to this any time soon.

The current `abi3` tag doesn't encode the minimum ABI version. AFAIK that info should go in the "Requires-Python" wheel metadata, but there is no automation or clear guide for that. Putting it in the wheel tag might be a good idea.

There are vague ideas floating around about removing old stable ABI features (hopefully only after they have been deprecated for 5-10 years); if there is a new wheel tag scheme, it should be designed with that possibility in mind.
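Since the `abi3` tag itself carries no minimum version, in practice the *Python tag* of the wheel has to do that job: a `cp36-abi3` wheel means "stable ABI, CPython 3.6 or newer". A minimal sketch of that check (illustrative names, not installer internals):

```python
# Sketch: deciding whether a cp3X-abi3 wheel is installable on a given
# interpreter, using only the python tag as the minimum-version marker.
def abi3_wheel_installable(python_tag, running):
    """True if a <python_tag>-abi3 wheel fits `running`, a (3, minor) tuple."""
    assert python_tag.startswith("cp3")
    minimum = (3, int(python_tag[3:]))  # e.g. "cp36" -> (3, 6)
    return running >= minimum

print(abi3_wheel_installable("cp36", (3, 10)))  # True
print(abi3_wheel_installable("cp39", (3, 8)))   # False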

On 05/10/2021 07:59, Nick Coghlan wrote:
Actually I was able to do what I wanted without renaming wheels:

- Specify 'py_limited_api=True' as an argument to Extension() (using setuptools v57.0.0).
- Specify the following in setup.cfg (using wheel v0.34.2):

  [bdist_wheel]
  py_limited_api = cp36

The resulting wheel has a Python tag of 'cp36' and an ABI tag of 'abi3' for all platforms, which is interpreted by the current version of pip exactly as I want. I'm not sure if this is documented anywhere.

Phil
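For the curious, the setup.cfg fragment Phil shows is ordinary INI syntax, so what the wheel tooling ends up reading can be demonstrated with the stdlib parser (a sketch; bdist_wheel's actual parsing lives inside the wheel package):

```python
# Parse the setup.cfg fragment from the message the way an INI reader would,
# to show the value that ends up driving the wheel's cp36/abi3 tags.
import configparser

SETUP_CFG = """\
[bdist_wheel]
py_limited_api = cp36
"""

cfg = configparser.ConfigParser()
cfg.read_string(SETUP_CFG)
print(cfg["bdist_wheel"]["py_limited_api"])  # cp36
```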

jack.jansen@cwi.nl writes:
I’m getting increasingly worried about the future of Python,
You mean about the fact that Python has a plethora (rapidly growing!) of powerful, efficient extension packages and you want to use them all (perhaps recursively)? :-) I really don't think this bodes ill for the future of Python. Perhaps it bodes poorly for your development practices (and I'll try to tease out some ideas where we can improve the infrastructure for you), but there are plenty of applications where this plethora works really well, too.
I don't think the "full API" is the problem (cf. Phil Thompson's post[1], where the stable API is good enough for PyQt). I think the problem is more likely that in initial development, the extension authors simply targeted the .h files in front of them.[2] Since then they haven't needed to update, and *neither has anyone else with resources or pull with that package's devs*.

Suggestion: perhaps the documentation can be improved to make the stable API easier to conform to from the get-go. (I think the stable API is already distinguished in the .h files.)

If the "full API" *is* the problem, I doubt it's a solvable one (except by extension users lobbying extension developers to use the stable API), because the stable API is a deliberate compromise between efficient communication between CPython and extensions, and flexibility for CPython to improve its internal implementation. If extension writers *deliberately* eschew the stable API, presumably they need to access private APIs, and if those change, too bad.
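As a side note for users trying to tell which kind of extension they have installed: on Linux and macOS, modules built against the stable ABI conventionally carry an "abi3" filename suffix, while version-specific builds embed the interpreter tag. A trivial sketch of that check:

```python
# Classify a compiled extension module by its filename suffix
# (Linux/macOS naming conventions; Windows .pyd files are not covered here).
def extension_kind(filename):
    if ".abi3." in filename:
        return "stable ABI"
    if ".cpython-" in filename:
        return "version-specific"
    return "unknown"

print(extension_kind("_ext.abi3.so"))                         # stable ABI
print(extension_kind("_ext.cpython-38-x86_64-linux-gnu.so"))  # version-specific
```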
This is where developers in your position can have the most impact, I think. When there's a new gotta-have extension, tell the extension's developers you need support for the stable API or ABI, or you can't use their package.
Most frequently, as you pointed out, the reason is that they're busy with other priorities, and it's just not their itch to scratch.
I would guess that a lot of extension packages are quite long-lived.

First, as Phil Thompson points out for PyQt[1], the content of the wheel may very well work fine, but the maintainers have to actually make the wheels with the appropriate name (or maybe other metadata?), or they won't install in the new Python version's site-packages. I think this is something that Python (distutils-sig) could do something about, either by adding a way to make multi-version wheels, or by providing a utility to retarget an existing wheel (maybe mv(1) is enough?!)

Second, I'm sure there are plenty of extensions that can be ported to new CPython versions or (preferably) to the stable API or ABI with minimal effort. Perhaps there could be a SIG or a channel in the tracker where people could request these ports. I'm suggesting this because of the "better things to do" problem. Obviously, if the extension itself is unmaintained, its maintainer will do nothing, and even if it is maintained, it seems that its dev community does not prioritize these ports. But new developers looking for something to do might see such requests in "core" channels. Probably there are plenty of people around who can help them build wheels, and I believe a significant fraction of extensions without current-version wheels will require little or no C changes. (OK, this doesn't involve a merge request to CPython, so it doesn't immediately set them on the path to core dev status, but we can award gold stars.)

Footnotes:
[1] <https://mail.python.org/archives/list/python-dev@python.org/message/RPDUNMG6...>
[2] Maybe that's just me? OK, I'm busted!
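The "maybe mv(1) is enough?!" idea above amounts to rewriting the tags in a wheel's filename, which PEP 427 defines as name-version-pythontag-abitag-platformtag.whl. A sketch of that rename (illustrative only; a real retargeting tool would also have to update the matching tags recorded in the WHEEL metadata file inside the archive):

```python
# Rewrite the python/abi tags in a wheel filename (PEP 427 naming).
# Assumes a normalized project name containing no hyphens.
def retarget_wheel(filename, python_tag, abi_tag):
    stem = filename[: -len(".whl")]
    name, version, _, _, platform = stem.split("-")
    return f"{name}-{version}-{python_tag}-{abi_tag}-{platform}.whl"

print(retarget_wheel("example-1.0-cp38-cp38-manylinux1_x86_64.whl", "cp36", "abi3"))
# example-1.0-cp36-abi3-manylinux1_x86_64.whl
```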
participants (14)
- Abdur-Rahmaan Janhangeer
- Brett Cannon
- Christian Heimes
- Guido van Rossum
- jack.jansen@cwi.nl
- MRAB
- Nathaniel Smith
- Nick Coghlan
- Petr Viktorin
- Phil Thompson
- Ronald Oussoren
- Stephen J. Turnbull
- Steven D'Aprano
- Thomas Grainger