On 22Feb2019 0838, Victor Stinner wrote:
As the migration to Python 3 showed us, any migration is painful and takes a very long time. Your project would only be successful if a critical mass of C extensions moves to it, but it's really unclear to me why a maintainer would want to invest time in migrating to these new rings.
Tooling and documentation should be provided to help migrate to such a new model.
More generally, I don't see the direct benefit for PyPy or other Python implementations. The proposal doesn't say anything about reference counting and the GC, which are the blocking issues of the current C API.
To be clear, I'm not proposing any migration here, just trying to define the language and architecture in a way that both:
The migration proposals will come later :)
On Thu, Feb 21, 2019 at 19:26, Steve Dower email@example.com wrote:
For context as you continue reading, these are the API **rings** provided by CPython:
- Python ring (equivalent of the Python language)
- CPython ring (CPython-specific APIs)
- Internal ring (intended for internal use only)
For a concrete example, `PyObject_GetItem` is part of the Python ring, while `PyDict_GetItem` is in the CPython ring.
I like that :-)
This is the kind of problem that is solved with "rings" - it gives us clear categories to put APIs into so that we can discuss them.
They are called rings because if you are in a particular ring, you are also able to use everything in outer rings too. (Think of a bullseye, not a Venn diagram.)
I've actually based the rings on how you've been adjusting the Include directory layout, so I'd expect you'd be okay with it ;)
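To make the `PyObject_GetItem` / `PyDict_GetItem` distinction above concrete, here is a loose pure-Python analogy (not the C API itself): indexing an object goes through the generic item protocol and honours a subclass override, while calling the base dict implementation directly bypasses it, much as `PyDict_GetItem` assumes a concrete dict.

```python
# Loose Python-level analogy for the two rings (not the C API itself):
# obj[key] goes through the generic protocol, like PyObject_GetItem;
# dict.__getitem__(obj, key) uses the concrete dict implementation,
# similar in spirit to PyDict_GetItem.

class LoudDict(dict):
    """A dict subclass that decorates looked-up values."""
    def __getitem__(self, key):
        return f"<{super().__getitem__(key)}>"

d = LoudDict(x=1)

generic = d["x"]                     # honours the override
concrete = dict.__getitem__(d, "x")  # bypasses it
```

The same split shows up in C: generic-protocol callers see the subclass behaviour, concrete-type callers do not.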
For CPython, including `Python.h` should only provide access to the Python ring. Accessing any other rings should produce a compile error.
That's a major backward incompatible change, right?
Right now, `Python.h` also gives access to what you call the "CPython ring".
Right - typo on my part. This should have said "With Py_LIMITED_API defined".
Compatibility requirements for the CPython API match the CPython major.minor version. Specifically, code relying on the CPython API should only break or change behaviour if the major.minor version changes.
Currently, there is a "stable API" which is supposed to be compatible across multiple Python versions, not only a specific X.Y. Projects like PyQt use this "stable ABI" (sorry, I wrote API and then ABI; the difference is subtle and confuses me). You don't say anything about it here.
Yes, the stable API currently represents the Python ring, and it should be changed more carefully than the CPython ring. I *think* the stable API mainly contains "normal" Python operations, but again, the point is that right now CPython does not perfectly match a good design here. So I just want to set out what a good design looks like, so that we know what direction we've been moving in.
I'm open to better explanations of what compatibility guarantees we make here.
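One concrete place the stable ABI shows up today is in extension-module filenames: modules built with `Py_LIMITED_API` are tagged `abi3` instead of a version-specific tag. A small probe (the exact suffixes are platform-dependent, so this is a hedged sketch rather than a spec):

```python
import importlib.machinery

# The suffixes this interpreter will accept for extension modules.
# Typical CPython values look like:
#   ['.cpython-312-x86_64-linux-gnu.so', '.abi3.so', '.so']  on Linux
#   ['.pyd']-style entries on Windows
suffixes = importlib.machinery.EXTENSION_SUFFIXES

# Where present, the 'abi3' tag marks stable-ABI (Py_LIMITED_API) builds
# that are expected to load on multiple Python 3.x versions.
stable_abi_tagged = [s for s in suffixes if "abi3" in s]
```

The version-specific tag vs. `abi3` is exactly the per-X.Y compatibility vs. cross-version compatibility split being discussed.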
Lower layers are required to maintain backwards compatibility more strictly than the layers above them.
Hum, I don't understand the separation between the CPython implementation and the "Python ring" (API) well. CPython's core evolves much faster than its API. Here you are only talking about the API, right?
Layers and rings are totally separate concepts.
The CPython implementation is currently all four layers and all three rings, and it will always stay that way. That said, it is a good architecture to define layers, and it is good for our extenders to define API rings (like public/stable/internal/etc.).
Components within a layer that depend on other components within that layer must be treated as a single component for determining whether it may be included or omitted.
Currently, the whole API is based on some key features:
- PyObject object model and C structures
- CPython GC implementation
- CPython memory allocators
I'm not sure how adding more layers will help to move away from these "legacy constraints".
As Nick said, I'm not trying to introduce new layers, but to formalise what we already have.
That said, I did mix up my initial proposal between the initialization sequence and cross-component dependencies. Memory allocation needs to be part of the core layer, as you can't "be Python" without memory allocation (but it should still be pluggable). Reference counting and GC are also in this layer.
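As a reminder of why the GC sits alongside reference counting in that core layer, here is a minimal CPython sketch: reference counting alone can never reclaim a cycle, so the cyclic collector has to step in.

```python
import gc

class Node:
    pass

gc.collect()  # start from a clean slate

a, b = Node(), Node()
a.other, b.other = b, a  # a reference cycle: refcounts never drop to zero
del a, b

# Reference counting alone cannot free the pair; CPython's cyclic GC can.
# gc.collect() returns the number of unreachable objects it found.
collected = gc.collect()
```

Neither mechanism is optional in today's core layer, which is why both would have to be pluggable together.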
What it means is that higher layers like the Python code in the standard library shouldn't *rely* on Python being reference counted, because it's too many layers away. But of course, avoiding that entirely is impossible today! So we ought to be aware of where we make assumptions like this, so that it's easier for layers to be changed.
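A concrete example of such an assumption, in pure Python: code that expects a weakref callback to fire the moment the last reference goes away is (perhaps unknowingly) relying on CPython's reference counting; on a purely GC-based implementation the callback may be delayed.

```python
import weakref

class Resource:
    pass

fired = []
obj = Resource()
ref = weakref.ref(obj, lambda r: fired.append("finalized"))

del obj
# On CPython, reference counting frees obj immediately, so the callback
# has already run by this point. On GC-based implementations (PyPy,
# Jython) this timing is not guaranteed -- a layer violation if
# higher-level code depends on it.
result = list(fired)
```

This is exactly the kind of cross-layer assumption that is easy to make and hard to find later.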
But apart from things like pluggable memory allocators, I looked at where we might "swap things out". For example, on Windows we have different implementations of a lot of functions, so these should be in the Platform adaptation layer (to adapt to the platform).
And then there may be other runtimes that can use most of the standard library, which means those parts of the standard library are in their own layer and you can run them against a different Platform adaptation layer and different core layer (such as Jython or PyPy).
And then there are parts of the standard library that you may not want, such as sockets (e.g. when embedding Python in an application). But if you remove sockets, what else do you have to remove? If the layers are done properly, you shouldn't have to remove anything in a *lower* layer, only things in the same (or a higher) layer. (And this one is a real need for me, so I have to figure out how to remove sockets, and all the stdlib that depends on them, anyway.)
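A quick way to see that same-layer dependency in today's stdlib (a rough probe in a fresh subprocess; the exact import graph varies by version): importing `http.client` drags `socket` in with it, so removing sockets means removing the HTTP stack from that layer too.

```python
import subprocess
import sys

# Run in a fresh interpreter so an already-imported socket module
# doesn't mask the result.
probe = (
    "import sys;"
    "import http.client;"
    "print('socket' in sys.modules)"
)
out = subprocess.run(
    [sys.executable, "-c", probe],
    capture_output=True, text=True, check=True,
).stdout.strip()
```

Mapping these intra-layer edges is most of the work in making pieces of the stdlib removable.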
But layering is definitely more complicated. Once you get to the top layer, it looks more like a dependency tree, I think. The Core and Platform adaptation layers are fairly standard, though.
Oh wow, that's going a little too far into the complex "Python initialization API" problem. I would prefer to discuss it in a separate PEP / thread.
Sure, but only when we're planning to change it. PEP 432 already exists for changing this, and all I really want to do is document it.
And maybe my examples here are a bit too much "imagining the future", but that's because we have so many layer violations right now that it's hard to give good examples of where it is at present ;)
Thanks for the feedback!