André Malo wrote on 14.04.20 at 13:39:
> I think it does not serve well as a policy for CPython. Since we're talking hypotheticals right now: if Cython vanishes tomorrow, we're kind of left empty-handed. Such a runtime, if it's considered part of the compatibility "promise", should be provided by the core itself, no?
There was some discussion a while ago about integrating a stripped-down variant of Cython into CPython's stdlib. I argued against that, because the selling point of Cython is the complete package, and stripping it down wouldn't leave something equally helpful for users. I think it's good to have separate projects (and, in fact, there is more than one) deal with this need.

In the end, it's an external tool, like your editor, your C compiler, your debugger and whatever else you need for developing Python extensions. It spits out C code and lets you do with it what you want. There's no reason it should be part of the CPython project, core or stdlib. It's even written in Python, so if it doesn't work for you, you can fix it.
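To make the "external tool" point concrete, a minimal sketch of the workflow looks roughly like this (module and file names are just placeholders, not anything from the stdlib):

    # fib.pyx -- example file name; plain Python syntax with optional C type declarations
    def fib(int n):
        cdef int a = 0, b = 1, i
        for i in range(n):
            a, b = b, a + b
        return a

Running "cython fib.pyx" writes a fib.c file, which you then compile against the CPython headers like any hand-written extension module (or let setuptools and Cython.Build.cythonize handle that for you). The generated C file is yours to inspect, patch and ship.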
> A good way to test that promise (or other implications like performance) might also be to rewrite the standard library extensions in Cython and see where it leads.
Not sure I understand what you're saying here. The stdlib extension modules are currently written in C, with a bit of code generation. How is that different?
> I personally see myself using the Python-provided runtime (types, methods, GC) out of convenience (it's there, so why not use it). The vision of the future outlined here could easily lead to backing off from that, rebuilding all those things, and only keeping touchpoints with Python when it comes to interfacing with Python itself. It's probably even desirable that way.
That's actually not an uncommon thing to do. Some packages really only use Cython or pybind11 to wrap their otherwise native C or C++ code. It's a choice given specific organisational/project/developer constraints, and choices are good.
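As a rough sketch of what that wrapping style looks like (the header and function names here are made up for illustration, not taken from any real package):

    # wrapper.pyx -- thin Cython layer over an existing native library
    # (mylib.h and mylib_compute are hypothetical names)
    cdef extern from "mylib.h":
        double mylib_compute(double x, double y)

    def compute(double x, double y):
        """Python-callable entry point; the real work stays in C."""
        return mylib_compute(x, y)

The def function is the only place where Python objects appear at all; mylib_compute itself never sees a PyObject.

Stefan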