Not that long ago Brett, Barry, and I were talking about how to get
extension authors to move away from the C-API. Cython is the obvious
choice, but it isn't an official tool nor does it necessarily make
sense to make it one. Regardless, it would help all parties involved
if there *were* an official tool that was part of CPython (i.e. in the
repo). Cython could build on top of it and extension authors would be
encouraged to use it or Cython. If such a thing makes sense, I figure
we would follow the pattern set by asyncio (relative to twisted).
On 2018-07-31 18:42, Brett Cannon wrote:
> Yes, what Eric is suggesting is a baseline tool like Cython be added to
> Python itself so it becomes the minimum, common tool that we point all
> extension authors to.
You didn't answer Victor's "stupid question": we can already point
extension authors to Cython. Why do we need a new tool which will very
likely have less features than the already-existing Cython?
> then we will have to provide an FFI compiler for people to use in at least
> the simple cases.
Cython is *not* an FFI tool. It can be used for FFI but that's just one
of its many use cases.
On 2018-07-31 14:52, Victor Stinner wrote:
> For C extensions which use (1), what's the benefit of moving to (2)?
Source compatibility between Python releases: the source code compiles
and works the same way on all Python versions (above a certain minimum
version).
Last year, I gave a talk at the Language Summit (during PyCon) to
explain that CPython should become 2x faster to remain competitive.
IMHO all attempts to optimize Python (CPython forks) have failed
because they have been blocked by the C API, which imposes strict
constraints on the implementation.
I started to write a proposal to change the C API to hide
implementation details and prepare CPython for future changes. It
would allow experimenting with optimization ideas without losing
support for C extensions.
C extensions are a large part of Python's success. They are also the
reason why PyPy hasn't replaced CPython yet. PyPy's cpyext remains
slower than CPython because PyPy has to mimic CPython, which adds
significant overhead (even if PyPy developers are working *hard* to
reduce it).
I created a new GitHub project to discuss how to introduce
backward-incompatible changes in the C API without breaking all C
extensions:
The source can be found at:
I would like to create a team of people who want to work on this
project: CPython, PyPy, Cython and anyone who depends on the C API.
Contact me in private if you want to be added to the GitHub project.
I propose to discuss this on the capi-sig mailing list, since I would like
to involve people from various projects, and I don't want to bother
you with the high traffic of python-dev.
PS: I added some people as BCC ;-)
It looks like fixing the Python C API is a lot of work and might fail,
so I was wondering if it would be better to put effort into reducing
the C API by making more use of CFFI within the Python standard
library.
* It feels like the C API is currently very broad because it provides
a second interface layer to a large chunk of the Python object space
(in addition to the one visible to Python code).
* I think it was Armin Rigo's observation when writing CFFI that there
is a better API that already exists -- i.e. C types and functions.
* If we're going to do a lot of work, would it not be better to push
instead for moving towards CFFI as the interface between C and Python?
I'm not sure whether this is orthogonal to Victor's current proposal
or not -- it might just be one route to achieving it.
Advantages over the current route:
* There's a clearer direction for people to head in, so it's easier to
make progress.
* A lot of work has already been done (e.g. it works for PyPy, a lot
of libraries already exist that use it).
Maybe Cython is an equally good option to CFFI in this scenario -- I
don't know Cython though, so I have no deep opinion on that.
The Python/C API and CFFI are different things which solve (mostly)
different problems. Briefly, the C API lets you access Python from C
while CFFI lets you access C from Python.
There are overlapping use cases, where you could replace the usage of
the C API by CFFI. But that certainly won't work in general (at least, I
don't see it).
On 2018-07-31 at 18:25, Victor Stinner wrote:
> 2018-07-31 18:03 GMT+02:00 Antoine Pitrou <solipsis(a)pitrou.net>:
>>> For example, PyPy uses different memory allocators depending on the
>>> scope and the lifetime of an object. I'm not sure that you can
>>> implement such optimization if you are stuck with reference counting.
>> But what does reference counting have to do with memory allocators
> I repeated what I heard from other developers and made my own guess.
> But it seems like you disagree. I think that it's better that you ask
> directly authors of other Python implementations like PyPy or Pyston
> their mood on the C API and if they see any opportunity to make their
> runtime faster by modifying the current C API.
Why don't you go and ask them? You are the one making claims such as
"changing the C API will allow making CPython 2x faster", and trying to
back them with allusions about how PyPy got 5x faster than CPython, as
if there was an obvious connection between the C API and PyPy's performance.
I'm not interested in finding arguments in favour of your position.
That's your job, not mine.
>>> How can we make CPython 2x faster? Why everybody, except of PyPy,
>>> failed to do that?
>> Because PyPy spent years working full time on a JIT compiler. It's also
>> written in (a dialect of) Python, which helps a lot with experimenting
>> and building abstractions, compared to C or even C++.
> Ok. But PyPy doesn't support natively the C API, and cpyext is still
> causing troubles to PyPy.
I'm sorry, but it seems you have a hard time sticking to a precise
topic. Are you talking about PyPy's performance, or about how to make
CPython faster, or are you talking about issues with cpyext?
If you want to make cpyext's life better, I would recommend asking the
cpyext authors for suggestions, instead of trying to second-guess
their needs.
> If we modify CPython to make similar changes than PyPy, we will hit
> the same issues.
I don't understand what this sentence means, sorry.
On 2018-07-31 17:59, Eric Snow wrote:
> Regardless, it would help all parties involved
> if there *were* an official tool that was part of CPython (i.e. in the
> repo).
Why does the tool need to be "official"? I mean, is Sphinx considered to
be the "official" tool for Python documentation? I would say that it is
de facto standard and that's perfectly fine.
On Monday, 30 July 2018, Stefan Behnel <python_capi(a)behnel.de> wrote:
> Gustavo Carneiro schrieb am 30.07.2018 um 16:18:
>> What is the relation to PEP 384?
> The "stable ABI" is a restricted API/ABI that breaks backwards
> compatibility with all extension modules out there. If you write code
> against this API, then you can make it work in newer CPython versions
> without recompilation. This does not apply to any previously existing C
> extension, which means that any changes to the non-stable parts of the
> C-API would still break the world.
Replacing macros that access structure fields with function calls makes
the generated machine code compatible with the stable ABI. It means
that C extensions using PyList_GET_ITEM() become compatible with new
Python runtimes using the stable ABI, without having to modify their
code. PEP 384 requires replacing PyList_GET_ITEM() with
PyList_GetItem(). I would like to continue the work on this PEP, to
allow more C extensions to use the stable ABI.
Antoine Pitrou replied to my "Let's change the C API!" thread on
python-dev. I reply here in a new thread on the capi-sig mailing list.
2018-07-31 18:03 GMT+02:00 Antoine Pitrou <solipsis(a)pitrou.net>:
>> For example, PyPy uses different memory allocators depending on the
>> scope and the lifetime of an object. I'm not sure that you can
>> implement such optimization if you are stuck with reference counting.
> But what does reference counting have to do with memory allocators
I repeated what I heard from other developers and made my own guess.
But it seems like you disagree. I think it's better to ask the authors
of other Python implementations, like PyPy or Pyston, directly about
their mood on the C API and whether they see any opportunity to make
their runtimes faster by modifying the current C API.
>> Do you think that it's doable to port numpy to Cython? It's made of
>> 255K lines of C code.
> Numpy is a bit special as it exposes its own C API, so porting it
> entirely to Cython would be difficult (how do you expose a C macro in
> Cython?). Also, internally it has a lot of macro-generated code for
> specialized loop implementations (metaprogramming in C :-)).
> I suppose some bits could be (re)written in Cython. Actually, the
> numpy.random module is already a Cython module.
numpy is one example of a piece of code which is old and big:
converting it to Cython would take a lot of time, and nobody wants
to do it.
I expect that there are many other C extensions whose maintainers
don't want to spend time converting their code to Cython. I don't
plan to modify hundreds of C extensions.
I did that in the past to port modules to Python 3 and it was painful.
I would like to design a change small enough to get what I want,
without having to spend years porting all C extensions manually, one
by one.
>> How can we make CPython 2x faster? Why everybody, except of PyPy,
>> failed to do that?
> Because PyPy spent years working full time on a JIT compiler. It's also
> written in (a dialect of) Python, which helps a lot with experimenting
> and building abstractions, compared to C or even C++.
Ok. But PyPy doesn't natively support the C API, and cpyext is still
causing trouble for PyPy.
If we make similar changes to PyPy's in CPython, we will hit the same
issues.