[Python-Dev] cffi in stdlib

Stefan Behnel stefan_ml at behnel.de
Sat Mar 2 10:10:24 CET 2013


Hi,

Looks like no one has taken on the role of the Advocatus Diaboli yet. =)

Maciej Fijalkowski, 26.02.2013 16:13:
> I would like to discuss at the language summit a potential inclusion
> of cffi[1] into the stdlib. This is a project Armin Rigo has been
> working on for a while, with some input from other developers. It
> seems that the main reason why people would prefer ctypes over cffi
> these days is "because it's included in the stdlib", which is not
> generally the reason I would like to hear. Our calls to not use C
> extensions and to use an FFI instead have seen very limited success
> with ctypes and quite a lot more since cffi got released. The API is
> fairly stable right now, with minor changes going in, and it will
> definitely stabilize before the Python 3.4 release.

You say that "the API is fairly stable". What about the implementation?
Will users want to install a newer version next to the stdlib one within a
couple of months, just because the Python 3.4 parser has a bug that you
still need to support (since code depends on it), or because a new feature
is required to make it work with library X, or ...? What's the upgrade path
in that case? How will you support it? What long-term guarantees do you
give to users of the stdlib package?

Or, in other words, will the normal fallback import for cffi look like this:

    try: import stdlib_cffi
    except ImportError: import external_cffi

or will the majority of users end up preferring this order:

    try: import external_cffi
    except ImportError: import stdlib_cffi
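
Either order could also be made version-aware instead of hard-coded. A
minimal, self-contained sketch of that idea (the module names and version
numbers below are the hypothetical ones from this thread, stubbed out so
the snippet runs anywhere):

    import sys
    import types

    # Stub out the two hypothetical modules from this thread so the
    # sketch runs anywhere; real code would import the actual packages.
    for name, version in (("stdlib_cffi", "0.5"), ("external_cffi", "0.6")):
        stub = types.ModuleType(name)
        stub.__version__ = version
        sys.modules[name] = stub

    def load_cffi():
        """Prefer whichever copy is newer, rather than a fixed order."""
        candidates = []
        for name in ("stdlib_cffi", "external_cffi"):
            try:
                candidates.append(__import__(name))
            except ImportError:
                pass
        if not candidates:
            raise ImportError("no cffi implementation found")
        return max(candidates,
                   key=lambda m: tuple(int(p) for p in m.__version__.split(".")))

    print(load_cffi().__name__)  # picks the newer copy: external_cffi
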


> * Work either at the level of the ABI (Application Binary Interface)
> or the API (Application Programming Interface). Usually, C libraries
> have a specified C API but often not an ABI (e.g. they may document a
> “struct” as having at least these fields, but maybe more). (ctypes
> works at the ABI level, whereas Cython and native C extensions work at
> the API level.)

OK, so there are cases where a C compiler must be installed in order to use
the API level. That will be very complicated for users to get working under
Windows, for example, which means they are best off avoiding the
API-support feature altogether if they want portable code. Wouldn't it be
simpler to target Windows with a prebuilt binary than with dynamically
compiled C code? Is there a way to translate an API description into a
static ABI description for a known platform ahead of time, or do I have to
implement this myself in a separate ABI code path, by figuring out a
suitable ABI description on my own?

In which cases would users choose to use the C API support? And, is this
dependency explicit or can I accidentally run into the dependency on a C
compiler for my code without noticing?
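
For contrast, the ABI level is what ctypes already offers: you declare the
binary signature by hand and no compiler is involved, but nothing checks
your declaration against the real headers either. A minimal sketch on a
POSIX system:

    import ctypes

    # ABI-level binding: the signature below is declared by hand; no
    # header or compiler verifies it, so a wrong declaration fails
    # silently at runtime rather than at build time.
    libc = ctypes.CDLL(None)  # symbols already loaded into the process
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t
    print(libc.strlen(b"hello, world"))  # -> 12
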


> * We try to be complete. For now some C99 constructs are not
> supported, but all C89 should be, including macros (and including
> macro “abuses”, which you can manually wrap in saner-looking C
> functions).

Ok, so the current status actually is that it's *not* complete, and that
future versions will have to catch up in terms of C compatibility. So, why
do you think it's a good time to get it into the stdlib *now*?
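
To make the macro point concrete: a C macro has no address, so no FFI can
bind it directly, but a thin C function wrapping it can be declared and
called like any other. A guarded sketch, assuming the cffi package and a
working C compiler are available (the wrapper name and the use of
ffi.verify() are illustrative):

    try:
        import cffi

        ffi = cffi.FFI()
        ffi.cdef("int square_of_max(int a, int b);")
        lib = ffi.verify("""
            #define MAX(a, b)  ((a) > (b) ? (a) : (b))
            /* saner-looking wrapper around the macro */
            int square_of_max(int a, int b) { int m = MAX(a, b); return m * m; }
        """)
        print(lib.square_of_max(2, 3))
    except Exception as exc:  # no cffi or no C compiler on this machine
        print("cffi API mode unavailable:", exc)

Note that this is exactly where the hidden compiler dependency bites: the
same script works on one machine and falls over on another.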


> * We attempt to support both PyPy and CPython, with a reasonable path
> for other Python implementations like IronPython and Jython.

You mentioned that it's fast under PyPy and slow under CPython, though.
What would be the reason to use it under CPython then? Some of the projects
that use it (you named a couple) also have roughly equivalent native
implementations for CPython already. Do you have any benchmarks that
compare those to their cffi versions under CPython? Is the slowdown within
reasonable bounds?

Others have already mentioned the lack of C++ support. It's fine to say
that you deliberately want to support only C, but it's also true that this
is a substantial restriction.

Stefan



