Hi Stefan,

On Sat, Mar 2, 2013 at 10:10 AM, Stefan Behnel <stefan_ml@behnel.de> wrote:
> You say that "the API is fairly stable". What about the implementation? Will users want to install a new version next to the stdlib one in a couple of months,
I think that the implementation is fairly stable as well. The only place where I can foresee potential changes is in details like the location of temporary files, for example, which needs to be discussed (probably with people from python-dev too) at some point.
> just because there was a bug in the parser in Python 3.4 that you still need to support because there's code that depends on it, or because there is this new feature that is required to make it work with library X, or ... ? What's the upgrade path in that case? How will you support this? What long-term guarantees do you give to users of the stdlib package?
I think these are general questions for any package that ends up in the stdlib. In the case of CFFI, it is now approaching a stability point. This is also because we are going to integrate it with the stdlib of PyPy soon. Bugs in the parser have not been found so far, but if any turn up, we will treat them like any other bug in the stdlib. For that matter, there is no obvious solution for the user either way: they generally have to wait for the next micro release to have the bug fixed.
> Or, in other words, will the normal fallback import for cffi look like this:
>     try:
>         import stdlib_cffi
>     except ImportError:
>         import external_cffi
> or will the majority of users end up preferring this order:
>     try:
>         import external_cffi
>     except ImportError:
>         import stdlib_cffi
I would rather drop the external CFFI entirely, or keep it only to provide backports to older Python versions. I personally see no objection to calling the stdlib one "cffi" too (but any other name is fine as well).
> ... Wouldn't it be simpler to target Windows with a binary than with dynamically compiled C code? Is there a way to translate an API description into a static ABI description for a known platform ahead of time, or do I have to implement this myself in a separate ABI code path by figuring out a suitable ABI description myself?
No, I believe that you missed this point: when you make "binary" distributions of a package with setup.py, it precompiles a library for CFFI too. So yes, you need a C compiler on machines where you develop the program, but not on machines where you install it. The needs are the same as when writing custom C extension modules by hand.
> In which cases would users choose to use the C API support? And, is this dependency explicit or can I accidentally run into the dependency on a C compiler for my code without noticing?
A C compiler is clearly required, but if and only if you call the function verify() and pass it arguments that differ from the previous call (i.e. the result is not already in the cache).
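As a minimal sketch of what this looks like in practice (the names TWICE and twice are purely illustrative, not part of any real library), a verify() call compiles a small extension the first time it runs, and identical later calls are served from the on-disk cache without invoking the compiler again. This also shows the trick of wrapping a macro in a saner-looking C function:

    # Sketch only: assumes cffi is installed and a C compiler is available.
    from cffi import FFI

    ffi = FFI()
    ffi.cdef("""
        int twice(int x);    /* the plain function declared to Python */
    """)
    lib = ffi.verify("""
        #define TWICE(x) ((x) + (x))   /* a macro we cannot call directly */
        int twice(int x) { return TWICE(x); }
    """)

    print(lib.twice(21))    # 42

Running this a second time with the exact same cdef() and verify() arguments hits the cache, so no compiler is needed then.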
>> * We try to be complete. For now some C99 constructs are not supported, but all C89 should be, including macros (and including macro "abuses", which you can manually wrap in saner-looking C functions).
> Ok, so the current status actually is that it's *not* complete, and that future versions will have to catch up in terms of C compatibility. So, why do you think it's a good time to get it into the stdlib *now*?
To be honest, I don't have a precise list of the C99 constructs that are missing. I used to know of a few of them, but those were eventually supported. The support is unlikely to ever be fully complete, and even if it is, a fairly slow process should be fine --- just like a substantial portion of the stdlib, which gets occasional updates from one Python version to the next.
> You mentioned that it's fast under PyPy and slow under CPython, though. What would be the reason to use it under CPython then?
The reason is just ease of use. I claim that it takes less effort (and little C knowledge), and is less prone to bugs and leaks, to write a perfectly working prototype of a module to access a random C library. I do not claim that you'll get the topmost performance. For a lot of cases performance doesn't matter; and when it does, on CPython, you can still write a C extension module by hand (as long as you make sure to keep around the CFFI version for use by PyPy). This is how I see it, anyway. As for the fact that we are busy rewriting existing, well-tested native CPython extensions with CFFI: that is really only of use for PyPy.
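To illustrate the ease-of-use point, here is a sketch of the compiler-free ABI mode --- the "better ctypes" use case. It assumes a POSIX system, where dlopen(None) loads the standard C library; the declarations are simply copied from the man pages:

    # Sketch only: assumes cffi is installed; no C compiler is involved.
    from cffi import FFI

    ffi = FFI()
    ffi.cdef("""
        int abs(int x);
        double atof(const char *s);
    """)
    C = ffi.dlopen(None)      # the standard C library, on POSIX

    print(C.abs(-42))         # 42
    print(C.atof(b"2.5"))     # 2.5

Unlike ctypes, the argument and return types come from the C declarations themselves, so there is no separate argtypes/restype bookkeeping to get wrong.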
> Others have already mentioned the lack of C++ support. It's ok to say that you deliberately only want to support C, but it's also true that that's a substantial restriction.
I agree that it's a restriction, or rather a possible extension that has not been done. I don't have plans to do it myself. Please also keep in mind that we pitch CFFI as a better ctypes, not as the ultimate tool to access any foreign language.

A bientôt,

Armin.