[Python-Dev] Should standard library modules optimize for CPython?
Stefan Behnel
stefan_ml at behnel.de
Tue Jun 3 19:00:16 CEST 2014
Sturla Molden, 03.06.2014 17:13:
> Stefan Behnel wrote:
>
>> Thus my proposal to compile the modules in CPython with Cython, rather than
>> duplicating their code or making/keeping them CPython specific. I think
>> reducing the urge to reimplement something in C is a good thing.
>
> For algorithmic and numerical code, Numba has already proven that Python
> can be JIT compiled to performance comparable to C compiled with -O2.
> For non-algorithmic code, the
> speed determinants are usually outside Python (e.g. the network
> connection). Numba is becoming what the "dead swallow" should have been.
> The question is rather whether the standard library should use a JIT
> compiler like Numba. Cython is great for writing C extensions while
> avoiding all the
> details of the Python C API. But for speeding up algorithmic code, Numba is
> easier to use.
I certainly agree that a JIT compiler can apply much better optimisations to
Python code than a static compiler can, especially data-driven optimisations.
However, Numba comes with major dependencies, even runtime dependencies.
From previous discussions on this list, I gathered that there are major
objections to adding such a large dependency to CPython, since it can
just as well be installed as an external package by users who want it.
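For concreteness, the kind of algorithmic code in question looks roughly
like this (a minimal sketch, assuming the external numba and numpy
packages are installed; the function itself is just an illustration):

    import numpy as np
    from numba import jit

    @jit(nopython=True)
    def pairwise_sum(data):
        # a plain Python loop; Numba compiles it to machine code
        # the first time the function is called
        total = 0.0
        for i in range(data.shape[0]):
            total += data[i]
        return total

    pairwise_sum(np.arange(1000000.0))  # first call triggers compilation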
Static compilation, on the other hand, is a build time thing that adds no
dependencies that CPython doesn't have already. Distributions can even
package up the compiled .so files separately from the original .py/.pyc
files, if they feel like it, to make them selectively installable. So the
argument in favour is mostly a pragmatic one. If you can have 2-5x faster
code essentially for free, why not just go for it?
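To make that concrete, the build step could look roughly like the
following (a minimal sketch using plain Cython.Build; picking heapq.py as
the example module is my choice, not a decided target):

    # setup.py - compile an unmodified pure-Python module with Cython
    from distutils.core import setup
    from Cython.Build import cythonize

    setup(
        # translates the .py source to C and builds it as an
        # extension module (heapq.so)
        ext_modules=cythonize("heapq.py"),
    )

Running "python setup.py build_ext --inplace" drops a heapq.so next to
heapq.py, and the import system prefers the extension module over the
.py file, so nothing else has to change.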
Stefan