(extension) module dependencies in setuptools
I would like to propose a feature for setuptools: runtime enforcement of dependencies specified at build time by setup.py. I appreciate that "pkg_resources.require('foo==1.0')" works, but this requires a tedious update of version numbers in affected source files every time you upgrade foo and rebuild the target package. I'm thinking, in particular, of extension modules built on a particular version of another package with its own C interface. Think matplotlib.backends._ns_backend_agg depending on numpy.

It would be really nice, in matplotlib's setup.py file, to say something like:

    import pkg_resources
    from setuptools import setup, Extension
    import numpy

    numpy_include_dirs = numpy.get_numpy_include()

    setup(name='matplotlib',
          ext_modules=[Extension('matplotlib.backends._ns_backend_agg',
                                 sources=['src/backends/_ns_backend_agg.cpp'],
                                 include_dirs=[numpy_include_dirs],
                                 runtime_requires=[pkg_resources.get_distribution('numpy').as_requirement()],
                                 )],
          )

Alternatively, the whole package (not just the extension module) might depend on a particular version:

    import pkg_resources
    from setuptools import setup, Extension
    import numpy

    numpy_include_dirs = numpy.get_numpy_include()

    setup(name='matplotlib',
          ext_modules=[Extension('matplotlib.backends._ns_backend_agg',
                                 sources=['src/backends/_ns_backend_agg.cpp'],
                                 include_dirs=[numpy_include_dirs],
                                 )],
          runtime_requires=[pkg_resources.get_distribution('numpy').as_requirement()],
          )

Now, if I installed an additional, newer numpy, a couple of things could happen: if my application imports matplotlib first, setuptools puts the appropriate (older) numpy into the global working_set and I get this older numpy when I do "import numpy". If my application imports (the newer) numpy first and then matplotlib, an exception is raised saying that matplotlib depends on versions such-and-such, but version so-and-so is already imported.
Because I'm thinking primarily of extension modules, there are additional reasons why I don't want to specify a runtime check using a hardcoded pkg_resources.require() in the package itself. First, the actual requirement may be a C-interface issue leading to segfaults and other nastiness if ignored or forgotten, thus justifying this easier way to specify dependencies. Second, an extension module, by definition, is not Python, so it will take more programming effort to write the call to pkg_resources.require(). Third, I really don't want to have to convince all the projects out there to modify multiple files to use setuptools. I'm already attempting to dispel enough anti-egg sentiment (for reasons I don't understand) resulting from slight changes to setup.py. What do folks think about this idea? Would such a feature be possible and desirable in setuptools? Cheers! Andrew
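The proposed check above can be sketched in plain Python. This is only an illustration of the behavior Andrew describes, not a real setuptools API: `check_runtime_requirement` and `built_against` are hypothetical names, and the version recorded at build time is simulated with a constant.

```python
import sys
import types

def check_runtime_requirement(modname, built_against):
    """Refuse to proceed if `modname` is already imported at a version
    other than the one recorded when this package was built."""
    mod = sys.modules.get(modname)
    if mod is None:
        # Nothing imported yet; the dependency machinery could still
        # activate the correct version before the first import.
        return
    found = getattr(mod, "__version__", None)
    if found != built_against:
        raise ImportError(
            "this package was built against %s %s, but version %s "
            "is already imported" % (modname, built_against, found))

# Simulate the failure case with a stand-in module:
fake = types.ModuleType("fakenumpy")
fake.__version__ = "0.9.4"
sys.modules["fakenumpy"] = fake

check_runtime_requirement("fakenumpy", "0.9.4")  # versions match: no error
try:
    check_runtime_requirement("fakenumpy", "0.9.2")
except ImportError as exc:
    print(exc)
```

The key design point is that the required version string would be written by setuptools at build time, so no source file in the package ever needs hand-editing when the dependency is upgraded and the package rebuilt.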
At 11:50 PM 1/23/2006 -0800, Andrew Straw wrote:
> I would like to propose a feature for setuptools: runtime enforcement of dependencies specified at build time by setup.py. I appreciate that "pkg_resources.require('foo==1.0')" works, but this requires a tedious update of version numbers in affected source files every time you upgrade foo and rebuild the target package. I'm thinking, in particular, of extension modules built on a particular version of another package with its own C interface. Think matplotlib.backends._ns_backend_agg depending on numpy.
I'm getting close to starting on an overhaul of setuptools' experimental shared library support to allow dynamic linking to libraries included in a different egg than the one the extension is in. When I do that, there's definitely going to be some inter-egg dependency checking of sorts, except it will be based on *exact* version numbers, because it will be tied to the egg version you linked against.

I suppose it's possible I could do something similar for other sorts of extensions, but it seems to me that the simple way to do inter-extension API checks in Python today is to have a .h supplied by one extension containing a version, and having a PyCObject you import from the target extension with a function pointer that gets the built version. You then compare the API version you compiled against with the API version present at runtime. The source code doesn't change; you're just verifying that the API version matches what you compiled against.

Notice that this doesn't require setuptools in order to be useful. It's simply part of best practices for C extensions that provide a C API for other extensions to use. Python itself uses this technique as well.
> I'm already attempting to dispel enough anti-egg sentiment (for reasons I don't understand) resulting from slight changes to setup.py.
Cognitive dissonance is sometimes a harsh taskmaster. :) Luckily my suggestion is entirely independent of eggs and setuptools.
Phillip J. Eby wrote:
> At 11:50 PM 1/23/2006 -0800, Andrew Straw wrote:
>> I would like to propose a feature for setuptools: runtime enforcement of dependencies specified at build time by setup.py. I appreciate that "pkg_resources.require('foo==1.0')" works, but this requires a tedious update of version numbers in affected source files every time you upgrade foo and rebuild the target package. I'm thinking, in particular, of extension modules built on a particular version of another package with its own C interface. Think matplotlib.backends._ns_backend_agg depending on numpy.
> I'm getting close to starting on an overhaul of setuptools' experimental shared library support to allow dynamic linking to libraries included in a different egg than the one the extension is in. When I do that, there's definitely going to be some inter-egg dependency checking of sorts, except it will be based on *exact* version numbers, because it will be tied to the egg version you linked against.
>
> I suppose it's possible I could do something similar for other sorts of extensions, but it seems to me that the simple way to do inter-extension API checks in Python today is to have a .h supplied by one extension containing a version, and having a PyCObject you import from the target extension with a function pointer that gets the built version. You then compare the API version you compiled against with the API version present at runtime. The source code doesn't change; you're just verifying that the API version matches what you compiled against.
>
> Notice that this doesn't require setuptools in order to be useful. It's simply part of best practices for C extensions that provide a C API for other extensions to use. Python itself uses this technique as well.
Absolutely. In fact, I just implemented exactly this in numpy. I was thinking, however, of the case of older extension modules which don't do this. I suppose I could submit patches there, too, but there are only so many hours in the day... Anyhow, when you implement version checking for shared libraries, if it looks straightforward to implement it for other extensions, you have my vote for going ahead and adding it in... Cheers! Andrew
participants (2):
- Andrew Straw
- Phillip J. Eby