
At 11:50 PM 1/23/2006 -0800, Andrew Straw wrote:
I would like to propose a feature for setuptools: runtime enforcement of dependencies specified at build time by setup.py. I appreciate that "pkg_resources.require('foo==1.0')" works, but this requires a tedious update of version numbers in affected source files every time you upgrade foo and rebuild the target package. I'm thinking, in particular, of extension modules built on a particular version of another package with its own C interface. Think matplotlib.backends._ns_backend_agg depending on numpy.
I'm getting close to starting an overhaul of setuptools' experimental shared library support, to allow dynamic linking to libraries included in a different egg than the one the extension is in. When I do that, there will definitely be some inter-egg dependency checking, except it will be based on *exact* version numbers, because it will be tied to the egg version you linked against. I suppose it's possible I could do something similar for other sorts of extensions, but it seems to me that the simple way to do inter-extension API checks in Python today is to have a .h supplied by one extension containing a version number, and a PyCObject you import from the target extension wrapping a function pointer that returns the built version. You then compare the API version you compiled against with the API version present at runtime. The source code doesn't change; you're just verifying that the API version matches what you compiled against.

Notice that this doesn't require setuptools in order to be useful. It's simply part of best practices for C extensions that provide a C API for other extensions to use. Python itself uses this technique as well.
I'm already attempting to dispel plenty of anti-egg sentiment (for reasons I don't understand) resulting from even slight changes to setup.py.
Cognitive dissonance is sometimes a harsh taskmaster. :) Luckily my suggestion is entirely independent of eggs and setuptools.