Before I get started with the meat of this message (it's pretty long - for me, anyway), I should give a brief overview. I'm developing Python bindings for a set of interfaces called ITAPS, which contains a few independent libraries. If you want to look at my setup.py file (it's pretty scary), you can download the source at http://pypi.python.org/pypi/PyTAPS/0.9.1.
Basically, I have three somewhat-related questions, which I wrote a big pile of words about below. The short version is:
* Are multiple distributions inherently better/saner than one distribution with a handful of optional features?
* What's the best way to detect whether libraries exist to turn said features on/off?
* How do I install header files when using setuptools (or easy_install)?
Ok, now for the long version:
First, this distribution is fairly complicated, with a handful of setuptools Features controlling the various libraries I want to work with. I've waffled on whether this should be one distribution or several, but the latter is a little problematic because the "core" module doesn't actually correspond to an independent library on the C side; it really just piggybacks on each of the libraries. So basically there's no reason for the "core" module to exist in isolation, and it would actually be impossible to compile unless at least one of the "real" libraries was present. Is rolling this all together a terrible idea? Or a good one?
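For concreteness, here's a trimmed-down sketch of the single-distribution layout; the names and file paths below are simplified, not what's actually in my setup.py (that's in the PyPI tarball linked above):

```python
# Sketch of the one-distribution approach: each optional library gets a
# setuptools Feature, and the shared "core" package is always installed.
# (Names and paths here are illustrative.)
from setuptools import setup, Extension, Feature

imesh = Feature(
    'bindings for the iMesh interface',
    standard=True,                 # on by default; --without-imesh disables it
    ext_modules=[Extension('itaps.iMesh', ['itaps/iMesh_mod.c'])],
)

setup(
    name='PyTAPS',
    version='0.9.1',
    packages=['itaps'],            # the "core" module lives in here
    features={'imesh': imesh},
)
```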
Second, having a single distribution with multiple features that can be enabled/disabled obviously requires a bit of work for configuration. Right now, it tries to detect if each library is installed, and if not, it disables that feature automatically. I mostly do this to make easy_install "just work" even if you don't have everything.
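(For the curious, the cheapest probe I know of is something like the following; but it only sees shared libraries on the default linker search path and says nothing about headers, which is why I don't rely on anything this naive by itself:)

```python
# One lightweight probe for "is this C library installed?". It only finds
# shared libraries on the default linker search path and says nothing about
# headers, so it can't replace a real compile test.
from ctypes.util import find_library

def have_library(name):
    """Return True if the linker can locate lib<name>."""
    return find_library(name) is not None
```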
I should explain how I do configuration now: each library has a Makefile fragment that defines useful variables, e.g. $MYLIB_INCLUDEDIRS, that I read in and add into the appropriate places in the Extension. Each of my Features has a global --feature-path option that lets you point to the Makefile fragment. If you don't specify that, it steps through some environment variables and tries to find it. If it still can't find it, it'll assume you specified all the stuff you need as options to build_ext. Then I try to compile a simple file with the necessary header to make sure I got the paths right.
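The fragment-reading part is nothing exotic; stripped of the surrounding machinery, it amounts to something like this (variable names invented, and it doesn't handle $(...) expansion or include directives, just plain assignments):

```python
# Minimal parser for the kind of Makefile fragment described above: lines
# like "IMESH_INCLUDEDIR = /opt/itaps/include" become dict entries.
# Handles =, :=, and ?= but not $(...) expansion or "include" directives.
import re

_ASSIGN = re.compile(r'^\s*([A-Za-z_]\w*)\s*[:?]?=\s*(.*?)\s*$')

def read_makefile_vars(text):
    """Parse simple VAR = value assignments out of a Makefile fragment."""
    result = {}
    for line in text.splitlines():
        if line.lstrip().startswith('#'):
            continue                      # skip comments
        m = _ASSIGN.match(line)
        if m:
            result[m.group(1)] = m.group(2)
    return result

fragment = """
# iMesh defs (hypothetical)
IMESH_INCLUDEDIR = /opt/itaps/include
IMESH_LIBS = -liMesh -lhdf5
"""
flags = read_makefile_vars(fragment)
# flags['IMESH_LIBS'].split() -> ['-liMesh', '-lhdf5']
```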
All of this work is done in an overridden version of build_ext (grabbing the Makefile fragment happens during finalize_options, and the compilation test happens during run). This seems pretty hacky to me, especially since there's a config command in distutils that does some of what I want. However, I'm not sure how to actually *use* it. I assume I override the command, but what would it look like from a user's perspective? How do I connect it up to the build process?
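My best guess at what that would look like is below: config.try_compile() already does the scratch-file-plus-compiler dance, and build_ext could chain to it via run_command. Header and attribute names are illustrative, and I haven't actually wired this into my setup.py:

```python
# Sketch of moving the compile probe into a distutils "config" command.
# A user could run "python setup.py config build", or build_ext can
# trigger it itself, as below. (Names here are illustrative.)
from distutils.command.build_ext import build_ext as _build_ext
from distutils.command.config import config as _config

class config(_config):
    def run(self):
        # try_compile() writes a scratch C file and invokes the compiler;
        # it returns True/False instead of raising on failure.
        self.distribution.have_imesh = self.try_compile(
            'int main(void) { return 0; }\n',
            headers=['iMesh.h'],
            include_dirs=self.include_dirs)

class build_ext(_build_ext):
    def run(self):
        self.run_command('config')   # hook the probe into the build chain
        _build_ext.run(self)

# setup(..., cmdclass={'config': config, 'build_ext': build_ext})
```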
Finally, I'd like to be able to install header files from my distribution. Looking through the distutils docs, there's an install_headers command that gets run as a part of install, and I can just specify headers=['foo.h'] as an option to setup. However, this doesn't seem to work with setuptools/distribute; it looks like they hijack the install command and, instead of actually running it directly, they manually run install_lib, install_data, etc.
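For reference, the vanilla-distutils spelling I'm talking about is just this (header path illustrative):

```python
# Under plain distutils, install_headers copies these files into
# <prefix>/include/pythonX.Y/<distname>/ as part of "setup.py install".
from distutils.core import setup

setup(
    name='PyTAPS',
    version='0.9.1',
    headers=['common/iBase_Python.h'],   # path is illustrative
)
```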
This is made worse by the fact that, even if I stuck with vanilla distutils, easy_install appears to inject its version of Distribution into my setup.py anyway. (There's a way to work around that, but it's really messy. Think yanking the distutils.core Distribution class - "_Distribution" in the code - out of setuptools.dist. Barf.)
If you got all the way here, thanks for reading! Hopefully someone with a little more experience than me in this can give me a few pointers.