[Distutils] Packaging dependencies

Keshav Kini keshav.kini at gmail.com
Tue Jun 12 22:40:23 CEST 2012


Hi,

We're using distribute to package our software, and we'd like to
customize the "sdist" command. Specifically, we would like to
recursively find all packages necessary to build our package from a
base Python install, download them, and store them in a directory
inside our source distribution tarball; then, when the tarball is
unpacked and setup.py is run, we want "install" to use those cached
packages. (This is to support installation on machines that can't or
shouldn't access the network.)

One solution I found in the distribute docs was to use `easy_install
-zmaxd dirname package` on our package to generate and store dependency
eggs, and then `easy_install -H none -f dirname` when installing the
package. Unfortunately, this builds binary eggs for any dependencies
that contain C extensions, whereas we would like to ship only the
sources and build the dependencies when our package itself is built
(to support multiple platforms). We would also like `setup.py install`
to just work in the extracted directory, without the user having to
know that source packages were shipped inside the tarball.
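
For concreteness, the round trip from the docs looks like this
("deps/" and "mypackage" are placeholders for our actual names):

    # on a machine with network access: cache dependencies as built eggs
    easy_install -zmaxd deps/ mypackage
    # on the target machine: install offline, using only the cache
    easy_install -H none -f deps/ mypackage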

So I'm wondering whether I can do this by importing things from
distribute itself and writing some nontrivial code in setup.py. Can
someone point me toward the best way to do this?
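
To make the question concrete, here's the kind of untested sketch I'm
imagining for the sdist half. Everything here is guesswork on my part:
"deps" and "sdist_with_deps" are names I made up, and this only
fetches our direct requirements, not the recursive closure I actually
want:

    import os
    from setuptools.command.sdist import sdist as _sdist
    from setuptools.package_index import PackageIndex
    from pkg_resources import parse_requirements

    class sdist_with_deps(_sdist):
        """sdist variant that first downloads source tarballs of our
        direct requirements into deps/ so they ship in the sdist."""
        def run(self):
            deps_dir = 'deps'
            if not os.path.isdir(deps_dir):
                os.makedirs(deps_dir)
            index = PackageIndex()
            reqs = getattr(self.distribution,
                           'install_requires', None) or []
            for req in parse_requirements(reqs):
                # source=True asks for an sdist, not a built egg
                index.fetch_distribution(req, deps_dir,
                                         force_scan=True, source=True)
            _sdist.run(self)

    # then wire it up:
    # setup(..., cmdclass={'sdist': sdist_with_deps})

(Plus, I suppose, a `graft deps` line in MANIFEST.in so the downloaded
tarballs actually end up inside the sdist.)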

I'm guessing I'll need to start by subclassing Environment to create a
fake "default environment" that looks like a bare Python 2.7.3 (say)
install. Then I'd need to modify "sdist" to download the dependency
tarballs that would be needed to install our package into that
Environment, and modify "install" to look in the bundled directory.
Is that about right?
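
As a guess at the "install" half: if I'm reading the EasyInstall docs
right, shipping a setup.cfg inside the tarball with something like the
following (again using the hypothetical deps/ directory) should keep
dependency resolution offline and pointed at the bundled directory:

    # setup.cfg, shipped in the sdist
    [easy_install]
    # same effect as `-H none -f deps/` on the command line
    allow_hosts = none
    find_links = deps/

Would that cover the "install" side, or does `setup.py install` not
consult these options the way a command-line easy_install run does?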

Thanks and sorry for the simple question.

-Keshav


