Hi,
We're using distribute to package our software, but we'd like to do something different with the "sdist" command. We would like to recursively find all packages necessary to build our package from a base Python install, download these packages, and store them all in a directory inside our source distribution tarball; then, when the tarball is unpacked and setup.py is run, we want "install" to use those cached packages. (This is to support installation on machines that can't or shouldn't access the network.)
One solution I found in the distribute docs was to use easy_install -zmaxd dirname package on our package to generate and store dependency eggs, and then easy_install -H None -f dirname when installing the package. Unfortunately this actually builds packages which have C extensions and the like in them, whereas we would like to ship only the source and build the dependencies when our package is built (to support multiple platforms). Also, we'd like setup.py install to just work in the directory without the user having to care about the fact that these source packages were shipped inside the tarball they just extracted.
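For concreteness, the workflow from the docs would look something like this, with ourpackage as a placeholder for our actual project name and dependencies/ as the cache directory:

    easy_install -zmaxd dependencies/ ourpackage
    easy_install -H None -f dependencies/ ourpackage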
So I'm wondering if I can actually do this by importing something from distribute itself and writing some nontrivial code in setup.py. Can someone point me to what would be the best way to do this?
I'm guessing I'll need to start by subclassing Environment to create a fake "default environment" that looks like a bare Python 2.7.3 (say) install. Then I'd need to somehow modify "sdist" to make it download the dependency tarballs that would be necessary if you were installing our packages into that Environment, and also modify "install" to make it look in the correct directory. Is that about right?
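For instance, I imagine something like this (untested) could stand in for the bare install, since an Environment scanned over an empty search path starts out containing no distributions:

    from pkg_resources import Environment

    # An Environment over an empty search path contains no distributions,
    # which should approximate what a bare Python install looks like.
    bare_env = Environment(search_path=[])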
Thanks and sorry for the simple question.
-Keshav
On Tue, Jun 12, 2012 at 4:40 PM, Keshav Kini keshav.kini@gmail.com wrote:
[...]
I've found that the find_links option works fine for this sort of thing. For example, you can put source tarballs of your dependencies in a directory under your source tree like dependencies/, and then in setup.cfg add:

    [easy_install]
    find_links = dependencies

Something like this has worked for me in the past, I think.
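If the target machines must never touch the network, I believe you can also pin the allowed hosts in the same section (this is the long form of the -H option):

    [easy_install]
    allow_hosts = None
    find_links = dependencies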
As for automatically downloading the source, adding a custom subclass of sdist seems like the way to go. The setuptools package contains the required machinery to download dependencies from PyPI (or another package index) with or without installing them.
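As a rough, untested sketch of what I mean (note it only fetches your package's direct install_requires; recursing into each downloaded sdist's own dependencies would take more work):

    import os
    from setuptools.command.sdist import sdist as _sdist
    from setuptools.package_index import PackageIndex
    from pkg_resources import Requirement

    class fetching_sdist(_sdist):
        """sdist variant that downloads source tarballs of the direct
        dependencies into dependencies/ before building the tarball."""

        def run(self):
            dest = 'dependencies'
            if not os.path.isdir(dest):
                os.makedirs(dest)
            index = PackageIndex()
            for spec in self.distribution.install_requires or []:
                # source=True asks for an sdist rather than a built egg
                index.fetch_distribution(Requirement.parse(spec), dest,
                                         force_scan=True, source=True)
            _sdist.run(self)

and then register it in setup():

    setup(..., cmdclass={'sdist': fetching_sdist})

You'd also need something like "graft dependencies" in MANIFEST.in so the downloaded tarballs actually end up in your sdist.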
Not sure I fully understand what the issue is regarding building C extensions.
Erik