To speed up deployments, we use pip's --download-cache option, so all builds on the box share the downloaded tarballs.
We also use our own local mirror, plus an extra Apache-served directory for packages that are not on PyPI.
I have also added a proxy that returns a 404 whenever pip tries to hit the net, to simulate easy_install's --allow-hosts option.
Everything is driven from a Makefile; see https://github.com/mozilla-services/tokenserver/blob/master/Makefile for an example. The script to create RPMs is at https://github.com/mozilla-services/mopytools
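For flavor, a minimal sketch of what such a Makefile target might look like. The cache path, mirror URL, and extras URL here are hypothetical stand-ins, not the actual values from the linked Makefile:

```make
# Hypothetical sketch -- paths and URLs are illustrative only.
PIP_CACHE  = /var/cache/pip-downloads      # shared by all builds on the box
PIP_MIRROR = http://pypi.internal/simple/  # local PyPI mirror
PIP_EXTRAS = http://pypi.internal/extras/  # Apache dir for non-PyPI packages

build:
	virtualenv .
	bin/pip install \
	    --download-cache=$(PIP_CACHE) \
	    --index-url=$(PIP_MIRROR) \
	    --find-links=$(PIP_EXTRAS) \
	    -r requirements.txt
```

The point of --index-url plus --find-links is that pip only ever talks to the local mirror and the Apache directory; combined with the 404 proxy, nothing can silently fall back to the public internet mid-deploy.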
thanks for the link to the makefile. we also use a download cache and a local mirror (although we're migrating to --find-links dirs instead)
for us, that still leaves the issue of wasteful "setup.py build/install" time (for many dependencies) in our repetitive test/build environment.
that's my interest in modular binary build solutions like rpm.
the idea would be to stop rebuilding a large virtualenv for every test build.
but rather just rebuild a dev rpm for the specific package(s) that are changing, plus a new dev master rpm that references those changes.
*then* go install the new master rpm on the test box, which will find most of its rpm dependencies already satisfied.
but I'll admit I'm new to thinking about binary build solutions.
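one way that flow could be sketched, assuming setuptools' bdist_rpm command and entirely hypothetical package/target names (foolib, master.spec, RPM_DIR):

```make
# Hypothetical sketch of the modular idea: rebuild only the package that
# changed, then a "master" meta-RPM whose spec just Requires: the new version.
RPM_DIR = $(CURDIR)/rpms

rpm-foolib:
	cd foolib && python setup.py bdist_rpm --dist-dir=$(RPM_DIR)

rpm-master: rpm-foolib
	rpmbuild -bb master.spec --define "_rpmdir $(RPM_DIR)"

deploy:
	# on the test box; already-installed deps are left alone
	rpm -Uvh $(RPM_DIR)/noarch/master-*.rpm
```

since rpm only upgrades what actually changed, the per-build cost shrinks from "rebuild the whole virtualenv" to "rebuild one package and one tiny meta-package."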