Hi,
I would like to run a local PyPI mirror so that I can deploy Python
software even when PyPI is down. My deployment process is the ordinary
git checkout + pip install -r requirements.txt.
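For concreteness, here is roughly what that deployment looks like when
pip is pointed at a local devpi instance (the host, port, and repository
URL are just my setup, not anything devpi mandates; root/pypi/+simple/
is the path devpi exposes for its PyPI-mirroring index):

```shell
# Hypothetical project URL; substitute your own repository.
git clone https://example.com/myproject.git
cd myproject

# Install dependencies through the local devpi mirror instead of PyPI.
pip install -r requirements.txt \
    --index-url http://localhost:3141/root/pypi/+simple/
```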
I found devpi, and it looks like it would do what I want, judging by
the very first feature listed on its homepage:
"After files are first requested can work off-line and will try to
re-check with pypi every 30 minutes by default" (even though I'm not
quite sure why it needs to "re-check" anything).
However, when I installed it and tried it out, I found that it doesn't
offer the resilience I'm after.
What I did was use the default index that mirrors PyPI (root/pypi). I
installed a package through devpi, then simulated PyPI being down, and
tried to install the package again. That worked fine: devpi served the
package itself.
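The test procedure was essentially this (the package name "requests" is
an arbitrary example, and how exactly PyPI is made unreachable doesn't
matter; I blocked outbound traffic):

```shell
# Install once through devpi so the release file gets cached locally.
pip install --index-url http://localhost:3141/root/pypi/+simple/ requests

# ... make pypi.org unreachable here (firewall rule, pulled cable, etc.) ...

# Uninstall and reinstall: within the 30-minute window this still works,
# with devpi serving the file from its cache.
pip uninstall -y requests
pip install --index-url http://localhost:3141/root/pypi/+simple/ requests
```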
However, I then noticed in the documentation that packages are only
cached for 30 minutes. So I waited 30 minutes and tried again, and
indeed it no longer worked: according to its log file, devpi tried
contacting PyPI over and over, and in the end failed to serve the file
to pip.
I thought it might make a difference whether I ran pip install with or
without a version number: without one, devpi presumably has to contact
PyPI to find the latest version, but with a pinned version it should be
able to figure out that it already has the file locally. But no:
version number or not on the pip command line, it fails.
My instinct is that I don't want to cache packages for a mere 30
minutes. I want a mirror. Once a package has been downloaded once, I
want it to stay in the mirror forever. I don't understand why I would
want it to expire.
Am I missing something? Should I set the mirror_cache_expiry to
MAX_INT? Can devpi not do what I want? Am I wrong in wanting what I
want?
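In case it helps to be concrete, here is what I imagine the MAX_INT
workaround would look like, assuming mirror_cache_expiry really is a
per-index setting that can be changed with the devpi client (that's my
reading of the docs; please correct me if I've got the mechanism wrong):

```shell
devpi use http://localhost:3141
devpi login root          # prompts for the root password

# One year in seconds, purely as a stand-in for "never expire".
devpi index root/pypi mirror_cache_expiry=31536000
```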
Thanks a lot for your help :)