How to use devpi for resilient deployments

Hi,
I would like to have a local PyPI mirror so that I can deploy Python software even if PyPI is down. My deployment process is the ordinary git checkout + pip install -r requirements.txt.
I found devpi, and it looks like it would do what I want, judging by the very first feature listed on the homepage: "After files are first requested can work off-line and will try to re-check with pypi every 30 minutes by default" (even though I'm not quite sure why it needs to "re-check" anything).
However, I installed it and tried to make it work, but I didn't find that it offers the resilience I seek.
What I did was use the default index that mirrors PyPI. I installed a package through devpi, simulated PyPI being down, then tried to install the package again, and it worked: devpi served the package itself.
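(For concreteness, the kind of commands involved look roughly like this; the package name is just a placeholder, and I'm assuming devpi-server's defaults of port 3141 and the root/pypi mirror index:)

    # first install while PyPI is reachable: devpi fetches the file and caches it
    pip install -i http://localhost:3141/root/pypi/+simple/ requests

    # later, with PyPI "down": the same install should be served from devpi's cache
    pip install -i http://localhost:3141/root/pypi/+simple/ requests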
However, I then saw in the documentation something about packages being cached for 30 minutes. So I waited 30 minutes and tried again, and indeed nothing worked: devpi (according to its logfile) tried contacting PyPI again and again, and in the end failed to serve the file to pip.
I thought it might make a difference whether I ran pip install with or without a pinned version number, assuming that without one devpi wants to contact PyPI to discover the latest version, whereas with one it could figure out that it already has the file locally... But no, pinned or not, it fails.
My instinct is that I don't want packages cached for a mere 30 minutes. I want a mirror: once a package has been downloaded, it should stay in the mirror forever. I don't understand why I would want it to expire.
Am I missing something? Should I set the mirror_cache_expiry to MAX_INT? Can devpi not do what I want? Am I wrong in wanting what I want?
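(If bumping the expiry is indeed the intended knob, my understanding is that it would be set on the mirror index with the devpi client, something along these lines, where the value is just an arbitrarily large number of seconds:)

    # assumption: mirror_cache_expiry is a per-index setting changed via the devpi client (needs root login)
    devpi login root
    devpi index root/pypi mirror_cache_expiry=31536000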
Thanks a lot for your help :)

Hi!
What you want is what devpi should provide by default. Could you provide output from pip and devpi? For pip, it's best to use -v for verbose output; for devpi, run it with --debug for more detailed output.
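For example, assuming the default devpi-server setup on port 3141 and a placeholder package name:

    # verbose pip output, installing through the devpi mirror index
    pip install -v -i http://localhost:3141/root/pypi/+simple/ some-package
    # devpi-server with debug logging
    devpi-server --debug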
It's correct that devpi will re-check PyPI after 30 minutes, but it only does so to see whether there are new releases. If a file was downloaded before, it is stored locally. When PyPI is down, devpi should notice that and serve the locally available files.
How did you simulate PyPI being down? The issue may lie there: the way the outage was simulated could be what is causing devpi to fail.
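For example, a DNS failure, a refused connection, and a connection that hangs until a timeout can behave quite differently. One quick-and-dirty way to make PyPI unreachable from the devpi host is something like this (only an illustration; remember to remove the line again afterwards):

    # blackhole the PyPI hostnames on the devpi host
    echo "127.0.0.2 pypi.org pypi.python.org files.pythonhosted.org" | sudo tee -a /etc/hosts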
Regards, Florian Schulze

Thank you. You are right: it depends on how I simulate PyPI being down. I have now tried more ways to simulate it, and I was able to make devpi work in one case but not in the others.
I have opened an issue on GitHub, because I now think it is indeed a "bug" rather than incorrect usage or overly high expectations on my part, and also because it lets me attach the logs more easily.
https://github.com/devpi/devpi/issues/549
Cheers, Hugo.