Tarek Ziadé wrote:
Hello,
That's not a new idea, but I'd like to throw it here again.
Some modules/packages in the stdlib are pretty isolated, which means they could be upgraded independently of the rest without harm. For example, the unittest package or the email package.
Here's an idea:
1 - add a version number in each package or module of the stdlib that is potentially upgradable
2 - create standalone releases of these modules/packages on PyPI, in a restricted 'stdlib upgrades' area that only core devs can use to upload new versions. Each release lists the precise Python versions it is compatible with.
I'm not a packaging expert, but I think this all makes sense in the context of a virtualenv. The ability to have a pip requirements file (for example) with stdlib-email==2.6 and stdlib-unittest==2.7 would, in my view, be a useful flexibility. Any given application or library only exercises a certain subset of the stdlib, after all. It might also give you more confidence to upgrade to a newer Python if you had this flexibility. Whether you would want to incorporate this in the absence of a virtualenv is another question, I suppose.
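For illustration, such a requirements file might look like this (the stdlib-* project names are hypothetical; they don't exist on PyPI today):

```
# hypothetical requirements.txt for a virtualenv that pins
# standalone releases of upgradable stdlib packages
stdlib-email==2.6
stdlib-unittest==2.7
```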
4 - an upgraded package lands in a new specific site-packages directory and is loaded *before* the one in Lib
For a quick test, I added a "prioritize_site_packages" function to a virtualenv's site.py, which simply rearranged sys.path so that anything containing the string 'site-packages' came before anything else. Would this be sufficient in the general case?
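To make that concrete, here is a minimal sketch of what such a function might look like (my reconstruction, assuming a plain substring match on sys.path entries; not the exact code from the test above):

```python
import sys

def prioritize_site_packages():
    """Reorder sys.path so that any entry containing 'site-packages'
    is searched before everything else (e.g. before the stdlib's Lib
    directory), preserving the relative order within each group."""
    site_dirs = [p for p in sys.path if 'site-packages' in p]
    other_dirs = [p for p in sys.path if 'site-packages' not in p]
    sys.path[:] = site_dirs + other_dirs
```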