----- Original Message -----
pkg_resources.requires() is our only current solution for parallel installation of incompatible versions. It can be made to work, and it is a lot better than the nothing we had before it was created, but it also has quite a few issues (and it can be a nightmare to debug when it goes wrong).
Based on the exchanges with Mark McLoughlin the other week, and chatting to Matthias Klose here at the PyCon US sprints, I think I have a design that will let us support parallel installs in a way that builds on existing standards, while behaving more consistently in edge cases and without making sys.path ridiculously long even in systems with large numbers of potentially incompatible dependencies.
The core of this proposal is to create an updated version of the installation database format that defines semantics for *.pth files inside .dist-info directories.
Specifically, whereas *.pth files directly in site-packages are processed automatically when Python starts up, those inside .dist-info directories would be processed only when explicitly requested (probably through a new distlib API). Processing the *.pth file would insert its entries into sys.path immediately before the path entry containing the .dist-info directory. That placement avoids the problem with the pkg_resources insert-at-the-front-of-sys.path behaviour, where system packages can end up shadowing those from a local source checkout, without running into the problem with append-to-the-end-of-sys.path, where a specifically requested version is shadowed by a globally installed version.
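A minimal sketch of the proposed insertion semantics, assuming the activation happens via some helper (the function name here is hypothetical; the real API would live in distlib and would parse the *.pth file itself rather than take its target path directly):

```python
import os
import sys

def activate_pth_entry(pth_target, dist_info_dir):
    """Insert pth_target into sys.path immediately before the
    path entry that contains the .dist-info directory.

    Hypothetical helper illustrating the proposed semantics only.
    """
    containing = os.path.dirname(dist_info_dir)
    try:
        index = sys.path.index(containing)
    except ValueError:
        index = 0  # containing entry not on sys.path; fall back to the front
    if pth_target not in sys.path:
        sys.path.insert(index, pth_target)
```

The key point is the insertion index: the requested version lands just ahead of the site-packages entry that holds its .dist-info, so it wins over the default version there, while anything earlier on sys.path (such as a local source checkout) still takes precedence.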
To use CherryPy2 and CherryPy3 on Fedora as an example, what this would allow is for CherryPy3 to be installed normally (i.e. directly in site-packages), while CherryPy2 would be installed as a split install, with the .dist-info going into site-packages and the actual package going somewhere else (more on that below). A cherrypy2.pth file inside the dist-info directory would reference the external location where cherrypy 2.x can be found.
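To make the split install concrete, the on-disk arrangement might look something like this (paths and version numbers are illustrative, not what Fedora actually ships):

```
site-packages/
    cherrypy/                      <- CherryPy 3, importable by default
    CherryPy-3.x.dist-info/
    CherryPy-2.x.dist-info/
        cherrypy2.pth              <- contains the single line below
/usr/share/python-cherrypy2/
    cherrypy/                      <- CherryPy 2, added to sys.path on request
```

and the cherrypy2.pth file would hold just the external location:

```
/usr/share/python-cherrypy2
```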
To use this at runtime, you would do something like:
distlib.some_new_requires_api("CherryPy (2.2)")
import cherrypy
So what would happen when CherryPy 4 comes along? CherryPy 3 is installed directly in site-packages, so would versions 2 and 4 both be handled as split installs? It seems to me that this kind of special casing is not what we want: if you develop on one machine and deploy on another, you have no guarantee that the "standard" installation of CherryPy is the same on both. That would force developers to always install the versions they use as split installs, just to be sure they import the correct version.

At this point I will turn to the Ruby world for an example (please don't shout at me :). If you look at how RubyGems works, it puts _every_ gem in a versioned directory, so there is no special casing. A plain "require 'foo'" imports the newest "foo", while a specific version is imported when one is requested. I believe we should head in a similar direction here, making the split install the default (and the only) layout. Then if a user writes the standard
import cherrypy
Python would import the newest version. When using
distlib.some_new_requires_api("CherryPy (2.2)")
import cherrypy
Python would import that specific version. This may actually turn out to be very useful: you could place all the distlib calls in your package's __init__.py, which would nicely separate them from the actual code (and we wouldn't need anything like Ruby's Gemfiles). So am I completely wrong here, or does this make sense to you?

Slavek.
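The "newest wins unless pinned" selection being proposed here can be sketched in a few lines, under the assumption that every distribution lives in its own versioned directory (the function, paths, and data below are illustrative, not a real distlib API):

```python
# Sketch of RubyGems-style version selection: the newest installed
# version is used by default, a pinned version when one is requested.
def resolve(available, pinned=None):
    """Return the install path for the pinned version, or for the
    newest available version when no pin is given.

    available: dict mapping version tuples to install paths.
    """
    if pinned is not None:
        return available[pinned]
    return available[max(available)]

# Hypothetical per-version install locations.
installed = {
    (2, 2): "/usr/lib/python/versioned/CherryPy-2.2",
    (3, 2): "/usr/lib/python/versioned/CherryPy-3.2",
}
```

A plain "import cherrypy" would then correspond to resolve(installed), while the explicit distlib call would correspond to resolve(installed, (2, 2)).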
Cheers, Nick.
--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
_______________________________________________
Distutils-SIG maillist - Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig
-- Regards, Bohuslav "Slavek" Kabrda.