
I'm trying to catch up on this thread, so I may collapse some responses or refer to points others have brought up. On Jun 24, 2010, at 05:53 PM, Scott Dial wrote:
> If the package has .so files that aren't compatible with other versions of Python, then what is the motivation for placing them in a shared location (since they can't actually be shared)?
I think Matthias has described the motivation for the Debian/Ubuntu case, and James describes Python's current search algorithm for a package's .py[c] and .so files. There are a few points you've made that I want to respond to.

You claim that the versioned .so file scheme is "more complicated" than multiple version-specific search paths (if I understand your counter-proposal correctly). It all depends on your point of view. From mine, a 100-line patch that almost nobody but (some) distros will care about or be affected by, and that only changes a fairly obscure build-time configuration, is much simpler than trying to make version-specific search paths work. If you build Python from source, you don't care about this patch and you'll never see its effects. If you get Python from a distribution that only gives you one version of Python at a time, you also will probably never care about or see the effects of this patch. If you're a Debian or Ubuntu user who wants to use Python 3.2 and 3.3, you *might* care about it, but most likely it'll just work behind the scenes. If you're a Python packager or work on the Python infrastructure for one of those platforms, then you will care.

About just sharing the .py files: you say that would be acceptable to you, but it's actually a pretty big deal. If you're supporting two versions of Python, then every distro Python package doubles in size. Even with compression, you're talking longer download times and, probably more critically, greatly increased CDROM space pressure. The Ubuntu CDROM is already essentially at capacity, so doubling the size of all Python packages (most of which, btw, do not have extension modules) makes such an approach impossible. Moving to a DVD image has been discussed, but it's currently believed not to be in the best interest of users, especially those on slow links, to do so at this time.
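To make the versioned .so scheme concrete: this is the approach that eventually landed as ABI-tagged extension suffixes, and on a modern CPython (3.3+) you can inspect the tags the import machinery actually searches for. A quick sketch using only stdlib introspection:

```python
# Inspect the extension-module suffixes this interpreter searches for.
# importlib.machinery.EXTENSION_SUFFIXES is available in Python 3.3+.
import importlib.machinery
import sysconfig

# The ABI-tagged suffix for this build, e.g. something like
# '.cpython-312-x86_64-linux-gnu.so' on a Linux system.
print(sysconfig.get_config_var('EXT_SUFFIX'))

# All suffixes tried, in order, when importing an extension module;
# the version/ABI-tagged form is consulted before the bare '.so',
# which is what lets differently-tagged .so files coexist in one
# shared directory.
for suffix in importlib.machinery.EXTENSION_SUFFIXES:
    print(suffix)
```

The exact tag strings depend on the interpreter version, build options, and platform; the point is only that the tag is baked into the filename, so several Pythons can search the same directory without colliding.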
The versioned .so approach will of course grow a package by the size of its contained .so files, and that's already an uncomfortable but acceptable increase. It's acceptable because of what users gain by having multiple versions of Python available, and because there aren't nearly as many extension modules as there are Python files. Doubling the size of the .py files as well isn't acceptable.
> But the only motivation for doing this with .pyc files is that the .py files can be shared, since the .pyc is an on-demand-generated, version-specific artifact (and not the source). The .so file is created offline by another toolchain, is version-specific, and presumably you are not suggesting that Python generate it on demand.
Definitely not. The .pyc files are generated upon installation of the distro package, but of course the .so files must be compiled on a build machine and shipped in the distro package. The whole process is much simpler if the versioned .so files can just live in the same directory.
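As a rough sketch of what that install-time byte-compilation step looks like, here's a minimal stdlib-only example; the package contents and paths are made up for illustration, and a real distro hook would run one pass like this per installed Python version over the same shared sources:

```python
# Sketch of a distro post-install step: byte-compile shared .py sources.
# py_compile is in the stdlib; the temp directory stands in for a real
# installed package directory like /usr/lib/python3/dist-packages.
import os
import py_compile
import tempfile

with tempfile.TemporaryDirectory() as pkg_dir:
    src = os.path.join(pkg_dir, 'example.py')
    with open(src, 'w') as f:
        f.write('VALUE = 42\n')

    # Each installed Python version would run its own compile pass over
    # the same shared source, writing its own version-tagged .pyc under
    # __pycache__, so the sources themselves never need duplicating.
    pyc = py_compile.compile(src)
    print(pyc)
```

Because each interpreter writes a differently tagged .pyc, the shared .py files never conflict; only the precompiled .so files need the analogous filename tagging.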
> For packages that have .so files, won't the distro already have to build multiple copies of that package for all versions of Python? So why can't it place them in separate version-specific directories at that time? This is not the same as placing version-agnostic .py files into a version-agnostic location.
It's not a matter of "could", it's a matter of simplicity, and I think versioned .so files are the simplest solution given all the constraints.

-Barry