[Catalog-sig] (no subject)

Bob Ippolito bob at redivi.com
Thu May 5 20:37:42 CEST 2005


On May 5, 2005, at 1:38 PM, Ian Bicking wrote:

> Martin v. Löwis wrote:
>
>> Maurice Ling wrote:
>>
>>
>>> Given that (1) there can be multiple versions of Python installed in
>>> a system, (2) each version maintains its own site-packages directory,
>>> and (3) if C modules are installed, they are not compatible with
>>> other versions of Python... Imagine a system administrator who has
>>> installed 50 libraries and their dependencies in the site-packages of
>>> Python 2.3 and now has to do it all again for Python 2.4; can we make
>>> his life better?
>>>
>>
>>
>> I'd like to point out that this is a task that the "native" package
>> format can solve. For example, in Debian, when I use Debian packages
>> to install the 50 libraries, the current Debian Python policy manages
>> to update all the libraries from Python 2.3 to Python 2.4, when
>> the "official" Debian Python version becomes 2.4.
>>
>
> Just as a reminder, what Maurice talked about doesn't actually relate
> to my initial concern.  Or, maybe I'll call it my distutils-love, where
> distutils does something currently that system packagers don't.  I'm
> not so worried about per-Python-version installation of packages -- for
> the most part this has worked well for some time.  It's the support of
> multiple *library* versions that keeps me using distutils, and actually
> keeps me from using native packages for most things, except those
> places where there's already a centralized system (e.g., I don't
> install more than one version of Postgres on a server, so I don't need
> more than one version of psycopg), or things that are stable and
> practically equivalent to "standard" for me, like mxDateTime.
>
> But for most libraries I'm very reluctant to install them globally; I
> don't want to do monolithic upgrades of systems, I'd rather do
> incremental upgrades of specific applications, and handle any large
> upgrades through centralized tracking of the installations.
>
> In the C world this is more-or-less solved through a system of
> symlinks, e.g., libjpeg.so, libjpeg.so.9, libjpeg.so.9.1, etc.  But we
> don't have that for Python.  So if I have one application that needs
> SQLObject 0.5, and another that needs SQLObject 0.6, I just can't use
> any system package manager.
>
> At first I was thinking, well, that's because of my specific situation
> -- multi-client host machines, applications that go quiet for extended
> periods of time and shouldn't be disturbed, and my own tendency to use
> lots of in-development libraries (my own and other people's).  But as
> I think about it, it seems like it applies to most situations.  Subway,
> for instance, is built against a specific version of SQLObject;
> UltraGleeper is built against a different version (just to pick two
> projects I know of off the top of my head).  Well, technically they
> used different package names for the different versions, but the
> general problem should be fairly obvious.  Making the installation
> robust and easy for these kinds of things -- both applications and
> frameworks -- is what I think we are trying to do.  But as long as we
> have unversioned package installation and imports, I think it's a bad
> idea to do system-wide installation of most packages.
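The symlink convention Ian refers to can be sketched in a few lines; the
directory and the version numbers here are made up for illustration, and
the point is only that one library file answers to several names of
differing specificity:

```python
# Illustrative only: how C shared libraries coexist side by side via a
# chain of symlinks.  The temp directory and version numbers are made up.
import os
import tempfile

demo = tempfile.mkdtemp()
open(os.path.join(demo, "libjpeg.so.9.1"), "w").close()   # the actual file
os.symlink("libjpeg.so.9.1", os.path.join(demo, "libjpeg.so.9"))  # run-time name
os.symlink("libjpeg.so.9", os.path.join(demo, "libjpeg.so"))      # link-time name

# An application bound to "libjpeg.so.9" keeps resolving to 9.1 even if
# the unversioned "libjpeg.so" is later repointed at a new major version.
print(os.path.realpath(os.path.join(demo, "libjpeg.so.9")))
```

Python imports have no analogous indirection: "import sqlobject" always
finds whichever single copy sits first on sys.path.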

In other words, Python has "DLL hell", and there isn't a damned thing
a system package manager can do about it other than say "sorry, you
can't install FooBar and CheeseFactory at the same time, because FooBar
needs BazWibble 1.0 and CheeseFactory needs BazWibble 2.0".

Therefore, there needs to be some mechanism to install stuff to an
"app-packages" directory, because it's not always possible or
appropriate to put everything in "site-packages".
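For what it's worth, an "app-packages" arrangement can be approximated
today by having each application prepend a private directory to sys.path
before it imports anything.  A minimal sketch -- the add_app_packages
helper and the "app-packages" directory name are hypothetical, not an
existing distutils feature:

```python
import os
import sys

def add_app_packages(app_dir, dirname="app-packages"):
    """Prepend app_dir/<dirname> to sys.path so the application's
    private library versions shadow site-packages.  Returns the
    directory added, or None if it does not exist."""
    pkg_dir = os.path.join(app_dir, dirname)
    if os.path.isdir(pkg_dir):
        if pkg_dir in sys.path:
            sys.path.remove(pkg_dir)
        sys.path.insert(0, pkg_dir)   # wins over site-packages
        return pkg_dir
    return None
```

With something like this, one application can ship SQLObject 0.5 and
another SQLObject 0.6, each under its own app-packages directory,
without either touching the global site-packages.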

... however, the referenced PEP really does nothing to help with this
problem.

-bob


