[Python-ideas] stdlib upgrades
ianb at colorstudy.com
Wed Jun 2 17:03:30 CEST 2010
On Wed, Jun 2, 2010 at 5:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
> While I played with this idea a long time ago as well, I have
> since found that it causes more trouble than it's worth.
> Apart from having the user maintain at least two different
> versioned packages (Python and (part of) the stdlib), it also
> causes problems if you use this Python installation for more
> than one project: it's easily possible to have project A require
> version 2 of a stdlib module and project B version 3 of that
> same module.
This exists for normal libraries currently, and using virtualenv I've found
it to be manageable. It does require process separation (and sys.path
separation) in some cases.
I agree that global upgrades are dangerous. distutils2/pip may be different
because those tools don't generally get used except when managing a project,
and very few projects will require any particular version of these tools.
> If you then load both projects in an application, you end up
> either with a broken project A or B (depending on whether you have
> version 2 or 3 of that stdlib module installed), or you allow
> loading multiple versions of the same module, in which case you
> will likely break your application, since it will find multiple
> class implementations (and objects) for the same instances.
> Things like exception catching, pickling (and esp. unpickling),
> security checks based on classes, interface adapters and even
> simply isinstance() checks would then fail in various hard to
> reproduce ways.
Yes, loading multiple versions of a library at the same time is not a good
idea.
> IMHO, we've so far done well by issuing new Python patch level
> releases whenever there was a problem in the stdlib (and only
> then). Introducing new features by way of updates is left to
> minor releases, which then require more testing by the users.
> This additional testing is what causes many corporates to
> not follow the Python release cycle or skip a few minor
> releases: the work involved often just doesn't warrant the
> advantages of the added new features.
Yes, and so applications and libraries have to work around bugs instead of
using fixed versions, generally making upgrades even more danger-prone. In
the case of package management, the hardest libraries to support are those
libraries that have included a large number of fixes for installation
problems in their setup.py.
Futzing around with most of the standard library right now would just add
complexity, and applying changes that might be more aesthetic than
functional would be a really bad choice and lead to tedious discussions.
But new functionality can't usefully *just* exist in the standard library
because basically no one is using 2.7, few people are using 3.x, and lots of
people are using 2.5 or at best 2.6 -- so new functionality should be
available to all those people. Which means there *have* to be releases of
any new functionality. argparse was already released, and so there will be
"argparse" out in the wild that anyone can install on any version of Python
shadowing the existing module. unittest improvements are being released as
unittest2... meaning I guess that the "proper" way to use that functionality
is something like:

    if sys.version_info >= (2, 7):
        import unittest
    else:
        import unittest2 as unittest
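An alternative idiom, common in projects that depend on backports (my
sketch, not something from this thread), avoids the explicit version check
by trying the backport first and falling back to the stdlib module:

```python
# Common backport-import idiom (an assumed example, not from the post):
# prefer the unittest2 backport when it is installed, otherwise fall back
# to the stdlib unittest (which on Python >= 2.7 already has the features).
try:
    import unittest2 as unittest
except ImportError:
    import unittest

# Either way, the rest of the code uses the single name `unittest`.
class ExampleTest(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)
```

The advantage over the version check is that it also works when someone
installs the backport on a newer Python on purpose.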
> The situations won't get any better if we start releasing
> partial or complete stdlib updates even more often.
That a stdlib release means potentially *any* part of the standard library
could have been upgraded (even if in practice most of it won't be) will
probably throw people off.
The advantage of versions on specific functionality is that you can upgrade
just what you care about. It's much less burdensome to test something that
actually fixes a problem for you, and of course people do that all the time
with non-standard libraries.
Ian Bicking | http://blog.ianbicking.org