[Python-ideas] stdlib upgrades

Ian Bicking ianb at colorstudy.com
Tue Jun 1 20:13:16 CEST 2010

Threading will probably break here, as I wasn't on the list for the first message in this thread.

My concern with the standard library is that there are a couple of things going on:

1. The standard library represents "accepted" functionality, kind of best
practice, kind of just conventional.  Everyone (roughly) knows what you are
talking about when you use things from the standard library.
2. The standard library has some firm backward compatibility guarantees.  It
also has some firm stability guarantees, especially within releases (though
in practice, nearly for eternity).
3. The standard library is kind of collectively owned; it's not up to the
whims of one person, and can't be abandoned.
4. The standard library is one big chunk of functionality, upgraded all
under one version number, and specifically works together (though in
practice cross-module refactorings are uncommon).

There are positive things about these features, but 4 really drives me nuts,
and I think it is a strong disincentive to putting stuff into the standard
library.  For packaging, I think 4 actively damages maintainability.

Packaging is at the intersection of several systems:

* Python versions
* Forward and backward compatibility with distributed libraries
* System policies (e.g., Debian has changed things around a lot in the last
few years)
* A whole other ecosystem of libraries outside of Python (e.g., bindings to C
libraries)
* Various developer toolkits, some Python specific (e.g., Cython) some not

I don't think it's practical to expect that we can define some scope for
packaging within which it will be stable in the long term; all of these things
are changing, and many are changing without any particular concern for how it
affects Python (i.e., packaging must be reactive).  And frankly we clearly
do not have packaging figured out; we're still circling in on something...
and I think the circling will be more like a Strange Attractor than a sink.
These issues exist for other libraries that aren't packaging-related, of
course; it's just worse for packaging.  argparse, for instance, is not
"done": it has bugs that won't be fixed before release, and it is missing
functionality that it should reasonably include.  But there's no path for it
to get better.  Will it have new and better features in Python 3.3?  Who
seriously wants to write code that is only compatible with Python 3.3+ just
because of some feature in argparse?  Instead, everyone will work around
argparse as it currently exists.  In the process they'll probably use
undocumented APIs, further calcifying the library and making future
improvements disruptive.
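The usual workaround, rather than requiring a newer Python, is to prefer a
standalone copy of the module when one is installed.  A minimal sketch of
that pattern (the vendored fallback name in the comment is hypothetical):

```python
# Prefer a standalone/backported release of argparse if one is installed;
# on Pythons that bundle it, this simply imports the stdlib copy.
try:
    import argparse  # a PyPI release can be newer than the bundled one
except ImportError:
    # A project might instead vendor its own copy here, e.g.
    # `from mypkg import _argparse as argparse` (hypothetical name).
    argparse = None

if argparse is not None:
    parser = argparse.ArgumentParser(description="demo")
    parser.add_argument("--feature", default="workaround")
    args = parser.parse_args(["--feature", "pin-your-own"])
    print(args.feature)
```

The cost of this pattern is exactly the calcification described above: once
projects carry their own copies, the stdlib version can no longer evolve
freely without breaking someone's fallback assumptions.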

This isn't specific to argparse; I think ElementTree has similar issues.
The json library is fairly unusual in that it has a scope that can be
"done".  I don't know what to say about wsgiref... it's completely
irrelevant in Python 3 because it was upgraded on the Python schedule
despite being unready for release (this is relatively harmless, as I don't
think anyone is using wsgiref in Python 3).

So, this is the tension I see.  I think aspects of the standard library
process and its guarantees are useful, but the current process means
releasing code that isn't ready or not releasing code that should be
released; neither is good practice, and both compromise those guarantees.
Lots of moving versions can indeed be difficult to manage... though that can
be made a lot easier with good practices.  Even then, distutils2 (and pip)
don't fit into that model: they both enter the workflow before you start
working with libraries and versions, making them somewhat unique (though
that also gives them more flexibility, as they are not so strongly tied to
the Python runtime, which is where stability requirements matter most).

Ian Bicking  |  http://blog.ianbicking.org
