On Nov 13, 2009, at 6:23 PM, Greg Ewing wrote:
Martin v. Löwis wrote:
Some of the Python maintainers have recently started objecting to this setup, asking that the standard library should be split into separate packages that are released and distributed independent of Python. Others of us feel strongly that such a change should not be made.
I'd be worried, because I would no longer be able to release an app or package and say "requires Python x.y". I'd have to list the version numbers of all the micro packages making up the standard library that I use.
Worse, I'd have to be aware of which ones I actually *do* use so I could maintain that list, something I don't have to think about at the moment.
"requires Python x.y" would imply a dependency on the working set of micro-packages which were shipped with that version of Python (or more specifically, any working set that was released within a particular Python release series). You would only need to list packages from the standard library as dependencies in special-case circumstances where you required a version higher or lower than what shipped with a particular release series of Python. It would perhaps increase the potential for people to get into situations where they update a Python with newer packages which makes it incompatibe with other Python applications. This problem would be mitgated by the fact that the standard library tends to be very API stable, so usually newer releases only contain minor bug fixes or performance enhancements and are unlikely to cause breakage. Package installation tools would also still continue to install into site- packages, or ideally in a virtualenv or script-generation environement like Buildout does. So installing updates to the standard library could be done only to support those applications which require them, but leave the default working set untouched for any other Python applications. Conversely, it may also help the standard library compatability in some situations, as I've seen people copy newer files into the standard library locations as a method of applying bug fixes, and given a better set of metadata it would then be more natural to use tools which allowed updates to happen in an orderly fashion. That's all given that splitting the standard library into individual packages also continues to release a single standard library. I don't really think releases small/medium/large sized standard libraries as was also discussed is a good idea. Maybe usage of tools such as virtualenv and buildout aren't yet widespread enough to alleviate people mucking up their installations in such a way that causes them pain. And this would also make it easier for people to develop applications which would be harder to pakcage into linux distributions or other package managers which don't allow for non-global updates. However, these are only theoretical concerns. It's concrete issue such as this one: http://stackoverflow.com/questions/1734373/including-package-data-with-distr... Where a developer uses something in the standard library, and a python- dev commiter provides a fix very quickly (yay Tarek!). But then that developer has to be told to wait a year until the next Python release, then wait until you've got the time to migrate the rest of your application to the new Python release, then you can finally use that fix, and in the meantime even though the issue has been solved you still need to workaround the problem! It's issues like this where it's hard not to want to avoid using standard library packages (beyond "core" modules which stable and only change very rarely lke os, sys, re, etc.) because there are unneccessary roadblocks between developer and user.