
Threading will probably break here as I wasn't on the list for the first email...

My concern with the standard library is that there are a couple of things going on:

1. The standard library represents "accepted" functionality, kind of best practice, kind of just conventional. Everyone (roughly) knows what you are talking about when you use things from the standard library.
2. The standard library has some firm backward compatibility guarantees. It also has some firm stability guarantees, especially within releases (though in practice, nearly for eternity).
3. The standard library is kind of collectively owned; it's not up to the whims of one person, and can't be abandoned.
4. The standard library is one big chunk of functionality, upgraded all under one version number, and specifically works together (though in practice cross-module refactorings are uncommon).

There are positive things about these features, but 4 really drives me nuts, and I think is a strong disincentive to putting stuff into the standard library. For packaging I think 4 actively damages maintainability. Packaging is at the intersection of several systems:

* Python versions
* Forward and backward compatibility with distributed libraries
* System policies (e.g., Debian has changed things around a lot in the last few years)
* A whole other ecosystem of libraries outside of Python (e.g., binding to C libraries)
* Various developer toolkits, some Python-specific (e.g., Cython), some not (gcc)

I don't think it's practical to think that we can determine some scope of packaging where it will be stable in the long term; all these things are changing, and many are changing without any particular concern for how it affects Python (i.e., packaging must be reactive). And frankly we clearly do not have packaging figured out; we're still circling in on something... and I think the circling will be more like a strange attractor than a sink drain.
The issues exist for other libraries that aren't packaging-related, of course; it's just worse for packaging. argparse, for instance, is not "done"... it has bugs that won't be fixed before release, and functionality that it should reasonably include. But there's no path for it to get better. Will it have new and better features in Python 3.3? Who seriously wants to write code that is only compatible with Python 3.3+ just because of some feature in argparse? Instead everyone will work around argparse as it currently exists. In the process they'll probably use undocumented APIs, further calcifying the library and making future improvements disruptive.

It's not specific to argparse; I think ElementTree has similar issues. The json library is fairly unique in that it has a scope that can be "done". I don't know what to say about wsgiref... it's completely irrelevant in Python 3 because it was upgraded along the Python schedule despite being unready to be released (this is relatively harmless, as I don't think anyone is using wsgiref in Python 3).

So, this is the tension I see. I think aspects of the standard library process and its guarantees are useful, but the current process means releasing code that isn't ready or not releasing code that should be released; neither is good practice, and both compromise those guarantees.

Lots of moving versions can indeed be difficult to manage... though it can be made a lot easier with good practices. Even then, distutils2 (and pip) do not fit into that: they both enter into the workflow before you start working with libraries and versions, making them somewhat unique (though also giving them some more flexibility, as they are not so strongly tied to the Python runtime, which is where stability requirements are most needed).

-- Ian Bicking | http://blog.ianbicking.org

On Tue, Jun 1, 2010 at 8:13 PM, Ian Bicking <ianb@colorstudy.com> wrote: [..]
Are you suggesting we have a third layer?

* Python
* stdlib
* stdlib-extras (distutils2, pip, etc.)

Is that what some people called a "sumo" release of Python?

Tarek

-- Tarek Ziadé | http://ziade.org

On Tue, Jun 1, 2010 at 11:13, Ian Bicking <ianb@colorstudy.com> wrote:
I can only see two scenarios that might be considered acceptable to address these issues.

One is that when new modules are accepted into the stdlib they are flagged with an ExpermintalWarning so that people know that no backwards-compatibility promises have been made yet. That gets the module more exposure and gets python-dev real-world feedback to fix issues before the module calcifies into a strong backwards-compatibility commitment. With that experience, better decisions can be made as to how to change things (e.g. the logging module's default timestamp including microseconds, which strptime cannot parse).

Otherwise we shift to an annual release schedule, but alternate Python versions have a language moratorium. That would mean only new language features every two years, but a new stdlib annually.

But one thing I can tell you is that having separate module releases of what is in the stdlib under the same name, or doing a separate stdlib release, will not happen. Python-dev as a whole does not like this idea and I don't see that changing.
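A minimal sketch of what the import-time warning idea might look like. Note the warning class is hypothetical: Python defines FutureWarning and DeprecationWarning, but no ExperimentalWarning exists in the stdlib, so this subclass and the helper name are illustrative only.

```python
import warnings

# Hypothetical class -- not part of the real warnings hierarchy.
class ExperimentalWarning(FutureWarning):
    """Warns that a module's API is not yet frozen."""

def _warn_experimental(module_name):
    # A new stdlib module could call this once at import time.
    warnings.warn(
        f"{module_name} is experimental; its API may change without "
        f"the usual backwards-compatibility guarantees",
        ExperimentalWarning,
        stacklevel=2,
    )

# Demonstrate that the warning is raised and is catchable/filterable
# like any other warning category.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    _warn_experimental("newmodule")

assert caught[0].category is ExperimentalWarning
```

Because it is an ordinary warning category, users who accept the risk could silence it with a single `warnings.filterwarnings("ignore", category=ExperimentalWarning)` line.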

On Tue, Jun 1, 2010 at 9:22 PM, Brett Cannon <brett@python.org> wrote:
I'm actually partial to this idea - the stdlib, by its very existence, has to evolve more quickly than the language itself, and it should fundamentally see more releases to stay up to date and fresh. jesse

On Tue, Jun 1, 2010 at 8:22 PM, Brett Cannon <brett@python.org> wrote:
I have no particular interest in changing the stdlib as it exists now. It is what it is, I don't care if there's extra stuff in it and I'm now long settled into working around all bugs I encounter. While I think there are past situations that exemplify certain problems, I'm really just bringing them up as examples. But pip and distutils2 aren't settled into anything, and I don't want us to retrace bad paths just because they are so well trod. -- Ian Bicking | http://blog.ianbicking.org

On Tue, 1 Jun 2010 18:22:38 -0700 Brett Cannon <brett@python.org> wrote:
One is that when new modules are accepted into the stdlib they are flagged with a ExpermintalWarning
Are you advocating this specific spelling?
I think this has already been shot down by Guido (I think I was the one who asked last time :-)). Basically, even if you aren't adding new language features, you are still compelling people to upgrade to a new version with (very probably) slight compatibility annoyances.

On 06/01/2010 08:22 PM, Brett Cannon wrote:
Would it be possible to have a future_lib that gets enabled with something like...

    from __future__ import future_lib

These *new* library modules and packages won't be visible by default. Maybe they stay there until the next major version, or possibly for some set period of time.

Ron

On Mon, Jun 7, 2010 at 2:33 AM, Ron Adam <rrr@ronadam.com> wrote:
This doesn't seem workable, since __future__ imports have local effects, but the side effect here is really about the global module space. How would you expect this to work if the "old" version has already been imported? -Fred -- Fred L. Drake, Jr. <fdrake at gmail.com> "Chaos is the score upon which reality is written." --Henry Miller

Or perhaps:

    from experimental import new_module

This is kind of a guarantee that the interface will change; at some point, if new_module is "calcified", this will have to be changed to just:

    import new_module

For experimental language features, maybe:

    from __experimental__ import new_feature

This makes it clear that new_feature may change (perhaps even not be adopted?), vs. the from __future__ semantics.

Is it too complicated to try to differentiate between the decision of whether some capability will be provided at all, vs. ironing out the API for that capability? For example,

    from experimental import new_capability

means that there is no commitment for new_capability at all -- it may simply be dropped entirely. The danger of using this is that new_capability may disappear completely with no replacement. While,

    from proposed import new_capability

represents a commitment that new_capability will be provided at some point, but the API will likely change. Here the danger of using it is that you will likely have to change your program to conform to a new API.

A capability might start as "experimental", and if its value is demonstrated, move to "proposed" to work out the details before mainstreaming it.

-Bruce
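For the library (not language-feature) half of this proposal, a plain namespace module could work today without compiler support. The sketch below is entirely hypothetical -- no `experimental` package exists -- and uses `json` as a stand-in candidate module just for the demo; it relies on module-level `__getattr__` (PEP 562, Python 3.7+) to warn at the point of access:

```python
import types
import warnings

def make_experimental_namespace(candidates):
    """Build a module-like namespace that re-exports `candidates`
    (a dict of name -> module) and warns on every access."""
    ns = types.ModuleType("experimental")

    def __getattr__(name):
        if name in candidates:
            warnings.warn(
                f"experimental.{name}: API may change or vanish entirely",
                FutureWarning, stacklevel=2)
            return candidates[name]
        raise AttributeError(name)

    # PEP 562: a __getattr__ in a module's namespace handles misses.
    ns.__getattr__ = __getattr__
    return ns

# Demo: json stands in for a hypothetical candidate "new_module".
import json
experimental = make_experimental_namespace({"new_module": json})

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    mod = experimental.new_module

assert mod is json
assert caught[0].category is FutureWarning
```

A parallel `proposed` namespace would differ only in the warning text and in the project's policy about removal, which is really Bruce's point: the distinction is a commitment, not a mechanism.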

On 06/07/2010 09:37 AM, Bruce Frederiksen wrote:
I was thinking of something a bit more limited in scope, i.e., only those modules already accepted for inclusion in a future release. But you have the concept I was thinking of correct; experimental and proposed libraries would be quite a lot more.

I think the svn sandbox directory sort of serves that purpose now, although it isn't organized in any way that makes it easy for someone to tell what is experimental, proposed, accepted and under active development, or just lying around for future or past reference. Maybe a bit of organizing the sandbox with categorized sub-folders would be good?

I really was just throwing out an idea in regard to limiting some of the problems of after-the-fact fixes or changes, i.e., give a new module a bit more exposure to a wider audience before it's actually included.

Ron

On Mon, Jun 7, 2010 at 1:33 AM, Ron Adam <rrr@ronadam.com> wrote:
The only place where any of this seems even slightly useful would be a library closely associated with Python itself, e.g., a new ast module or something with imports. Everything else should be developed as an external installable library. At least things that are importable (str.partition for instance isn't something you import). -- Ian Bicking | http://blog.ianbicking.org

participants (11)

- Antoine Pitrou
- Brett Cannon
- Bruce Frederiksen
- Fred Drake
- geremy condra
- Ian Bicking
- Jesse Noller
- Lie Ryan
- MRAB
- Ron Adam
- Tarek Ziadé