On Thu, 29 Nov 2018 10:05:47 -0500 "Benjamin Peterson" <benjamin@python.org> wrote:
On Thu, Nov 29, 2018, at 08:45, Antoine Pitrou wrote:
On 29/11/2018 at 15:36, Benjamin Peterson wrote:
I'd like to point out that the discussion is asymmetric here.
On the one hand, people who don't have access to PyPI would _really_ benefit from a larger stdlib with more batteries included.
On the other hand, people who have access to PyPI _don't_ benefit from having a slim stdlib. There's nothing virtuous or advantageous about having _fewer_ batteries included. Python doesn't become magically faster or more powerful by including less in its standard distribution: the best it does is make the distribution slightly smaller.
So there's really one bunch of people arguing for practical benefits, and another bunch of people arguing for mostly aesthetic or philosophical reasons.
I don't think it's asymmetric. People have raised several practical problems with a large stdlib in this thread. These include:
- The development of stdlib modules slows to the rate of the Python release schedule.
Can you explain why that would be the case? As a matter of fact, the Python release schedule seems to remain largely the same even though we accumulate more stdlib modules and functionality.
The problem is the length of the Python release schedule. It means, in the extreme case, that stdlib modifications won't see the light of day for 2 years (feature freeze + 1.5 year release schedule). And that's only if people update Python immediately after a release. PyPI modules can evolve much more rapidly.
Ah, I realize I had misread that as "the development of stdlib modules slows the rate of the Python release schedule" (instead of "slows *to* the rate").
Regards
Antoine.