Le 29/11/2018 à 15:36, Benjamin Peterson a écrit :
I'd like to point out that the discussion is asymmetric here.
On the one hand, people who don't have access to PyPI would _really_ benefit from a larger stdlib with more batteries included.
On the other hand, people who have access to PyPI _don't_ benefit from having a slim stdlib. There's nothing virtuous or advantageous about having _fewer_ batteries included. Python doesn't become magically faster or more powerful by including less in its standard distribution: the best it does is make the distribution slightly smaller.
So there's really one bunch of people arguing for practical benefits, and another bunch of people arguing for mostly aesthetic or philosophical reasons.
I don't think it's asymmetric. People have raised several practical problems with a large stdlib in this thread. These include:
- The development of stdlib modules slows to the rate of the Python release schedule.
Can you explain why that would be the case? As a matter of fact, the Python release schedule seems to remain largely the same even though we accumulate more stdlib modules and functionality.
The only risk is the premature inclusion of an unstable or poorly written module, which may lengthen the stabilization cycle.
- stdlib modules become a permanent maintenance burden to CPython core developers.
This is true. We should only accept modules that we think are useful for a reasonable part of the user base.
- The blessed status of stdlib modules means that users might use a substandard stdlib module when a better third-party alternative exists.
We can always mention the better third-party alternative in the docs where desired. But there are many cases where the stdlib module is good enough for what people use it for.
For example, the upstream simplejson module may have slightly more advanced functionality, but for most purposes the stdlib json module fits the bill and spares you from tracking a separate dependency.
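To illustrate the point, here is a minimal sketch of the kind of everyday round-trip serialization for which the stdlib json module is entirely sufficient, with no third-party dependency to track (the data values are made up for illustration):

```python
import json

# Typical use case: serialize a plain Python structure to JSON text
# and parse it back. Nothing here requires simplejson.
data = {"name": "example", "values": [1, 2, 3], "nested": {"ok": True}}

encoded = json.dumps(data, sort_keys=True)
decoded = json.loads(encoded)

# The round trip preserves the original structure.
assert decoded == data
print(encoded)
```

For needs beyond this (e.g. speed-critical encoding or custom Decimal handling), the upstream simplejson package remains available on PyPI.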