
Steven D'Aprano wrote:

On Tue, Jun 16, 2020 at 12:41:58PM -0000, redradist@gmail.com wrote:
> As long as I cannot update the version of a standard library package
> separately from the CPython version - No, they are not separate creatures ;)

Why would you want to? That just sounds like adding extra complexity and pain for no benefit. Instead of requirements:
    requires Python 3.5 or better

you have requirements:

    Python 3.5 or better
    math 2.7 or better
    sys 2.1 or better
    glob 5.9 or better
    etc.
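Under such a scheme each stdlib module would presumably become an installable distribution whose version you have to probe for at run time. A minimal sketch, assuming "glob" were shipped as a standalone distribution (it is not today, so the fallback branch runs):

    import importlib.metadata

    # Hypothetical: under the proposed split, "glob" would be a separately
    # distributed package whose version could be queried like any other.
    try:
        print("glob distribution:", importlib.metadata.version("glob"))
    except importlib.metadata.PackageNotFoundError:
        # Today glob is part of the stdlib and has no distribution metadata.
        print("glob is not a separate distribution on this interpreter")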
This does not seem like an improvement to me. I like going to StackOverflow, and if I read a solution or recipe that says "tested with Python 3.8" I know it will run in 3.8, without having to guess what the minimum requirements for each module are.

Some of the Linux distros already split the stdlib into pieces. This is a real pain, especially for beginners. The process changes from:

    $ dnf install python3    # or apt-get or whatever package manager you use

and everything documented at python.org Just Works straight out of the box, to a much more annoying process:

    $ dnf install python3

and then you have mysterious ImportErrors because some modules aren't installed, and you have to try to work out how to install them, and that's not an easy task:

    $ dnf search python3 | wc -l
    3511
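To make the mysterious-ImportError point concrete: on a distro build that splits tkinter out of the base python3 package (Fedora ships it separately as python3-tkinter, for instance), a beginner's first contact looks like this:

    >>> import tkinter
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ModuleNotFoundError: No module named 'tkinter'

Nothing in that error message points at the distro package that would fix it.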
Why do I want that? Okay, here are the reasons:

1) Security issues should be fixed as soon as possible, without waiting two months or a year for the next CPython release.
2) Modules could evolve independently, which would allow some features of a package to be used earlier ... (crucial in our "fast" world).
3) If the library is modularized, I can remove parts of it in constrained environments (microcontrollers), or in environments where we try to save every byte of disk space.

Interfaces between modules would be thinner and more visible, which would allow downloading only as many packages as a given module or library actually needs. A modularized library would carry two versions: the run-time version it requires and its own version (see the sketch below).
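A minimal sketch of that two-version idea, assuming a hypothetical __version__ attribute on stdlib modules (no such attribute exists in current CPython, so the fallback prints):

    import sys
    import glob

    # The run-time version: today this is the only version a stdlib module has.
    print("runtime:", ".".join(map(str, sys.version_info[:3])))

    # The module's own version: hypothetical under the proposal.
    print("glob:", getattr(glob, "__version__", "<tied to the CPython release>"))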