Modularize Python library

Hi all, me again ... :) I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately. I think the standard library should be as small as necessary, with all other functionality provided through PyPI. This is similar to what `Rust` did: it has a fairly small `standard library` and a `core library`. If Python had a small `standard library` and `core library`, it would standardize the basic functionality that every Python implementation should provide ... For example:
1) MicroPython: `libpythoncore` should be provided entirely, `libpythonstd` could be provided partially.
2) Brython: `libpythoncore` should be provided entirely, `libpythonstd` could be provided partially.
3) RustPython: `libpythoncore` should be provided entirely, `libpythonstd` should be provided entirely; if `libpythonstd` were small enough, implementing it would be feasible.
4) PyPy: `libpythoncore` should be provided entirely, `libpythonstd` should be provided entirely.
All other modules and packages (json, xml, etc.) could be provided as separate packages. These are just the thoughts of a guy who loves programming and tries to bring over the best ideas from other ecosystems, because they have proven themselves over time ;)
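A limited form of this already exists today for a few modules that have PyPI counterparts tracking the stdlib; a minimal sketch of that pattern, using the real importlib.metadata / importlib_metadata pair as the example (this snippet is only an illustration, not part of the proposal itself):

    # Prefer the bundled standard library module; fall back to the PyPI
    # backport on interpreters that do not ship it.
    try:
        from importlib import metadata  # in the stdlib since Python 3.8
    except ImportError:
        import importlib_metadata as metadata  # pip install importlib_metadata

    print(metadata.version("pip"))  # query an installed distribution's version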

On 16/06/2020 10:23, redradist@gmail.com wrote:
I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately.
Uh, aren't the runtime and standard library already separate creatures?
I think the standard library should be as small as necessary, with all other functionality provided through PyPI. This is similar to what `Rust` did: it has a fairly small `standard library` and a `core library`.
This is absolutely contrary to Python's "batteries included" philosophy, so you are going to have to work to justify it.
If Python had a small `standard library` and `core library`, it would standardize the basic functionality that every Python implementation should provide ...
I would have expected (and your examples show) *less* standardization, not more. -- Rhodri James *-* Kynesim Ltd

Hi, the "batteries included" argument was a huge selling points years ago (when Aaron Watters wrote "Internet Programming With Python", for instance) but I think the situation has changed quite a bit since then. My own experience is that for most of the packages in the standard library, there exists "better" (for some sense of "better") packages on PyPI. The reasons why this is so are obvious or have already been discussed (e.g. in https://www.python.org/dev/peps/pep-0413/ ). It has also been discussed that some people can't rely on anything outside the standard packages, due to the constraints of their company or their field. I can see some benefits in the idea, if I understand it correctly, of a two-tiers standard library ("core" and "standard"). This could complement or build upon the work of https://www.python.org/dev/peps/pep-0594/ S. On Tue, Jun 16, 2020 at 12:14 PM Rhodri James <rhodri@kynesim.co.uk> wrote:
On 16/06/2020 10:23, redradist@gmail.com wrote:
I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately.
Uh, aren't the runtime and standard library already separate creatures?
I think the standard library should be as small as necessary, with all other functionality provided through PyPI. This is similar to what `Rust` did: it has a fairly small `standard library` and a `core library`.
This is absolutely contrary to Python's "batteries included" philosophy, so you are going to have to work to justify it.
If Python had a small `standard library` and `core library`, it would standardize the basic functionality that every Python implementation should provide ...
I would have expected (and your examples show) *less* standardization, not more.
-- Rhodri James *-* Kynesim Ltd
-- Stefane Fermigier - http://fermigier.com/ - http://twitter.com/sfermigier - http://linkedin.com/in/sfermigier Founder & CEO, Abilian - Enterprise Social Software - http://www.abilian.com/ Chairman, National Council for Free & Open Source Software (CNLL) - http://cnll.fr/ Founder & Organiser, PyParis & PyData Paris - http://pyparis.org/ & http://pydata.fr/

This is still true. There are some of us who will scream very, very loudly if the std lib disappears from Python installers. However, I think there could easily be a way to satisfy both parties here. How difficult would it be to release both full and minimal installers for every release? That way everyone could have it their way. --Edwin On 6/16/2020 6:51 AM, Stéfane Fermigier wrote:
It has also been discussed that some people can't rely on anything outside the standard packages, due to the constraints of their company or their field.

Edwin Zimmerman writes:
This is still true. There are some of us who will scream very, very loudly if the std lib disappears from Python installers. However, I think there could easily be a way to satisfy both parties here. How difficult would it be to release both full and minimal installers for every release? That way everyone could have it their way.
Not too hard, but that's a distro problem. Python is a development organization. It's traditionally been considered that producing a series of "official" installers for Windows and Mac is useful, but it will be a heavy lift to go further.

Stephen J. Turnbull wrote:
Edwin Zimmerman writes:
This is still true. There are some of us who will scream very, very loudly if the std lib disappears from Python installers. However, I think there could easily be a way to satisfy both parties here. How difficult would it be to release both full and minimal installers for every release? That way everyone could have it their way.
Not too hard, but that's a distro problem. Python is a development organization. It's traditionally been considered that producing a series of "official" installers for Windows and Mac is useful, but it will be a heavy lift to go further.
The entire .NET Core library is modularized, which means I can update any package without updating the underlying runtime, and that is cool. It would allow any Python package to be developed independently and to receive updates sooner (for example, a security issue should be fixed as soon as possible in our "Fast" World ;)

Edwin Zimmerman wrote:
This is still true. There are some of us who will scream very, very loudly if the std lib disappears from Python installers. However, I think there could easily be a way to satisfy both parties here. How difficult would it be to release both full and minimal installers for every release? That way everyone could have it their way. --Edwin On 6/16/2020 6:51 AM, Stéfane Fermigier wrote:
It has also been discussed that some people can't rely on anything outside the standard packages, due to the constraints of their company or their field.
This attempt would at least bring some standardization of libraries across the different Python implementations: CPython, PyPy, Brython, RustPython, MicroPython, Pycopy. For example, if a user writes an application in RustPython for WASM, to be executed in the browser, it could use only the libcore API for its base logic and some platform-specific API within that Python implementation. A similar attempt was made previously by .NET with .NET Standard, which allows writing libraries that can run on any implementation of .NET (.NET Framework or Mono). Since Python has a lot of different implementations, it would be nice to know that if I use libcore functionality it will run everywhere, and that if I use libstd it will run only where the system requirements allow the full package, or where the author of the implementation has implemented those packages.

On Tue, 16 Jun 2020 at 11:53, Stéfane Fermigier <sf@fermigier.com> wrote:
the "batteries included" argument was a huge selling points years ago (when Aaron Watters wrote "Internet Programming With Python", for instance) but I think the situation has changed quite a bit since then.
For some people/situations, yes. For others, definitely not. "Batteries included" is still an extremely significant selling point for some users of Python. Anyone contributing to this discussion should be *very* careful not to assume that everyone using Python works in a similar environment to them. For example, many people work in environments where Internet access is tightly controlled one way or another. Not all countries or regions have reliable or widespread internet access. And some people simply need scripts to be easy to run, so "install Python and run this script" is as complex as it can get. Paul

Paul Moore wrote:
the "batteries included" argument was a huge selling points years ago (when Aaron Watters wrote "Internet Programming With Python", for instance) but I think the situation has changed quite a bit since then. For some people/situations, yes. For others, definitely not. "Batteries included" is still an extremely significant selling point for some users of Python. Anyone contributing to this discussion should be very careful not to assume that everyone using Python works in a similar environment to
On Tue, 16 Jun 2020 at 11:53, Stéfane Fermigier sf@fermigier.com wrote: them. For example, many people work in environments where Internet access is tightly controlled one way or another. Not all countries or regions have reliable or widespread internet access. And some people simply need scripts to be easy to run, so "install Python and run this script" is as complex as it can get. Paul
The modularization I suggest is just to separate the modules and define clean interfaces between them, in such a way that you could either use the standard module provided when you download CPython or update that module from PyPI.

Rhodri James wrote:
On 16/06/2020 10:23, redradist@gmail.com wrote:
I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately.
Uh, aren't the runtime and standard library already separate creatures?
As long as I cannot update the version of a standard library package separately from the CPython version - no, they are not separate creatures ;)

On Tue, Jun 16, 2020 at 12:41:58PM -0000, redradist@gmail.com wrote:
As long as I cannot update the version of a standard library package separately from the CPython version - no, they are not separate creatures ;)
Why would you want to? That just sounds like adding extra complexity and pain for no benefit. Instead of requirements:
- requires Python 3.5 or better
you have requirements:
- Python 3.5 or better
- math 2.7 or better
- sys 2.1 or better
- glob 5.9 or better
- etc.
This does not seem like an improvement to me. I like going to StackOverflow, and if I read a solution or recipe that says "tested with Python 3.8" I know it will run in 3.8, without having to guess what the minimum requirements for each module are.
Some of the Linux distros already split the stdlib into pieces. This is a real pain, especially for beginners. The process changes from:
$ dnf install python3 # or apt-get or whatever package manager you use
and everything documented at python.org Just Works straight out of the box, to a much more annoying process:
$ dnf install python3
and then you have mysterious ImportErrors because some modules aren't installed, and you have to try to work out how to install them, and that's not an easy task:
$ dnf search python3 | wc -l
3511
-- Steven

On Wed, Jun 17, 2020 at 10:07 AM Steven D'Aprano <steve@pearwood.info> wrote:
Some of the Linux distros already split the stdlib into pieces. This is a real pain, especially for beginners. The process changes from:
$ dnf install python3 # or apt-get or whatever package manager you use
and everything documented at python.org Just Works straight out of the box, to a much more annoying process:
$ dnf install python3
and then you have mysterious ImportErrors because some modules aren't installed, and you have to try to work out how to install them, and that's not an easy task:
$ dnf search python3 | wc -l
3511
On Debian, "apt install python3" gives you everything in the standard library, but you can "apt install python3-minimal" to get just part of it for a smaller installation with fewer dependencies. I believe this is the correct way to do things - the most obvious thing will indeed Just Work. If there are standard library modules that aren't installed, I think they get replaced with stubs, to ensure that the ImportErrors you get are at least informative. No idea what the Fedora folks do there. In any case, though, this kind of breaking up of the stdlib is carefully managed by the distribution. You don't get dependency hell because the packages are all synchronized. I do NOT want any sort of system where the stdlib can be updated on a separate schedule to the binary, because then you'd need exactly what Steven said with lots of version number requirements. ChrisA

Steven D'Aprano wrote:
On Tue, Jun 16, 2020 at 12:41:58PM -0000, redradist@gmail.com wrote:
As long as I cannot update the version of a standard library package separately from the CPython version - no, they are not separate creatures ;)
Why would you want to? That just sounds like adding extra complexity and pain for no benefit. Instead of requirements:
- requires Python 3.5 or better
you have requirements:
- Python 3.5 or better
- math 2.7 or better
- sys 2.1 or better
- glob 5.9 or better
- etc.
This does not seem like an improvement to me. I like going to StackOverflow, and if I read a solution or recipe that says "tested with Python 3.8" I know it will run in 3.8, without having to guess what the minimum requirements for each module are.
Some of the Linux distros already split the stdlib into pieces. This is a real pain, especially for beginners. The process changes from:
$ dnf install python3 # or apt-get or whatever package manager you use
and everything documented at python.org Just Works straight out of the box, to a much more annoying process:
$ dnf install python3
and then you have mysterious ImportErrors because some modules aren't installed, and you have to try to work out how to install them, and that's not an easy task:
$ dnf search python3 | wc -l
3511
Why do I want that? Okay, here are the reasons:
1) Security issues should be fixed as soon as possible, without waiting 2 months or 1 year for the next CPython release.
2) Modules could evolve independently, which would allow using some features of a package earlier ... (crucial in our "Fast" World)
3) If the library is modularized, I can remove parts of it in constrained environments (microcontrollers) or in environments where we try to save every byte of disk space.
Interfaces between modules would be thinner and more visible, which would allow downloading only as many packages as a given module or library needs. A modularized library would have two versions (the runtime version and its own version).

On Mon, Jun 29, 2020 at 10:20:40AM -0000, redradist@gmail.com wrote:
Why do I want that? Okay, here are the reasons:
1) Security issues should be fixed as soon as possible, without waiting 2 months or 1 year for the next CPython release.
That is an excellent argument, but is that *our* responsibility? There are many third party distributors that bundle Python and can provide a much faster bug fix schedule, e.g. Anaconda, Red Hat, other Linux distributions. (Apologies if I have missed anyone.) Some of them have more resources in time, money and available manpower than we have. If you want security fixes faster than the Python-Devs are capable of releasing them, perhaps you ought to pay a third-party?
2) Modules could evolve independently, which would allow using some features of a package earlier ... (crucial in our "Fast" World)
You say "crucial", I say "terrible". Our "fast world" is not something we should be encouraging. It is bad for people and bad for technology. Libraries in the std lib should not be evolving fast. Stability is more important than rapid development, and if a library is so experimental that it needs rapid development, then it is too experimental to be in the std lib. Third-party libraries can evolve as fast or as slow as they want; the Python std lib is under tension between people who want faster evolution and people who want stability, and we have to balance those two desires. As a compromise between "change once a month" and "change once a decade", I think Python's release cycle is not too bad.
3) If the library is modularized, I can remove parts of it in constrained environments (microcontrollers) or in environments where we try to save every byte of disk space. Interfaces between modules would be thinner and more visible, which would allow downloading only as many packages as a given module or library needs.
You can already do that. There are at least two currently maintained Pythons for small systems, MicroPython and CircuitPython. There may be others. The question is not whether Python's standard library can be split up, but whether we should force it to be split up for *everyone*, making everyone's life more complicated in order to simplify the needs of a minority of developers.
I have written a lot of code that has to run on older versions or installations without third-party libraries, so I have lots of feature-detection code:

    try:
        min([1], key=lambda x: None)
    except TypeError:
        # Current system is too old to support key functions.
        # Create our own basic version.
        ...

At the beginning, it's lots of fun to come up with clever ways to detect features which might be missing, and then find a work around. But it gets tiresome and painful very quickly. It is much better to work with a known environment: if I am running in Python 3.9, then *all these libraries and features come in a bundle*. If everything could change independently, then we would need feature-detection and version checks everywhere. That is not enjoyable, and it increases the complexity for *everyone* even when they get no benefit. -- Steven
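For contrast, the "known environment" approach described above reduces to a single up-front version gate rather than per-feature probes; a minimal sketch, with 3.9 used only as an example threshold:

    import sys

    # One check at startup replaces scattered feature detection: once the
    # interpreter version is known, the whole stdlib bundled with that
    # version is known to be present.
    if sys.version_info < (3, 9):
        raise SystemExit("This script requires Python 3.9 or newer")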

Steven D'Aprano wrote:
The question is not whether Python's standard library can be split up, but whether we should force it to be split up for everyone, making everyone's life more complicated in order to simplify the needs of a minority of developers. [...] If everything could change independently, then we would need feature-detection and version checks everywhere. That is not enjoyable, and it increases the complexity for everyone even when they get no benefit.
It seems like you did not get my point ... :( I am not asking to remove or replace the Python standard library, but to split it into two tiers (a standard, like .NET Standard, or Rust's libcore and libstd). Modularization would also be nice, but it requires time ... Requiring that every Python implementation support at least libcore (the Core Library) would make the whole infrastructure more uniform. libcore could be all packages that are written in pure Python, do not access OS resources, and are essential for the ecosystem; libstd would be all other standard modules. libcore and libstd are a small step towards modularizing and standardizing the whole Python ecosystem.
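A purely illustrative sketch of such a two-tier split (the module assignments and tier rules below are hypothetical examples, not a concrete proposal):

    # Hypothetical partition: pure-Python, OS-independent modules in "libcore",
    # OS-facing modules in "libstd", everything else expected from PyPI.
    LIBCORE = {"abc", "collections", "functools", "itertools", "string"}
    LIBSTD = {"os", "socket", "ssl", "subprocess", "threading"}

    def tier(module_name):
        if module_name in LIBCORE:
            return "libcore"  # every implementation must provide these
        if module_name in LIBSTD:
            return "libstd"   # full implementations (e.g. CPython, PyPy) provide these
        return "PyPI"         # everything else ships as a separate package

    print(tier("itertools"), tier("socket"), tier("json"))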

On Tue, Jun 30, 2020 at 12:00 AM <redradist@gmail.com> wrote:
It seems like you did not get my point ... :(
I am not asking to remove or replace the Python standard library, but to split it into two tiers (a standard, like .NET Standard, or Rust's libcore and libstd).
Or perhaps you didn't get his point, which is that that would be a really bad thing to do. ChrisA

Rhodri James wrote:
On 16/06/2020 10:23, redradist@gmail.com wrote:
I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately.
Uh, aren't the runtime and standard library already separate creatures?
As long as I cannot update the `CPython` standard library separately from the runtime itself - no, they are not separate creatures ;)

Hello, On Tue, 16 Jun 2020 09:23:23 -0000 redradist@gmail.com wrote:
Hi all, me again ... :)
I think it would be desirable to modularize the Python standard library and provide parts of it through PyPI. That would make it possible to evolve the runtime and the standard library separately.
This is what the Pycopy project (https://github.com/pfalcon/pycopy) has been doing for several years (well, the first few years were done within the scope of the MicroPython project). For example, https://pypi.org/project/pycopy-os/ is an "os" module, installable separately for projects/deployments which need it (i.e. which need a CPython-compatible "os" in all its bloated glory). What Pycopy provides built in is a "uos" module, containing just a subset of CPython's "os" functions (with small extensions covering additional use cases), enough to write simple but useful applications. [] -- Best regards, Paul mailto:pmiscml@gmail.com
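For illustration, the compatibility pattern this enables looks roughly like the following; the exact contents of "uos" are Pycopy/MicroPython specifics, so treat this as a sketch rather than a definitive recipe:

    # Portable code can prefer the minimal built-in module and fall back to
    # the standard one, so the same script runs on a small Pycopy/MicroPython
    # deployment and on CPython.
    try:
        import uos as os   # built-in subset of "os" on Pycopy/MicroPython
    except ImportError:
        import os          # full "os" module on CPython

    print(os.getcwd())     # available in both the subset and the full module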

I like what the `Pycopy` maintainers did !! It allows the runtime and the standard library to evolve separately ;) A library maintainer could provide fixes faster, instead of waiting for CPython's one-year release cycle.
participants (9):
- Chris Angelico
- Edwin Zimmerman
- Paul Moore
- Paul Sokolovsky
- redradist@gmail.com
- Rhodri James
- Stephen J. Turnbull
- Steven D'Aprano
- Stéfane Fermigier