Standard library separation from core (was Re: My initial thoughts on the steps/blockers of the transition)
On 5 January 2016 at 12:50, Nicholas Chammas wrote:
Something else to consider. We’ve long talked about splitting out the stdlib to make it easier for the alternative implementations to import. If some or all of them also switch to git, we could do that pretty easily with git submodules.
Not to derail here, but wasn’t there a discussion (perhaps on python-ideas) about slowly moving to a model where we distribute a barebones Python “core”, allowing the standard modules to be updated and released on a more frequent cycle? Would this be one small step towards such a model?
That discussion has been going on for years :)

The most extensive elaboration is in the related PEPs:

PEP 407 considered the idea of distinguishing normal releases and LTS releases: https://www.python.org/dev/peps/pep-0407/

PEP 413 considered decoupling standard library versions from language versions: https://www.python.org/dev/peps/pep-0413/

The ripple effect of either proposal on the wider community would have been huge though, hence why 407 is Deferred and 413 Withdrawn.

Instead, the main step which has been taken (driven in no small part by the Python 3 transition) is the creation of PyPI counterparts for modules that see substantial updates that are backwards compatible with earlier versions (importlib2, for example, lets you use the Python 3 import system in Python 2). Shipping pip by default with the interpreter runtime is also pushing people more towards the notion that "if you're limiting yourself to the standard library, you're experiencing only a fraction of what the Python ecosystem has to offer you".

We don't currently do a great job of making those libraries *discoverable* by end users, but they're available if you know to look for them (there's an incomplete list at https://wiki.python.org/moin/Python2orPython3#Supporting_Python_2_and_Python... )

pip's inclusion was also the first instance of CPython shipping a *bundled* library that isn't maintained through the CPython development process - each new maintenance release of CPython ships the latest upstream version of pip, rather than being locked to the version of pip that shipped with the corresponding x.y.0 release.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
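[In code, the PyPI-counterpart approach Nick describes is typically consumed with a guarded import: prefer the backport if it's installed, otherwise fall back to the bundled standard library module. A minimal sketch using importlib2, the example from the message above - on an interpreter without the backport installed, the except branch simply picks up the stdlib module:]

```python
# Prefer the standalone PyPI backport when it is available; fall back
# to the interpreter's bundled standard library copy otherwise.
try:
    import importlib2 as importlib  # the Python 2 backport mentioned above
except ImportError:
    import importlib

# Either way, callers use the same API.
mod = importlib.import_module("json")
print(mod.dumps({"ok": True}))
```

Either branch binds the name `importlib`, so the rest of the program doesn't care which version it got - which is exactly what lets the PyPI release ship updates on its own schedule.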
Thanks for sharing that background, Nick.
Instead, the main step which has been taken (driven in no small part by the Python 3 transition) is the creation of PyPI counterparts for modules that see substantial updates that are backwards compatible with earlier versions (importlib2, for example, lets you use the Python 3 import system in Python 2).

So is the intention that, over the long term, these PyPI counterparts would cannibalize their standard library equivalents in terms of usage?
On 5 January 2016 at 14:14, Nicholas Chammas wrote:
Thanks for sharing that background, Nick.
Instead, the main step which has been taken (driven in no small part by the Python 3 transition) is the creation of PyPI counterparts for modules that see substantial updates that are backwards compatible with earlier versions (importlib2, for example, lets you use the Python 3 import system in Python 2).
So is the intention that, over the long term, these PyPI counterparts would cannibalize their standard library equivalents in terms of usage?
Probably not - the baseline versions will almost certainly always be used more heavily simply due to being available by default.

What the PyPI releases mean is that the folks for whom the standard library version is old enough to be annoying now have the freedom to choose between selectively updating just that component and upgrading to a new version of the language runtime, and the former is important when you don't have full control over the target runtime environment (e.g. many folks are paid to support the system Python runtimes on various versions of Linux, and only drop support for those old versions when the Linux vendors do).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
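[As a sketch of what "selectively updating just that component" can look like in practice: a project can declare the PyPI counterpart only for the old runtimes that need it, using an environment marker in its requirements (PEP 508 syntax; importlib2 is just the example used earlier in the thread):]

```
# requirements.txt fragment: pull in the importlib2 backport only on
# Python 2 runtimes - Python 3 already ships the new import system.
importlib2; python_version < "3.0"
```

On the system Pythons Nick mentions, this upgrades the one annoying component without touching the vendor-supported runtime itself.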
On Mon, 4 Jan 2016 at 21:22 Nick Coghlan wrote:

Probably not - the baseline versions will almost certainly always be used more heavily simply due to being available by default.
If you guys want to continue this conversation, the stdlib-sig is the perfect place to have it.
participants (3)
- Brett Cannon
- Nicholas Chammas
- Nick Coghlan