Why is pickle.DEFAULT_PROTOCOL still 3?
Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default. There are a number of things that would run faster with it, like multiprocessing. This is too late for 3.7, which is a shame, but can we at least bump it for 3.8? - Ł
On Mon, 2 Apr 2018 13:48:46 -0700
Lukasz Langa
Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions (the same reason protocol 0 stayed the default throughout the Python 2 lifetime). Regards, Antoine.
On Apr 2, 2018, at 2:13 PM, Antoine Pitrou
wrote: On Mon, 2 Apr 2018 13:48:46 -0700 Lukasz Langa
wrote: Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions. (the same reason protocol 0 stayed the default throughout the Python 2 lifetime)
Alright, so that means we can easily do this for Python 3.8, right? I mean, following Christian's logic, Python 3.3 is already dead, with its final release done in February 2016 and support dropped in September 2017 per PEP 398. I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this. What do you think? - Ł
On Mon, Apr 2, 2018 at 3:57 PM Lukasz Langa
On Apr 2, 2018, at 2:13 PM, Antoine Pitrou
wrote: On Mon, 2 Apr 2018 13:48:46 -0700 Lukasz Langa
wrote: Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions. (the same reason protocol 0 stayed the default throughout the Python 2 lifetime)
Alright, so that means we can easily do this for Python 3.8, right? I mean, following Christian's logic, Python 3.3 is already dead, with its final release done in February 2016 and support dropped in September 2017 per PEP 398.
I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
What do you think?
+1 - That puts into words the reason I suggest just running with the change for 3.8. If we had noticed in time, I'd have suggested doing it in 3.7; it's too late now and not a huge deal, so just wait for 3.8. Never changing the default during the 1.x and 2.x era led a lot of code to always specify HIGHEST_PROTOCOL, because the default was unreasonable. -gps
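The trade-off being discussed can be seen directly in the stdlib; a minimal sketch using only the pickle module (the protocol numbers printed depend on the running interpreter):

```python
import pickle

data = {"numbers": list(range(10))}

# dumps() with no explicit protocol uses pickle.DEFAULT_PROTOCOL, which is
# deliberately conservative so older Python 3 interpreters can read the result.
portable = pickle.dumps(data)

# Pinning HIGHEST_PROTOCOL opts into the newest format; interpreters that
# predate that protocol cannot load the result.
newest = pickle.dumps(data, protocol=pickle.HIGHEST_PROTOCOL)

assert pickle.loads(portable) == pickle.loads(newest) == data
print("default:", pickle.DEFAULT_PROTOCOL, "highest:", pickle.HIGHEST_PROTOCOL)
```

Code that always pins HIGHEST_PROTOCOL gains speed and compactness at the cost of cross-version portability, which is the behavior described above.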
On Mon, 2 Apr 2018 15:57:11 -0700
Lukasz Langa
On Apr 2, 2018, at 2:13 PM, Antoine Pitrou
wrote: On Mon, 2 Apr 2018 13:48:46 -0700 Lukasz Langa
wrote: Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions. (the same reason protocol 0 stayed the default throughout the Python 2 lifetime)
Alright, so that means we can easily do this for Python 3.8, right? I mean, following Christian's logic, Python 3.3 is already dead, with its final release done in February 2016 and support dropped in September 2017 per PEP 398.
I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
What do you think?
That sounds reasonable to me. Let's see whether other people disagree. Regards Antoine.
On 03.04.18 01:57, Lukasz Langa wrote:
On Apr 2, 2018, at 2:13 PM, Antoine Pitrou wrote:
On Mon, 2 Apr 2018 13:48:46 -0700, Lukasz Langa wrote:
Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions (the same reason protocol 0 stayed the default throughout the Python 2 lifetime).
Alright, so that means we can easily do this for Python 3.8, right? I mean, following Christian's logic, Python 3.3 is already dead, with its final release done in February 2016 and support dropped in September 2017 per PEP 398.
I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
What do you think?
The only possible drawback of protocol 4 is that very short pickles can be longer than with protocol 3 due to additional 9 bytes for the FRAME header and less compact pickling of globals. This may be partially compensated by implementing additional optimizations and/or passing short pickles through pickletools.optimize().
On Tue, 3 Apr 2018 at 01:19 Serhiy Storchaka
On 03.04.18 01:57, Lukasz Langa wrote:
On Apr 2, 2018, at 2:13 PM, Antoine Pitrou wrote:
On Mon, 2 Apr 2018 13:48:46 -0700, Lukasz Langa wrote:
Pickle protocol version 4 was originally defined in PEP 3154 back in 2011 and shipped as part of Python 3.4 in 2014. Yet it's still not the default.
Because we want pickles produced with the default to be readable by earlier Python 3 versions (the same reason protocol 0 stayed the default throughout the Python 2 lifetime).
Alright, so that means we can easily do this for Python 3.8, right? I mean, following Christian's logic, Python 3.3 is already dead, with its final release done in February 2016 and support dropped in September 2017 per PEP 398.
I think we need to get past thinking about "Python 2" vs. "Python 3".
This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
What do you think?
The only possible drawback of protocol 4 is that very short pickles can be longer than with protocol 3 due to additional 9 bytes for the FRAME header and less compact pickling of globals.
This may be partially compensated by implementing additional optimizations and/or passing short pickles through pickletools.optimize().
I think if you're worried about specifics at that level of detail, then you should be specifying the protocol level manually anyway. I view this like something in the peepholer: a freebie perf bump for those who aren't too worried about such things. :)
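Serhiy's size concern can be checked empirically; a small sketch (exact byte counts vary across Python versions, so none are asserted here):

```python
import pickle
import pickletools

obj = 1  # a very small payload, where fixed per-pickle overhead matters most

for proto in (3, 4):
    raw = pickle.dumps(obj, protocol=proto)
    slim = pickletools.optimize(raw)  # drops unused PUT opcodes
    print(f"protocol {proto}: {len(raw)} bytes raw, {len(slim)} bytes after optimize()")
    assert pickle.loads(slim) == obj  # optimizing never changes the payload
```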
Breaking this off from the pickle thread because it seems unrelated: On 04/02/2018 06:57 PM, Lukasz Langa wrote:
I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
Maybe this has already been discussed ad nauseum, but is the idea here that Python will stay on Python 3.x, but also start breaking backwards compatibility with old versions? That would seem to be a violation of semantic versioning.

I think if this is going to happen, it should either be that the major version number gets bumped with every compatibility-breaking release, or Python should switch to some form of calendrical versioning (which may amount to more or less the same thing, but be different enough that people won't freak out when you say Python 4 is coming): https://calver.org

Switching to CalVer is a pretty clear sign that there is now a "rolling backwards compatibility window", and it allows Python to skip right over the mythical "Python 4" and directly to "Python 21". Additionally, since the version number will be trivially predictable, deprecation warnings can actually include the version after which they will be dropped - so if a feature is slated to be removed 5 years after it is initially deprecated, just take the deprecation release version and add 5.
On 3 April 2018 at 13:51, Paul G
Maybe this has already been discussed ad nauseum, but is the idea here that Python will stay on Python 3.x, but also start breaking backwards compatibility with old versions? That would seem to be a violation of semantic versioning.
Python's versions don't follow strict semantic versioning. See https://docs.python.org/3/faq/general.html#how-does-the-python-version-numbe... Paul
That documentation seems like a "layman's explanation" of how semantic versioning works. I suspect anyone familiar with semantic versioning will read that and think, "Ah, yes, this is a semantic versioning scheme."

Regardless of the semantics (har har) of whether Python "follows strict semantic versioning", a change to the versioning scheme (CalVer should be backwards compatible with SemVer, mind you, since (21, 0, 0) > (3, 8, 0)) should make it absolutely clear that Python is not following SemVer.

Counter-intuitively, I think the *fact* of pinning the major version number to 3 is a worse signal of "we're not going to break everything" than switching to CalVer would be. By switching to CalVer, you are removing the *ability* to signal a discontinuous major breaking change just by the version number. It is very much a "burn your boats so you can't retreat" philosophy to versioning.

Of course, if we want to reserve the ability to have sudden and major breaking changes, then yes, sticking with the current semi-SemVer system is fine, but I suspect that the fact that minor releases can break backwards compatibility will be confusing and annoying for most people (not me, because I know about it and I test against nightly), and as long as there's a "3" in the "major" slot, people will speculate about the possibility of a "4".

On 04/03/2018 09:07 AM, Paul Moore wrote:
On 3 April 2018 at 13:51, Paul G
wrote: Maybe this has already been discussed ad nauseum, but is the idea here that Python will stay on Python 3.x, but also start breaking backwards compatibility with old versions? That would seem to be a violation of semantic versioning.
Python's versions don't follow strict semantic versioning. See https://docs.python.org/3/faq/general.html#how-does-the-python-version-numbe...
Paul
On 3 April 2018 at 23:24, Paul G
That documentation seems like a "layman's explanation" of how semantic versioning works. I suspect anyone familiar with semantic versioning will read that and think, "Ah, yes, this is a semantic versioning scheme."
Anyone that reads the porting section in one of our What's New documents will quickly learn that we *don't* use semantic versioning - we use rolling deprecation windows, and have done so for decades :)
Regardless of the semantics (har har) of whether Python "follows strict semantic versioning", a change to the versioning scheme (CalVer should be backwards compatible with SemVer, mind you, since (21, 0, 0) > (3, 8, 0)) should make it absolutely clear that Python is not following SemVer.
Counter-intuitively, I think the *fact* of pinning the major version number to 3 is a worse signal of "we're not going to break everything" than switching to CalVer could. By switching to CalVer, you are removing the *ability* to signal a discontinuous major breaking change just by the version number. It is very much a "burn your boats so you can't retreat" philosophy to versioning.
The reason for sticking with 3.x for a while is the corner *nix systems have gotten themselves into regarding the "python" symlink, and the fact that it currently still points to "python2" (if it exists at all). Once we've updated PEP 394 to recommend pointing it at Python 3 (which is currently looking like it might happen around the 3.8 or 3.9 time frame), then we can start talking about instead pointing it at Python 4.x, and making "python3" just be a confusingly named symlink to python4.x. That consideration applies regardless of whether the version change following the last 3.x release is a simple major version increment to Python 4.x or something CalVer-based like Python 22.x.
Of course, if we want to reserve the ability to have sudden and major breaking changes, then yes, sticking with the current semi-SemVer system is fine, but I suspect that the fact that minor releases can break backwards compatibility will be confusing and annoying for most people (not me, because I know about it and I test against nightly), and as long as there's a "3" in the "major" slot, people will speculate about the possibility of a "4".
Indeed, but the details of Python's version numbering scheme have a *lot* of backwards compatibility implications, since even we haven't always been diligent about including separators between the segments of the version number in various use cases (e.g. wheel compatibility tags allow the use of underscores for disambiguation, but I'd bet at least some code doesn't currently handle that correctly). That means going to a scheme like "22.x" would risk emitting version numbers that sort lexically lower than "27" in some contexts. Since it's a "not to be resolved until after 3.9" problem regardless, we have time to consider how we want to handle it :) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
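The lexical-sorting hazard can be sketched with hypothetical tag fragments (these strings are illustrative, not real wheel tags):

```python
# Numerically, a hypothetical Python 22.0 would be far newer than Python 2.7,
# but tags that concatenate version digits compare as strings: "cp220" sorts
# *before* "cp27" because '2' < '7' at the third character.
tags = ["cp27", "cp36", "cp220"]  # "cp220" is an invented, illustrative tag
print(sorted(tags))  # ['cp220', 'cp27', 'cp36']
assert sorted(tags) == ["cp220", "cp27", "cp36"]
```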
On 04/03/2018 10:10 AM, Nick Coghlan wrote:
The reason for sticking with 3.x for a while is the corner *nix systems have gotten themselves into regarding the "python" symlink, and the fact that it currently still points to "python2" (if it exists at all). Once we've updated PEP 394 to recommend pointing it at Python 3 (which is currently looking like it might happen around the 3.8 or 3.9 time frame), then we can start talking about instead pointing it at Python 4.x, and making "python3" just be a confusingly named symlink to python4.x.
I was definitely not suggesting this before the end of the Python 2 era, and I don't think Lukasz was suggesting that either. I'm suggesting it specifically for the period of time during which Python is using the "rolling backwards compatibility guarantee" standard laid out in the original posting. In terms of evaluating this proposal, it is worth considering, in order:

1. Is it a desirable goal?
2. Is it an achievable goal?
3. How, specifically, can it be achieved?
4. What is the implementation timeline?

If #1 and #2 are true (which is my contention), *then* the specific details can be ironed out about how and when. Ideally, if it is desirable and achievable, it might be prudent to wait to start the "rolling backwards compatibility guarantee" era as part of the new versioning scheme rollout, rather than updating the versioning scheme post hoc to reflect the new status quo.
On Tue, Apr 3, 2018 at 5:51 AM, Paul G
Maybe this has already been discussed ad nauseum, but is the idea here that Python will stay on Python 3.x, but also start breaking backwards compatibility with old versions? That would seem to be a violation of semantic versioning.
Python's versioning has worked this way since well before "semantic versioning" was invented... so I think it's more fair to say semantic versioning is a violation of Python ;-). CalVer does have the nice properties that it makes a public commitment that there will never be another "x.0"-style release, and that it encodes some useful information into the version (maybe it would be slightly easier to convince people to stop using 2.7 if it was "Python 2010"). But the current scheme is so ingrained in the culture at this point that I doubt there's much appetite for fiddling with it... -n -- Nathaniel J. Smith -- https://vorpus.org
On Tue, Apr 3, 2018 at 10:51 PM, Paul G
Breaking this off from the pickle thread because it seems unrelated:
On 04/02/2018 06:57 PM, Lukasz Langa wrote:
I think we need to get past thinking about "Python 2" vs. "Python 3". This frame of mind creates space for another mythical release of Python that will break all the compatibilities, something we promised not to do. A moving backward compatibility window that includes the last release still under security fixes seems like a good new framework for this.
Maybe this has already been discussed ad nauseum, but is the idea here that Python will stay on Python 3.x, but also start breaking backwards compatibility with old versions? That would seem to be a violation of semantic versioning.
Compatibility can be broken in small ways by minor versions. For instance, if you have a function "def await(x):", that will work perfectly on older Pythons, but will fail now that 'await' is a keyword. Does that require a major version bump? The pickle situation was one of changing a default, and if you need compatibility with ancient versions, you can specify protocol 0. By maintaining a sane default, we encourage people to use that rather than specifying protocol=-1 and breaking compatibility *immediately* when a new protocol is created. The expectation is that a minor version bump generally won't break things, but you test your code anyway, in case it does. (A revision release should never break anything that wasn't already broken, eg depending on a bug.) A change to the default pickle protocol seems like a fairly safe change to me, since it's so easy to override if you need to. When programs use calendar-based versioning, I'm left with no information as to whether it's breaking changes or not. In fact, it might as well have no version numbers whatsoever. If I care about backward compatibility, I just have to stick with the exact same unpatched version that I had before. Not a good thing for a language that releases periodic bugfix updates to older versions. ChrisA
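Chris's keyword example can be checked with compile(); on interpreters where 'await' became reserved (3.7 and later), the old definition no longer parses. A small sketch:

```python
# 'await' was an ordinary identifier through Python 3.6 and became a reserved
# keyword in 3.7, so source like this stopped parsing on a minor version bump.
src = "def await(x):\n    return x\n"

try:
    compile(src, "<example>", "exec")
    print("'await' is still a valid identifier on this interpreter")
except SyntaxError:
    print("'await' is a reserved keyword here; old code like this breaks")
```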
When programs use calendar-based versioning, I'm left with no information as to whether it's breaking changes or not. In fact, it might as well have no version numbers whatsoever. If I care about backward compatibility, I just have to stick with the exact same unpatched version that I had before. Not a good thing for a language that releases periodic bugfix updates to older versions.
Well, this is not true. As a general rule, you don't know if anything *you care about* breaks at a given version change anyway, and I've seen plenty of handwringing about what should and should not be considered a breaking change. A major change is basically the "nuclear option", and everything I've seen from core developers is "there will never be another major change on the scale of 2->3". Given that from an optics perspective there's no interest in creating a Python 4, and it's been repeatedly emphasized in this thread that Python doesn't use semantic versioning, I don't see how you can really say that the version number gives you much information at all about what will and will not break.

This thread started in response to a proposal to just have a rolling backwards compatibility window (e.g. no discontinuous major version changes), which is basically ideal for calendar versioning - any time you make a backwards incompatible change, you give X notice period. You can encode X directly into the deprecation warnings (e.g. "This feature will be removed in Python >= 2025"), that way users will know on a *per feature basis* where to pin, even before specific version numbers are announced.

Presumably if major breaking changes ever *do* occur, the mechanism for upgrading Python would be either to create a new program called something else (e.g. the core Python simply *does not break backwards compat*), or a slow rollout with individual feature flags that start out defaulting on (opt in to the new behavior), eventually default to off (opt out of the new behavior), and then are removed after a very long deprecation period.
On Tue, 3 Apr 2018 at 07:39 Paul G
When programs use calendar-based versioning, I'm left with no information as to whether it's breaking changes or not. In fact, it might as well have no version numbers whatsoever. If I care about backward compatibility, I just have to stick with the exact same unpatched version that I had before. Not a good thing for a language that releases periodic bugfix updates to older versions.
Well, this is not true. As a general rule, you don't know if anything *you care about* breaks at a given version change anyway, and I've seen plenty of handwringing about what should and should not be considered a breaking change. A major change is basically the "nuclear option", and everything I've seen from core developers is "there will never be another major change on the scale of 2->3". Given that from an optics perspective there's no interest in creating a Python 4 and it's been repeatedly emphasized in this thread that Python doesn't use semantic versioning, I don't see how you can really say that the version number gives you much information at all about what will and will not break.
Paul's point is that he knows e.g. code working in 3.6.0 will work when he upgrades to 3.6.5, and if his code is warning-free and works with all __future__ statements in 3.6 that it will work fine in 3.7. With CalVer you could make a similar promise if you keep the major version the year of release and then keep our feature/bugfix number promise like we already have, but at that point who cares about the year?
This thread started in response to a proposal to just have a rolling backwards compatibility window (e.g. no discontinuous major version changes), which is basically ideal for calendar versioning - any time you make a backwards incompatible change, you give X notice period. You can encode X directly into the deprecation warnings (e.g. "This feature will be removed in Python >= 2025"), that way users will know on a *per feature basis* where to pin, even before specific version numbers are announced.
While we have an 18 month schedule for releases, we also can't guarantee we will hit our release window in a specific year, e.g. 3.6 was first released in December but we could have slipped into January. That makes promises for a specific version tied to a year number problematic.
Presumably if major breaking changes ever *do* occur, the mechanism for upgrading Python would be either to create a new program called something else (e.g. the core Python simply *does not break backwards compat*), or a slow rollout with individual feature flags that start out defaulting on (opt in to the new behavior), eventually default to off (opt out of the new behavior), and then are removed after a very long deprecation period.
We already have this with __future__ statements. The transition from 2 to 3 was such a big deal because we changed so much without using __future__ statements. We've already said that going forward we are not skipping that step for any other releases.

If we chose to switch to semantic versioning, we would probably make it so that any version that makes a __future__ statement the new default is a major version bump. But then the issue becomes the stdlib. Do we drop everything that was previously deprecated in the stdlib for every major version bump? That might be a bit drastic when e.g. all you did was make old code that used `await` as a variable name raise a SyntaxError. And saying "deprecated for two versions" then becomes messy because you have no idea what that second version will be. And what if you **really** want to get rid of something in the next release? Can a single module dropping a function force a version bump for Python itself?

I have not read the other thread, but knowing it's from Lukasz makes me guess he wants to move up the minor number to become the major one in terms of our current semantics, so we have a rolling window of support. That would mean that as long as you are warnings-free and run fine with all __future__ statements turned on for version N, then N+1 should work for you without modification (sans relying on buggy semantics). That approach would deal with the above issues cleanly while dropping our current major-number semantics, which some people view as a fallacy and are getting upset over. (I'm personally +0 on this last idea as it's easy to explain to folks, and I have been doing Python version explanations a lot over the last several months.)
I personally see no reason to change anything.
On Tue, Apr 3, 2018 at 9:36 AM, Brett Cannon
On Tue, 3 Apr 2018 at 07:39 Paul G
wrote: When programs use calendar-based versioning, I'm left with no information as to whether it's breaking changes or not. In fact, it might as well have no version numbers whatsoever. If I care about backward compatibility, I just have to stick with the exact same unpatched version that I had before. Not a good thing for a language that releases periodic bugfix updates to older versions.
Well, this is not true. As a general rule, you don't know if anything *you care about* breaks at a given version change anyway, and I've seen plenty of handwringing about what should and should not be considered a breaking change. A major change is basically the "nuclear option", and everything I've seen from core developers is "there will never be another major change on the scale of 2->3". Given that from an optics perspective there's no interest in creating a Python 4 and it's been repeatedly emphasized in this thread that Python doesn't use semantic versioning, I don't see how you can really say that the version number gives you much information at all about what will and will not break.
Paul's point is that he knows e.g. code working in 3.6.0 will work when he upgrades to 3.6.5, and if his code is warning-free and works with all __future__ statements in 3.6 that it will work fine in 3.7. With CalVer you could make a similar promise if you keep the major version the year of release and then keep our feature/bugfix number promise like we already have, but at that point who cares about the year?
This thread started in response to a proposal to just have a rolling backwards compatibility window (e.g. no discontinuous major version changes), which is basically ideal for calendar versioning - any time you make a backwards incompatible change, you give X notice period. You can encode X directly into the deprecation warnings (e.g. "This feature will be removed in Python >= 2025"), that way users will know on a *per feature basis* where to pin, even before specific version numbers are announced.
While we have an 18 month schedule for releases, we also can't guarantee we will hit our release window in a specific year, e.g. 3.6 was first released in December but we could have slipped into January. That makes promises for a specific version tied to a year number problematic.
Presumably if major breaking changes ever *do* occur, the mechanism for upgrading Python would be either to create a new program called something else (e.g. the core Python simply *does not break backwards compat*), or a slow rollout with individual feature flags that start out defaulting on (opt in to the new behavior), eventually default to off (opt out of the new behavior), and then are removed after a very long deprecation period.
We already have this with __future__ statements. The transition from 2 to 3 was such a big deal because we changed so much without using __future__ statements. We've already said that going forward we are not skipping that step for any other releases.
If we chose to switch to semantic versioning, we would probably make it so that any version that makes a __future__ statement the new default is a major version bump. But then the issue becomes the stdlib. Do we drop everything that was previously deprecated in the stdlib for every major version bump? That might be a bit drastic when e.g. all you did was make old code that used `await` as a variable name raise a SyntaxError. And saying "deprecated for two versions" then becomes messy because you have no idea what that second version will be. And what if you **really** want to get rid of something in the next release? Can a single module dropping a function force a version bump for Python itself?
I have not read the other thread, but knowing it's from Lukasz makes me guess he wants to move up the minor number to become the major one in terms of our current semantics, so we have a rolling window of support. That would mean that as long as you are warnings-free and run fine with all __future__ statements turned on for version N, then N+1 should work for you without modification (sans relying on buggy semantics). That approach would deal with the above issues cleanly while dropping our current major-number semantics, which some people view as a fallacy and are getting upset over. (I'm personally +0 on this last idea as it's easy to explain to folks, and I have been doing Python version explanations a lot over the last several months.)
-- --Guido van Rossum (python.org/~guido)
On 04/03/2018 12:36 PM, Brett Cannon wrote:
On Tue, 3 Apr 2018 at 07:39 Paul G
wrote:
Paul's point is that he knows e.g. code working in 3.6.0 will work when he upgrades to 3.6.5, and if his code is warning-free and works with all __future__ statements in 3.6 that it will work fine in 3.7. With CalVer you could make a similar promise if you keep the major version the year of release and then keep our feature/bugfix number promise like we already have, but at that point who cares about the year?
I think it is reasonable to use a scheme like this: YY.MM.patch. The canonical release is numbered by Year-Month, and patches intended to add new features and break nothing get a patch release, no matter when they are released. That allows for simple pinning.
While we have an 18 month schedule for releases, we also can't guarantee we will hit our release window in a specific year, e.g. 3.6 was first released in December but we could have slipped into January. That makes promises for a specific version tied to a year number problematic.
I think no promises for a specific version number need to be made until *after* a release is made. Using calendar-based versioning facilitates using a calendar-based deprecation schedule, irrespective of how many releases have occurred between the deprecation and the removal. If the promise is "we will support this feature for 5 years", then you know ahead of time the minimum version number in which that feature will be removed. So say you have a function foo() called in your library that is deprecated in the 20.05.0 feature release. You know at that point that any release *after* 25.05.0 will drop support for foo, so on your next release that uses foo(), you set the upper bound on your `requires_python` to `<25.05.0`. It doesn't matter if that release happens in 2025-08 or 2027-01. Similarly if you are developing some cutting edge library that is using features from the "feature release" branch, you can set `requires_python` to some value with a month greater than the current calendar version.
If we chose to switch to semantic versioning, we would probably make it so that any version that makes a __future__ statement the new default is a major version bump. But then the issue becomes the stdlib. Do we drop everything that was previously deprecated in the stdlib for every major version bump? That might be a bit drastic when e.g. all you did was make old code that used `await` as a variable name raise a SyntaxError. And saying "deprecated for two versions" then becomes messy because you have no idea what that second version will be. And what if you **really** want to get rid of something in the next release? Can a single module dropping a function force a version bump for Python itself?
This is why I suggested a calendar-version based approach, along with a pre-commitment that the "rolling backwards compatibility window" would be a specific amount of *time*, not a specific number of versions. It makes it very easy to make predictions about when specific features are falling in and out of that window, because each deprecation warning can come *pre-populated* with the length of the deprecation period and, as a result, the maximum version that will support that feature. If that were done, you could even conceivably write a "compatibility checker" tool that can be run against a repo to calculate the `requires_python` upper bound and to order deprecation warnings by urgency.
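The deprecation arithmetic described above can be written down directly. The following is a hypothetical sketch, not an existing tool: the function names and the assumption of a YY.MM.patch version format come from the proposal in this thread, not from any real package.

```python
# Hypothetical sketch of the CalVer deprecation arithmetic proposed
# above: given the feature release that deprecated something and a
# support window in years, compute the first version that may drop it.

def removal_version(deprecated_in: str, window_years: int = 5) -> str:
    """Given a YY.MM.patch version string, return the YY.MM.0 version
    after which a feature deprecated in `deprecated_in` may be removed."""
    year, month, _patch = (int(part) for part in deprecated_in.split("."))
    return f"{year + window_years:02d}.{month:02d}.0"

def requires_python_bound(deprecated_in: str, window_years: int = 5) -> str:
    """Upper bound for a `requires_python` specifier for a library that
    still calls the deprecated feature."""
    return f"<{removal_version(deprecated_in, window_years)}"

# foo() deprecated in 20.05.0 with a 5-year window:
print(removal_version("20.05.0"))        # 25.05.0
print(requires_python_bound("20.05.0"))  # <25.05.0
```

The point of the sketch is that the bound depends only on the deprecating version and the promised window, never on how many releases happen in between.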
I have not read the other thread, but knowing it's from Lukasz makes me guess he wants to move up the minor number to become the major one in terms of our current semantics so we have a rolling window of support. That would mean that as long as you are warnings-free and run fine with all __future__ statements turned on for version N, then N+1 should work for you without modification (sans relying on buggy semantics). That approach would deal with the above issues cleanly while dropping our current major number semantics, which some people view as a fallacy and are getting upset over (I'm personally +0 on this last idea, as it's easy to explain to folks and I have been doing Python version explanations a lot over the last several months).
Lukasz did not propose any changes to the version number semantics, just the deprecation schedule. I was suggesting that if this happens, it's probably a good idea to "drop" the major version number - either by incrementing it with each release that breaks backwards compat (which, for a sufficiently continuous influx of rolling deprecations should be basically every feature release), or by switching to a calendar-based system. From above, I think a calendar-based system has the best features for end users.
On 2018-04-03 18:09, Paul G wrote:
On 04/03/2018 12:36 PM, Brett Cannon wrote:
On Tue, 3 Apr 2018 at 07:39 Paul G
wrote: Paul's point is that he knows, e.g., code working in 3.6.0 will keep working when he upgrades to 3.6.5, and that if his code is warning-free and works with all __future__ statements in 3.6, it will work fine in 3.7. With CalVer you could make a similar promise if you keep the major version as the year of release and then keep our feature/bugfix number promise like we already have, but at that point who cares about the year?
I think it is reasonable to use a scheme like this:
YY.MM.patch
Surely that should be: YYYY.MM.patch [snip]
On Apr 3, 2018, at 05:51, Paul G
Switching to CalVer is a pretty clear sign that there is now a "rolling backwards compatibility window", and it allows Python to skip right over the mythical "Python 4" and directly to "Python 21". Additionally, since the version number will be trivially predictable, deprecation warnings can actually include the version after which they will be dropped - so if a feature is slated to be removed 5 years after it is initially deprecated, just take the deprecation release version and add 5.
Changing the versioning scheme is a topic that comes up every now and then, and I think it’s worth exploring, but I also don’t think we should do anything about it until the EOL of Python 2.7 at the earliest (if ever).

That said, and assuming we keep the current scheme, I think there’s a natural place to label “Python 4” - we have to break the C API to get rid of the GIL and/or adopt GC over refcounting, or something of that nature. I think there is no interest or appetite for a Python source level breaking change like 2->3, so I wouldn’t expect anything more than the usual changes at the Python source level for 3.x->4. Of course, you’d like to mitigate the breakage of extension modules as much as possible, but should that be unavoidable, then it would warrant a first digit version bump.

OTOH, some calver-like scheme would be interesting too if we shorten the release cycle. Could we release a new Python version every 12 months instead of 18? Can we adopt a time-based release policy, so that whatever gets in, gets in, and the rest has to wait until the next release? That’s certainly more painful when that wait is 18 months rather than something shorter. But if releases come more quickly, that has implications for the deprecation policy too. And it puts pressure on the second digit because something like Python 3.53 is distasteful (especially because it would be easily confused with 3.5.3). Python 21.12 anyone? :)

Cheers,
-Barry
On Tue, 3 Apr 2018 at 11:18 Barry Warsaw
On Apr 3, 2018, at 05:51, Paul G
wrote: Switching to CalVer is a pretty clear sign that there is now a "rolling backwards compatibility window", and it allows Python to skip right over the mythical "Python 4" and directly to "Python 21". Additionally, since the version number will be trivially predictable, deprecation warnings can actually include the version after which they will be dropped - so if a feature is slated to be removed 5 years after it is initially deprecated, just take the deprecation release version and add 5.
Changing the versioning scheme is a topic that comes up every now and then, and I think it’s worth exploring, but I also don’t think we should do anything about it until the EOL of Python 2.7 at the earliest (if ever).
That said, and assuming we keep the current scheme, I think there’s a natural place to label “Python 4” - we have to break the C API to get rid of the GIL and/or adopt GC over refcounting, or something of that nature. I think there is no interest or appetite for a Python source level breaking change like 2->3, so I wouldn’t expect anything more than the usual changes at the Python source level for 3.x->4. Of course, you’d like to mitigate the breakage of extension modules as much as possible, but should that be unavoidable, then it would warrant a first digit version bump.
OTOH, some calver-like scheme would be interesting too if we shorten the release cycle. Could we release a new Python version every 12 months instead of 18? Can we adopt a time-based release policy, so that whatever gets in, gets in, and the rest has to wait until the next release? That’s certainly more painful when that wait is 18 months rather than something shorter. But if releases come more quickly, that has implications for the deprecation policy too. And it puts pressure on the second digit because something like Python 3.53 is distasteful (especially because it would be easily confused with 3.5.3). Python 21.12 anyone? :)
Are we at the PEP/language summit topic point yet in this discussion since Guido has said he's not interested in changing the status quo? ;) Versioning is like naming variables, so this thread could go on forever.
On Apr 3, 2018, at 13:08, Brett Cannon
Are we at the PEP/language summit topic point yet in this discussion since Guido has said he's not interested in changing the status quo? ;) Versioning is like naming variables, so this thread could go on forever.
Yeah probably so. And if you count the multiyear hiatus between reboots on this topic, yes it will likely go on forever. :) -Barry
On 04/03/2018 01:16 PM, Barry Warsaw wrote:
On Apr 3, 2018, at 13:08, Brett Cannon wrote:
Are we at the PEP/language summit topic point yet in this discussion since Guido has said he's not interested in changing the status quo? ;) Versioning is like naming variables, so this thread could go on forever.
Yeah probably so. And if you count the multiyear hiatus between reboots on this topic, yes it will likely go on forever. :)
Ah, so someone (Chris? ;) needs to write a PEP so it can be rejected, then? -- ~Ethan~
Barry Warsaw writes:
Python 21.12 anyone? :)
Well, for one thing we know that version 42 will be perfect! With current versioning policy, it will take a loooooong time to get there....

Steve

--
Associate Professor
Division of Policy and Planning Science
Faculty of Systems and Information
University of Tsukuba
Tennodai 1-1-1, Tsukuba 305-8573 JAPAN
http://turnbull.sk.tsukuba.ac.jp/
Email: turnbull@sk.tsukuba.ac.jp
Tel: 029-853-5175
On 2018-04-02 22:48, Lukasz Langa wrote:
Pickle protocol version 4.0 was originally defined back in PEP 3154 and shipped as part of Python 3.4 back in 2011. Yet it's still not the default. There's a number of things that would run faster with it like multiprocessing.
This is too late for 3.7 which is a shame but can we at least bump it for 3.8?
It sounds like a reasonable request. Python 3.4 is out of commission by then. I'm sure the release manager for 3.8 is going to agree with you, too. :) Christian
Given that, go ahead and change in master (3.8).
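As a side illustration of what the thread is debating (this sketch is not from the thread itself), the protocol is already selectable per call, so code that wants the protocol 4 improvements on 3.4-3.7 can opt in explicitly while `pickle.DEFAULT_PROTOCOL` stays at 3:

```python
# Minimal sketch: the module-wide default protocol vs. an explicit
# opt-in to protocol 4 (PEP 3154: framing, efficient large objects).
import pickle

data = {"numbers": list(range(10)), "text": "hello"}

default_blob = pickle.dumps(data)               # uses pickle.DEFAULT_PROTOCOL
explicit_blob = pickle.dumps(data, protocol=4)  # opt in to protocol 4

# Both round-trip identically; only the wire format differs, which is
# what makes the default a compatibility question for older readers.
assert pickle.loads(default_blob) == data
assert pickle.loads(explicit_blob) == data

# Protocol 2+ pickles begin with the PROTO opcode (0x80) followed by
# the protocol number, so the version is visible in the first bytes.
assert explicit_blob[:2] == b"\x80\x04"
```

Bumping the default simply changes what the first call produces; consumers on older 3.x versions that predate the new protocol are the ones affected.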
On Mon, Apr 2, 2018 at 3:13 PM Christian Heimes
On 2018-04-02 22:48, Lukasz Langa wrote:
Pickle protocol version 4.0 was originally defined back in PEP 3154 and shipped as part of Python 3.4 back in 2011. Yet it's still not the default. There's a number of things that would run faster with it like multiprocessing.
This is too late for 3.7 which is a shame but can we at least bump it for 3.8?
It sounds like a reasonable request. Python 3.4 is out of commission by then. I'm sure the release manager for 3.8 is going to agree with you, too. :)
Christian
participants (16)
- Antoine Pitrou
- Barry Warsaw
- Brett Cannon
- Chris Angelico
- Christian Heimes
- Ethan Furman
- Gregory P. Smith
- Guido van Rossum
- Lukasz Langa
- MRAB
- Nathaniel Smith
- Nick Coghlan
- Paul G
- Paul Moore
- Serhiy Storchaka
- Stephen J. Turnbull