Do we need to remove everything that's deprecated?
We're rebuilding many popular projects with Python 3.11 alpha, and I see many failures like:

AttributeError: module 'configparser' has no attribute 'SafeConfigParser'. Did you mean: 'RawConfigParser'? (bpo-45173)
ImportError: cannot import name 'formatargspec' from 'inspect' (bpo-45320)
AttributeError: '[...]Tests' object has no attribute 'failUnless' (bpo-45162)

Are these changes necessary? Does it really cost us that much in maintainer effort to keep a well-tested backwards compatibility alias, or a function that has a better alternative?

I think that rather than helping our users, changes like these are making Python projects painful to maintain. If we remove them to make Python easier for us to develop, is it actually that much easier to maintain now?

The current backwards compatibility policy (PEP 387) sets a *minimum* timeline for deprecations and removals -- "deprecation period must last at least two years." But it seems like it's not treated as a minimum: if any contributor sends an issue/PR to remove deprecated functionality, it's merged without much discussion. And it's very easy to "solve" these "issues", since everything is already marked for deletion; I fear we get the same kind of bikeshed/powerplant problem (https://bikeshed.com/), where easy changes attract action precisely because they need no discussion. It's just so much easier to do "spring cleaning" than to solve other problems.

Note that I am criticizing the *process*; the examples I gave have some people's names attached, and I have no doubt the people acted with the best intentions. I'm also not talking about code that's buggy, insecure, or genuinely hard to maintain.

If deprecation now means "we've come up with a new way to do things, and you have two years to switch", can we have something else that means "there's now a better way to do things; the old way is a bit worse but continues to work as before"?
The current backwards compatibility policy (PEP 387) sets a *minimum* timeline for deprecations and removals -- "deprecation period must last at least two years."
About the PEP 387 process and the 3 examples.

On Fri, Nov 12, 2021 at 11:58 AM Petr Viktorin <encukou@gmail.com> wrote:
AttributeError: module 'configparser' has no attribute 'SafeConfigParser'. Did you mean: 'RawConfigParser'? (bpo-45173)
SafeConfigParser was not even documented, was deprecated since Python 3.2, and emitted a DeprecationWarning.
ImportError: cannot import name 'formatargspec' from 'inspect' (bpo-45320)
It was deprecated in the doc since Python 3.5, and emitted a DeprecationWarning.
AttributeError: '[...]Tests' object has no attribute 'failUnless' (bpo-45162)
Deprecated in the doc since Python 3.1. It emitted a DeprecationWarning.
But it seems like it's not treated as a minimum (...) Note that I am criticizing the *process*
For these examples, the functions were deprecated for way longer than the minimum of 2 Python versions, no?

Victor

-- Night gathers, and now my watch begins. It shall not end until my death.
On 12. 11. 21 13:51, Victor Stinner wrote:
The current backwards compatibility policy (PEP 387) sets a *minimum* timeline for deprecations and removals -- "deprecation period must last at least two years."
About the PEP 387 process and the 3 examples.
On Fri, Nov 12, 2021 at 11:58 AM Petr Viktorin <encukou@gmail.com> wrote:
AttributeError: module 'configparser' has no attribute 'SafeConfigParser'. Did you mean: 'RawConfigParser'? (bpo-45173)
SafeConfigParser was not even documented, was deprecated since Python 3.2, and emitted a DeprecationWarning.
ImportError: cannot import name 'formatargspec' from 'inspect' (bpo-45320)
It was deprecated in the doc since Python 3.5, and emitted a DeprecationWarning.
AttributeError: '[...]Tests' object has no attribute 'failUnless' (bpo-45162)
Deprecated in the doc since Python 3.1. It emitted a DeprecationWarning.
But it seems like it's not treated as a minimum (...) Note that I am criticizing the *process*
For these examples, the functions were deprecated for way longer than the minimum of 2 Python versions, no?
Yes. And as far as I know, they haven't really caused problems in all that time.
For me, deprecated functions cause a lot of thinking when I meet them, both as a Python maintainer and as a Python user. Why is it still there? What is its purpose? Is there a better alternative? It's related to the Chesterton's fence principle. Sometimes, reading the doc is enough. Sometimes, I have to dig into the bug tracker and the Git history.

In Python, usually, there is a better alternative. A recent example is the asyncore module that I'm proposing to remove. This module has multiple design flaws which cause bugs in corner cases. It's somewhat dangerous to use this module. Deprecating the module doesn't help users who continue to use it and may get bugs in production. Removing the module forces users to think about why they chose asyncore and whether they can switch to a better alternative. It's supposed to help users avoid bugs.

The gray area is more about "deprecated aliases": having two ways to do the same thing, where one way is deprecated. One example is the removal of collections.MutableMapping: you must now use collections.abc.MutableMapping. Another example is the removal of the "U" mode in the open() function: the flag was simply ignored since Python 3.0. So far, the trend is to remove these "aliases" and force users to update their code. Not removing these aliases has been discussed, and it seems like each time, it was decided to remove them. Usually, the "old way" is deprecated for many Python versions, like 5 years if not longer.

Using deprecated functions is a threat in terms of technical debt. An application using multiple deprecated functions will break with a future Python version. It's safer to avoid deprecated functions whenever possible. Some deprecated functions have been removed but then restored for 1 or 2 more Python releases, to give users more time to update their code. In the end, the deprecated code is removed.
We can warn developers to pay attention to DeprecationWarning warnings, but sadly, in my experience, removal is the only trigger which works for everybody.

Do you have to repeat "You should check for DeprecationWarning in your code" in every "What's New in Python X.Y?" document? Python 3.9 has such a section: https://docs.python.org/dev/whatsnew/3.9.html#you-should-check-for-deprecati...

Victor

On Fri, Nov 12, 2021 at 11:58 AM Petr Viktorin <encukou@gmail.com> wrote:
We're rebuilding many popular projects with Python 3.11 alpha, and I see many failures like:
AttributeError: module 'configparser' has no attribute 'SafeConfigParser'. Did you mean: 'RawConfigParser'? (bpo-45173)
ImportError: cannot import name 'formatargspec' from 'inspect' (bpo-45320)
AttributeError: '[...]Tests' object has no attribute 'failUnless' (bpo-45162)
Are these changes necessary? Does it really cost us that much in maintainer effort to keep a well-tested backwards compatibility alias, or a function that has a better alternative?
I think that rather than helping our users, changes like these are making Python projects painful to maintain. If we remove them to make Python easier for us to develop, is it actually that much easier to maintain now?
The current backwards compatibility policy (PEP 387) sets a *minimum* timeline for deprecations and removals -- "deprecation period must last at least two years." But it seems like it's not treated as a minimum: if any contributor sends an issue/PR to remove deprecated functionality, it's merged without much discussion. And it's very easy to "solve" these "issues", since everything is already marked for deletion; I fear we get the same kind of bikeshed/powerplant problem (https://bikeshed.com/), where easy changes attract action precisely because they need no discussion. It's just so much easier to do "spring cleaning" than to solve other problems.
Note that I am criticizing the *process*; the examples I gave have some people's names attached, and I have no doubt the people acted with best intentions. I'm also not talking about code that's buggy, insecure, or genuinely hard to maintain.
If deprecation now means "we've come up with a new way to do things, and you have two years to switch", can we have something else that means "there's now a better way to do things; the old way is a bit worse but continues to work as before"?
_______________________________________________ Python-Dev mailing list -- python-dev@python.org To unsubscribe send an email to python-dev-leave@python.org https://mail.python.org/mailman3/lists/python-dev.python.org/ Message archived at https://mail.python.org/archives/list/python-dev@python.org/message/AYJOQL36... Code of Conduct: http://python.org/psf/codeofconduct/
-- Night gathers, and now my watch begins. It shall not end until my death.
On 12. 11. 21 14:18, Victor Stinner wrote:
For me, deprecated functions cause me a lot of thinking when I met them as a Python maintainer and as a Python user. Why is it still there? What is its purpose? Is there a better alternative? It's related to the Chesterton's fence principle. Sometimes, reading the doc is enough. Sometimes, I have to dig into the bug tracker and the Git history.
Could you just add a comment when you find the answer? And a note in the docs, for the users?
In Python, usually, there is a better alternative. A recent example is the asyncore module that I'm proposing to remove. This module has multiple design flaws which cause bugs in corner cases. It's somehow dangerous to use this module. Deprecating the module doesn't help users who continue to use it and may get bugs in production. Removing the module forces user to think about why they chose asyncore and if they can switch to a better alternative. It's supposed to help users to avoid bugs.
Right. If something's an attractive-looking trap, that's a reasonable reason to think about removing it. But that's not what I'm talking about here.
The gray area is more about "deprecated aliases" and having two ways to do the same things, but one way is deprecated. One example is the removal of collections.MutableMapping: you must now use collections.abc.MutableMapping. Another example is the removal the "U" mode in the open() function: the flag was simply ignored since Python 3.0. So far, the trend is to remove these "aliases" and force users to upgrade this code. Not removing these aliases has been discussed, and it seems like each time, it was decided to remove them. Usually, the "old way" is deprecated for many Python versions, like 5 years if not longer.
Using deprecated functions is a threat in terms of technical debt. An application using multiple deprecated functions will break with a future Python version.
But "will break with a future Python version" just means that people's code breaks because *we break it*. If we stopped doing that (in the simple cases of name aliases or functions that are older but not dangerous), then their code wouldn't break.
It's safe to avoid deprecated functions whenever possible. Some deprecated functions have been removed but then restored for 1 or 2 more Python releases, to give more time to users to upgrade their code. At the end, the deprecated code is removed.
We can warn developers to pay attention to DeprecationWarning warnings, but sadly, in my experience, the removal is the only trigger which works for everybody.
Do you have to repeat "You should check for DeprecationWarning in your code" in every "What's New in Python X.Y?" document? Python 3.9 has such section: https://docs.python.org/dev/whatsnew/3.9.html#you-should-check-for-deprecati...
Clearly, that's not working. Python users want to write commits that either bring value, or that are fun. Mass-replacing "failUnless" with "assertTrue" just because someone decided it's a better name is neither. Same with a forced move to the latest version of a function, if you don't use the bells and whistles it added.
I sympathize with the OP, but I think never removing deprecated names is the wrong solution. If never removing those names is the appropriate action, then they never should have been changed in the first place. That is, we should be (and I think are) very careful about gratuitously changing things.

Frankly, changing names in your code is a light lift. If you really want your code to not be touched, keep using an older version of Python.

BTW, tools can help a lot here. I make heavy use of pytest, and it recently changed its default warning policy so that I get the deprecation warnings early and often. It's just not that hard to keep up if the code is seeing any maintenance at all. It's a simple reality that code needs to be maintained.

Mass-replacing "failUnless" with "assertTrue" just because someone decided it's a better name

If there really was no better reason than that, the change never should have been made. But once made, keeping multiple names around forever is not a good option.

-CHB

-- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython
Maybe we can still provide those deprecated things, but require people to add them back manually and dynamically. That way, Python could partially support code written for another Python version by providing a minimal compatibility set, instead of requiring a separate virtual environment. For example, if a big project uses "SafeConfigParser", then to run it on a new Python without changing the code, it could declare the old name in an .xml file (or other file), and the new Python could make the corresponding modifications dynamically when the project runs.
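A minimal sketch of that idea in plain Python (hedged: this restores the alias at interpreter startup in code, rather than from the .xml file the poster imagines, and it relies on SafeConfigParser having been a plain alias of ConfigParser by the end of its deprecation period):

```python
import configparser

# Sketch: restore a removed alias dynamically so legacy code keeps running.
# SafeConfigParser ended up as a deprecated alias of ConfigParser before it
# was removed, so a simple attribute assignment is enough for this example.
if not hasattr(configparser, "SafeConfigParser"):
    configparser.SafeConfigParser = configparser.ConfigParser

# Old code that still says SafeConfigParser now works unchanged.
parser = configparser.SafeConfigParser()
parser.read_string("[server]\nhost = example.org\n")
print(parser.get("server", "host"))
```

A real implementation would read the list of names to restore from the project's compatibility file before importing the project itself.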
Victor Stinner writes:
In Python, usually, there is a better alternative.
As in life.
Do you have to repeat "You should check for DeprecationWarning in your code" in every "What's New in Python X.Y?" document?
That probably doesn't hurt, but I doubt it does much good for anybody except the bewildered first-day college intern who wonders why nobody on the devops team follows best practices. They'll learn to ignore those lines pretty quickly, too, I'm sure. ;-)

What I think would make a difference is a six-like tool for making "easy changes" like substituting aliases, and maybe marking other stuff that requires human brains to make the right changes. I'm not volunteering to do this; I don't even know that it's actually feasible. But I think that unless we're willing to bite that bullet, it's going to be difficult to make much progress over the current situation.

Deprecated code does normally more or less work, and often it never gets close to dangerous behavior. On the flip side, it often can cause dangerous behavior, and you won't know if it does until you do a thorough audit of your use case, which isn't going to happen because it would take as much effort as replacing the deprecated code.

I think we all see both sides, even if our own individual experience leads us to want to change the current balance. Unfortunately, as we see here, some folks want more removals, some people want fewer (or none).
On Sat, Nov 13, 2021 at 12:01 AM Stephen J. Turnbull
What I think would make a difference is a six-like tool for making "easy changes" like substituting aliases and maybe marking other stuff that requires human brains to make the right changes.
I think a "2to3"-like or "futurize"-like tool is a better idea, but yes.

The real challenge with the 2-3 transition was that many of us needed to keep the same code base running on both 2 and 3. But do we need to support running the same code on 3.5 to 3.10? I don't think so. If you can't upgrade Python to a supported version, you probably shouldn't upgrade your code or libraries.

Which is a thought -- maybe the policy should be that we remove things when the new way is usable in all supported versions of Python. So as of today (if I'm correct) anything needed only for 3.5 can be dropped.

I'm not volunteering to do this, I don't even know that it's actually feasible.
It's clearly feasible -- if the transition from 2 to 3 could be done, this is easy :-) Not that I'm volunteering either. But maybe the folks that find updating deprecated features onerous might want to do it (or already have -- I haven't looked).

Deprecated code does normally more or less work, and often it never gets close to dangerous behavior. On the flip side, it often can cause dangerous behavior,

I'm confused -- did you mean "sometimes cause dangerous behavior"? That's pretty rare, isn't it?

-CHB

-- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython
Christopher Barker writes:
On Sat, Nov 13, 2021 at 12:01 AM Stephen J. Turnbull
What I think would make a difference is a six-like tool for making "easy changes" like substituting aliases and maybe marking other stuff that requires human brains to make the right changes.
I think a “2to3” like or “futurize” like tool is a better idea, but yes.
That's what I meant, thanks for the correction.
The real challenge with the 2-3 transition was that many of us needed to keep the same code base running on both 2 and 3. But do we need to support running the same code on 3.5 to 3.10?
Need? No. Want to not raise a big middle finger to our users? Yes. Speaking for GNU Mailman (and I think I can do that in this case without getting lynched by the rest of our crew). I wouldn't mind if the tool gently suggests, ''' hey, folks, you can't really support both 3.5 AND 3.10 without a lot of "if hasattr(foo, 'frob')", so maybe you can drop support for 3.5? ''', though.
I’m confused — did you mean “sometimes cause dangerous behavior”? That’s pretty rare isn’t it?
FVO of "often" == "yeah, I've heard enough stories that I worry about it", I mean "often". We're talking about risk assessments, and I work with Internet-facing code, where there are no risks, just certain disasters at an uncertain but near-future date. ;-) Steve
On Sun, Nov 14, 2021 at 8:19 AM Stephen J. Turnbull < stephenjturnbull@gmail.com> wrote:
But do we need to support running the same code on 3.5 to 3.10?
Need? No. Want to not raise a big middle finger to our users?
Note that I said 3.5, not 3.6 -- 3.5 is no longer supported. If we feel the need to be backward compatible with unsupported versions, then we can't ever remove anything.

I wouldn't mind if the tool gently suggests, ''' hey, folks, you can't really support both 3.5 AND 3.10 without a lot of "if hasattr(foo, 'frob')", so maybe you can drop support for 3.5? ''', though.

Now I'm confused -- if you need the hasattr() calls, then you aren't supporting it. I guess I meant: running the same code without special-case code to handle the differences. Which is why I said "like 2to3" rather than "like six". I always hated six, even though it was a necessary evil.
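The kind of special-case code being discussed looks roughly like this (a sketch: a feature test on the inspect module, where the fallback branch only mattered on very old Pythons that still had the removed getargspec):

```python
import inspect

# Feature-test the module instead of checking sys.version_info: pick the
# modern function when it exists, fall back to the old one otherwise.
if hasattr(inspect, "getfullargspec"):
    get_argspec = inspect.getfullargspec
else:
    # Only reachable on very old Pythons where getargspec still existed.
    get_argspec = inspect.getargspec

def greet(name, punctuation="!"):
    return f"Hello, {name}{punctuation}"

# On any modern Python this resolves to getfullargspec.
print(get_argspec(greet).args)
```

This is exactly the clutter Chris objects to: once every supported Python has getfullargspec, the whole if/else collapses to a single import.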
I'm confused -- did you mean "sometimes cause dangerous behavior"? That's pretty rare, isn't it?

FVO of "often" == "yeah, I've heard enough stories that I worry about it", I mean "often".
Hmm -- is there any way to know which deprecations might actually be dangerous? For instance, it's hard to imagine a name change alone would be that, but I have a failure of imagination.

Eric V. Smith wrote:

I write systems that support old versions of python. We just moved to 3.7, for example.

But do you need to support no-longer-supported versions of Python? 3.7 is still supported, for just these reasons. Can we remove stuff that's only needed by unsupported versions of Python?

So you have an application running on 3.5. You really should upgrade Python anyway. When you do, you will need to run an "update_deprecated_stuff" script, and test, and you're good. Is that too much of a burden?

Frankly, even the 2to3 transition was less painful than I thought it would be. I had a substantial codebase written for py2 -- we couldn't go to three for quite some time, as we have a LOT of dependencies, and it was a while before they were all supported. When we finally made the transition it was less painful than I expected, and it would have been even less painful if we hadn't had a both-2-and-3 stage. And we've got a bunch of Cython code that bridges strings between Python and C++, too.

For all of the testing and signoffs we do for a single release, I've calculated that it costs us $5k just to release code to prod, even if there are no actual code changes.

But that's a fixed cost -- any maintained codebase is going to need updates and re-releases. I don't think anyone's suggesting that you do a release only to remove deprecations. For the example above -- if ALL you are doing is moving from running on Python 3.5 to running on a newer version, wouldn't that $5k cost have to be absorbed anyway?

-CHB

-- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython
Christopher Barker writes:
On Sun, Nov 14, 2021 at 8:19 AM Stephen J. Turnbull < stephenjturnbull@gmail.com> wrote:
But do we need to support running the same code on 3.5 to 3.10?
Need? No. Want to not raise a big middle finger to our users?
Note that I said 3.5, not 3.6 -- 3.5 is no longer supported. If we feel the need to be backward compatible with unsupported versions that we can't ever remove anything.
I'm not saying *Python* can't remove anything. I'm saying downstream, *GNU Mailman* has users it *may* want to support.

Unlike Petr, I'm not <0 on removals. But they are costly; keeping up with deprecations is costly. In theory, I agree with you that we should consider maintenance as an almost-fixed cost that a few deprecations aren't going to increase significantly. In practice, I'm not practiced enough to say. I do see a strong case for pruning stuff that we already found worth deprecating, as well.

In this thread, I'm most interested in exploring tooling to make it easier for those who express reservations about removals.

Steve
On Sun, Nov 14, 2021 at 10:06 PM Stephen J. Turnbull < stephenjturnbull@gmail.com> wrote:
I'm not saying *Python* can't remove anything. I'm saying downstream, *GNU Mailman* has users it *may* want to support.
So a project (not to pick on Mailman) may want to support its users running old versions of the code on new versions of Python? I've been confused about that kind of thing for years -- e.g. numpy has to support Python versions from ten years ago so users can get the latest numpy while running a really old version of Python?

I understand that there are sometimes IT policy issues, like "you have to use the system Python, on an old system, but you can still install the latest Python packages". And believe me, I work in an institution with many such irrational policies, but they are, indeed, irrational, and sometimes I can use the ammunition of saying: no, I can not run this application on that old OS. Period -- give me an updated system to get my job done.

Unlike Petr, I'm not <0 on removals. But they are costly, keeping up with deprecations is costly.
Absolutely, and hopefully that cost was considered when the deprecation was made -- we shouldn't second-guess it when it's time to actually remove the old stuff.

In this thread, I'm most interested in exploring tooling to make it easier for those who express reservations about removals.
Maybe reviving the future package to cover Python 3 changes? It appears not to have been updated for two years, which makes sense, as Py2 is no longer supported. But maybe its infrastructure could be updated to accommodate the newer changes. It should be an easier job than making a 2-3 compatible codebase. I found future really helpful in the 2-3 transition, and one nice thing about it is that it provided both translation a la 2to3 and compatibility a la six.

-CHB

-- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython
On 12. 11. 21 12:55, Petr Viktorin wrote:
AttributeError: '[...]Tests' object has no attribute 'failUnless' (bpo-45162)
This one has caused me trouble more than once. It is so easy to make a typo and write assertEquals instead of assertEqual, or assertRaisesRegexp instead of assertRaisesRegex. Tests pass, and warnings are ignored. Then I run tests with -Werror to test new warnings and get a lot of unrelated failures because of PRs merged in the last half-year. I am very glad that these long-ago-deprecated aliases are finally removed.
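Escalating DeprecationWarning to an error, as the -Werror runs above do, can be sketched with just the stdlib warnings module (old_api is a hypothetical stand-in for a deprecated alias like assertEquals):

```python
import warnings

def old_api():
    # Stand-in for a deprecated alias such as assertEquals.
    warnings.warn("old_api() is deprecated; use new_api()",
                  DeprecationWarning, stacklevel=2)
    return 42

# Equivalent in spirit to running the interpreter with -Werror: the
# deprecated call now fails loudly instead of passing silently.
with warnings.catch_warnings():
    warnings.simplefilter("error", DeprecationWarning)
    try:
        old_api()
        escalated = False
    except DeprecationWarning:
        escalated = True

print(escalated)
```

Running a test suite under such a filter is what surfaces the typo-aliases before the release that finally removes them.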
On 11/12/2021 5:55 AM, Petr Viktorin wrote:
If deprecation now means "we've come up with a new way to do things, and you have two years to switch", can we have something else that means "there's now a better way to do things; the old way is a bit worse but continues to work as before"?
I think optparse is a good example of this (not that I love argparse).

For things that really are removed (and I won't get into the reasons for why something must be removed), I think a useful stance is "we won't remove anything that would make it hard to support a single code base across all supported Python versions". We'd need to define "hard"; maybe "no hasattr calls" would be part of it. Reliable tools to make the migration between versions would help, too.

I could live with this, although I write systems that support old versions of Python. We just moved to 3.7, for example.

Eric

PS: Someone else said that my estimate of tens of thousands of dollars to deal with deprecations is too high. If anything, I think it's too low. For all of the testing and signoffs we do for a single release, I've calculated that it costs us $5k just to release code to prod, even if there are no actual code changes. Could that be improved? Sure. Will it? Unlikely. Maybe I'm an outlier, but I doubt it.
On 11/14/2021 11:39 AM, Eric V. Smith wrote:
For things that really are removed (and I won't get in to the reasons for why something must be removed), I think a useful stance is "we won't remove anything that would make it hard to support a single code base across all supported python versions". We'd need to define "hard", maybe "no hasattr calls" would be part of it.
On second thought, I guess the existing policy already does this. Maybe we should make it more than 2 versions for deprecations? I've written libraries where I support 4 or 5 released versions. Although maybe I should just trim that back. Eric
On Sun, Nov 14, 2021 at 6:34 PM Eric V. Smith <eric@trueblade.com> wrote:
On second thought, I guess the existing policy already does this. Maybe we should make it more than 2 versions for deprecations? I've written libraries where I support 4 or 5 released versions. Although maybe I should just trim that back.
If I understood correctly, the problem is more about how long the new way has been available?

For example, if the new way is introduced in Python 3.6 and the old way is deprecated in Python 3.8, can we remove the old way in Python 3.10? It means that the new way is available in 4 versions (3.6, 3.7, 3.8, 3.9) before the old way is removed. It means that it's possible to have a single code base (no test on the Python version and no feature test) for Python 3.6 and newer.

More concrete examples:

* the "U" open() flag was deprecated since Python 3.0 and removed in Python 3.11: the flag was ignored since Python 3.0, so code without "U" works on Python 3.0 and newer
* collections.abc.MutableMapping exists since Python 3.3: collections.MutableMapping was deprecated in Python 3.3 and removed in Python 3.10. Using collections.abc.MutableMapping works on Python 3.3 and newer.
* unittest: the failIf() alias, deprecated since Python 2.7, was removed in Python 3.11: assertFalse() always worked.

For these 3 changes, it's possible to keep support back to Python 3.3 -- back to Python 3.0 if you add "try/except ImportError" for collections.abc.

IMO it would help to have a six-like module to write code for the latest Python version while keeping support for old Python versions. For example, have hacks to be able to use collections.abc.MutableMapping on Python 3.2 and older (an extreme example; who still cares about Python older than 3.5 in 2021?). I wrote something like that for the C API, providing *new* C API functions to *old* Python versions: https://github.com/pythoncapi/pythoncapi_compat

Victor

-- Night gathers, and now my watch begins. It shall not end until my death.
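The try/except ImportError trick mentioned for collections.abc is the classic pattern such a six-like module would bundle. A minimal sketch (the CaseInsensitiveDict subclass is purely illustrative, to show the imported ABC is usable either way):

```python
# Compatibility shim: prefer the modern location, fall back to the old
# alias on Pythons that predate collections.abc (3.2 and older).
try:
    from collections.abc import MutableMapping  # Python 3.3+
except ImportError:
    from collections import MutableMapping  # removed in Python 3.10

class CaseInsensitiveDict(MutableMapping):
    """Tiny demo subclass proving the imported ABC works."""
    def __init__(self):
        self._data = {}
    def __getitem__(self, key):
        return self._data[key.lower()]
    def __setitem__(self, key, value):
        self._data[key.lower()] = value
    def __delitem__(self, key):
        del self._data[key.lower()]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)

d = CaseInsensitiveDict()
d["Host"] = "example.org"
print(d["host"])
```

Code written this way runs unchanged on every version from 3.0 through 3.11, which is exactly the single-code-base property under discussion.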
On Mon, Nov 15, 2021 at 7:58 AM Victor Stinner <vstinner@python.org> wrote:
On Sun, Nov 14, 2021 at 6:34 PM Eric V. Smith <eric@trueblade.com> wrote:
On second thought, I guess the existing policy already does this. Maybe we should make it more than 2 versions for deprecations? I've written libraries where I support 4 or 5 released versions. Although maybe I should just trim that back.
If I understood correctly, the problem is more about how long the new way has been available?
I think the main problem is how much user code will be broken, versus the merit of the deletion.

For example, PEP 623 will remove some legacy C APIs in Python 3.12. https://www.python.org/dev/peps/pep-0623/ There are a few modules the PEP will break. But the PEP has significant merit (it reduces the memory usage of all string objects). So I want to remove them with the minimum deprecation period, and I am helping people to use the new APIs. (*)

* e.g. https://github.com/jamesturk/cjellyfish/pull/12

So I don't want to increase the minimum required deprecation period. But I agree that a longer deprecation period is good when keeping the deprecated stuff has nearly zero cost.

Regards,

-- Inada Naoki <songofacandy@gmail.com>
On Sun, Nov 14, 2021 at 3:01 PM Victor Stinner <vstinner@python.org> wrote:
On Sun, Nov 14, 2021 at 6:34 PM Eric V. Smith <eric@trueblade.com> wrote:
On second thought, I guess the existing policy already does this. Maybe we should make it more than 2 versions for deprecations? I've written libraries where I support 4 or 5 released versions. Although maybe I should just trim that back.
If I understood correctly, the problem is more about how long the new way has been available?
I think Eric was suggesting more along the lines of PEP 387 saying that deprecations should last as long as there is a supported version of Python that *lacks* the deprecation. So for something that's deprecated in 3.10, we wouldn't remove it until 3.10 is the oldest Python version we support. That would be October 2025, when Python 3.9 reaches EOL and Python 3.13 comes out, as at that point you could safely rely on the non-deprecated solution across all supported Python versions (or, if you want a full year of overlap, October 2026 and Python 3.14).

I think the key point with that approach is that if you wanted to maximize your support across supported versions, there would be no transition code except when the SC approves a shorter deprecation. A project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach, as all versions of Python at that point will support the new approach as well.

-Brett
For example, if the new way is introduced in Python 3.6, the old way is deprecated is Python 3.8, can we remove the old way in Python 3.10? It means that the new way is available in 4 versions (3.6, 3.7, 3.8, 3.9), before the old way is removed. It means that it's possible to have a single code base (no test on the Python version and no feature test) for Python 3.6 and newer.
More concrete examples:
* the "U" open() flag was deprecated since Python 3.0, removed in Python 3.11: the flag was ignored since Python 3.0, code without "U" works on Python 3.0 and newer
* collections.abc.MutableMapping exists since Python 3.3: collections.MutableMapping was deprecated in Python 3.3, removed in Python 3.10. Using collections.abc.MutableMapping works on Python 3.3 and newer.
* unittest: failIf() alias, deprecated since Python 2.7, was removed in Python 3.11: assertFalse() always worked.
For these 3 changes, it's possible to keep support up to Python 3.3. Up to Python 3.0 if you add "try/except ImportError" for collections.abc.
IMO it would help to have a six-like module to write code for the latest Python version while keeping support for old Python versions. For example, it could have hacks to make collections.abc.MutableMapping usable on Python 3.2 and older (an extreme example: who still cares about Python older than 3.5 in 2021?).
I wrote something like that for the C API, provide *new* C API functions to *old* Python versions: https://github.com/pythoncapi/pythoncapi_compat
Victor -- Night gathers, and now my watch begins. It shall not end until my death. _______________________________________________ Python-Dev mailing list -- python-dev@python.org To unsubscribe send an email to python-dev-leave@python.org https://mail.python.org/mailman3/lists/python-dev.python.org/ Message archived at https://mail.python.org/archives/list/python-dev@python.org/message/WD6NLGVI... Code of Conduct: http://python.org/psf/codeofconduct/
On 16. 11. 21 1:11, Brett Cannon wrote:
On Sun, Nov 14, 2021 at 3:01 PM Victor Stinner <vstinner@python.org <mailto:vstinner@python.org>> wrote:
On Sun, Nov 14, 2021 at 6:34 PM Eric V. Smith <eric@trueblade.com <mailto:eric@trueblade.com>> wrote: > On second thought, I guess the existing policy already does this. Maybe > we should make it more than 2 versions for deprecations? I've written > libraries where I support 4 or 5 released versions. Although maybe I > should just trim that back.
If I understood correctly, the problem is more for how long is the new way available?
I think Eric was suggesting more along the lines of PEP 387 saying that deprecations should last as long as there is a supported version of Python that *lacks* the deprecation. So for something that's deprecated in 3.10, we wouldn't remove it until 3.10 is the oldest Python version we support. That would be October 2025 when Python 3.9 reaches EOL and Python 3.13 comes out as at that point you could safely rely on the non-deprecated solution across all supported Python versions (or if you want a full year of overlap, October 2026 and Python 3.14).
I think the key point with that approach is if you wanted to maximize your support across supported versions, this would mean there wouldn't be transition code except when the SC approves of a shorter deprecation. So a project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach as all versions of Python at that point will support the new approach as well.
That sounds like a reasonable minimum for minor cleanups -- breakage that doesn't block improvements.

The current 'two years' minimum (and SC exceptions) is, IMO, appropriate for changes that do block improvements -- e.g. if removing old Unicode APIs allows reorganizing the internals to get a x% speedup, it should be removed after the 2-years of warnings (*if* the speedup is also made in that version -- otherwise the removal can be postponed). Even better if there's some alternate API for the affected use cases which works on all supported Python versions.

And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever. The only danger that API posed to users is that it might be removed in the future (and that will break their code), or that they'll get a warning or a linter nag.
On Tue, Nov 16, 2021 at 4:46 AM Petr Viktorin <encukou@gmail.com> wrote:
On 16. 11. 21 1:11, Brett Cannon wrote:
On Sun, Nov 14, 2021 at 3:01 PM Victor Stinner <vstinner@python.org <mailto:vstinner@python.org>> wrote:
On Sun, Nov 14, 2021 at 6:34 PM Eric V. Smith <eric@trueblade.com <mailto:eric@trueblade.com>> wrote: > On second thought, I guess the existing policy already does this. Maybe > we should make it more than 2 versions for deprecations? I've written > libraries where I support 4 or 5 released versions. Although maybe I > should just trim that back.
If I understood correctly, the problem is more for how long is the new way available?
I think Eric was suggesting more along the lines of PEP 387 saying that deprecations should last as long as there is a supported version of Python that *lacks* the deprecation. So for something that's deprecated in 3.10, we wouldn't remove it until 3.10 is the oldest Python version we support. That would be October 2025 when Python 3.9 reaches EOL and Python 3.13 comes out as at that point you could safely rely on the non-deprecated solution across all supported Python versions (or if you want a full year of overlap, October 2026 and Python 3.14).
I think the key point with that approach is if you wanted to maximize your support across supported versions, this would mean there wouldn't be transition code except when the SC approves of a shorter deprecation. So a project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach as all versions of Python at that point will support the new approach as well.
That sounds like a reasonable minimum for minor cleanups -- breakage that doesn't block improvements.
The current 'two years' minimum (and SC exceptions) is, IMO, appropriate for changes that do block improvements -- e.g. if removing old Unicode APIs allows reorganizing the internals to get a x% speedup, it should be removed after the 2-years of warnings (*if* the speedup is also made in that version -- otherwise the removal can be postponed). Even better if there's some alternate API for the affected use cases which works on all supported Python versions.
If enough people come forward supporting this idea then you could propose to the SC that PEP 387 get updated with this guidance. -Brett
And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever. The only danger that API posed to users is that it might be removed in the future (and that will break their code), or that they'll get a warning or a linter nag.
If deprecations ever become permanent, then there will have to be a cleaning of the stdlib first before we lock the team into this level of support contract.
On 16. 11. 21 20:13, Brett Cannon wrote:
On Tue, Nov 16, 2021 at 4:46 AM Petr Viktorin <encukou@gmail.com <mailto:encukou@gmail.com>> wrote:
On 16. 11. 21 1:11, Brett Cannon wrote: [nested quote snipped]
That sounds like a reasonable minimum for minor cleanups -- breakage that doesn't block improvements.
The current 'two years' minimum (and SC exceptions) is, IMO, appropriate for changes that do block improvements -- e.g. if removing old Unicode APIs allows reorganizing the internals to get a x% speedup, it should be removed after the 2-years of warnings (*if* the speedup is also made in that version -- otherwise the removal can be postponed). Even better if there's some alternate API for the affected use cases which works on all supported Python versions.
If enough people come forward supporting this idea then you could propose to the SC that PEP 387 get updated with this guidance.
Yes, this thread is the first step :)
And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever. The only danger that API posed to users is that it might be removed in the future (and that will break their code), or that they'll get a warning or a linter nag.
If deprecations ever become permanent, then there will have to be a cleaning of the stdlib first before we lock the team into this level of support contract.
I'm not looking for a contract, rather a best practice.

I think we should see Python's benign warts as nice gestures to the users: signs that we're letting them focus on issues that matter to them, rather than forcing them to join a quest for perfection. If a wart turns out to be a tumor, we should be able to remove it after the 2 years of warnings (or less with an exception). That's fine as a contract.

But I don't like "spring cleaning" -- removing everything the contract allows us to remove. Ensuring more perfect code should be a job for linters, not the interpreter/stdlib.
On 11/16/2021 7:43 AM, Petr Viktorin wrote:
On 16. 11. 21 1:11, Brett Cannon wrote:
I think the key point with that approach is if you wanted to maximize your support across supported versions, this would mean there wouldn't be transition code except when the SC approves of a shorter deprecation. So a project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach as all versions of Python at that point will support the new approach as well.
That sounds like a reasonable minimum for minor cleanups -- breakage that doesn't block improvements.
The current 'two years' minimum (and SC exceptions) is, IMO, appropriate for changes that do block improvements -- e.g. if removing old Unicode APIs allows reorganizing the internals to get a x% speedup, it should be removed after the 2-years of warnings (*if* the speedup is also made in that version -- otherwise the removal can be postponed). Even better if there's some alternate API for the affected use cases which works on all supported Python versions.
I agree that the yearly releases make 2 releases with warnings a bit short. Remove when a distributed replacement works in all supported releases seems pretty sensible.
And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever.
This part I do not agree with. In 3.10, there are 15 fail* and assert* aliases with a messy overlap pattern. https://docs.python.org/3/library/unittest.html#deprecated-aliases This is 15 unneeded names that appear in the doc, the index, vars(), dir(), TestCase.__dict__ listings, completion lists, etc. If not used, there is no need to keep them. If kept 'forever', they will be used, making unittest code harder to read.

There was a recent proposal to add permanent _ aliases for all stdlib camelCase names: assert_equal, assert_true, etc. After Guido gave a strong No, the proposal was reduced to doing so for logging and unittest only. If permanent aliases are blessed as normal, the proposal will recur and it would be harder to say no.

I expect that there would be disagreements as to what is trivial enough.
The only danger that API posed to users is that it might be removed in the future (and that will break their code), or that they'll get a warning or a linter nag.
Python is nearly 30 years old. I am really glad it is not burdened with 30 years of old names. I expect someone reading this may write some version of Python 50 years from now. I would not want them to have to read about names deprecated 60 years before such a time.

-- Terry Jan Reedy
I’ve seen a few people in this thread proposing a new tool to automatically update deprecations but I believe it already exists: pyupgrade. Looking over its fixes once again, I don’t think it covers any of the original three deprecations (maybe someone could open a PR?), but it does cover a lot including failUnlessEqual and assertEquals. Regards, Jeremiah
Maybe once a function is deprecated in Python, pyupgrade should be updated? I mean, more collaboration between Python core devs and pyupgrade development. https://github.com/asottile/pyupgrade

Victor

On Thu, Nov 18, 2021 at 8:39 AM Jeremiah Paige <ucodery@gmail.com> wrote:
I’ve seen a few people in this thread proposing a new tool to automatically update deprecations but I believe it already exists: pyupgrade. Looking over its fixes once again, I don’t think it covers any of the original three deprecations (maybe someone could open a PR?), but it does cover a lot including failUnlessEqual and assertEquals.
Regards, Jeremiah
-- Night gathers, and now my watch begins. It shall not end until my death.
On Wed, Nov 17, 2021 at 12:49 AM Terry Reedy <tjreedy@udel.edu> wrote:
On 11/16/2021 7:43 AM, Petr Viktorin wrote:
On 16. 11. 21 1:11, Brett Cannon wrote:
I think the key point with that approach is if you wanted to maximize your support across supported versions, this would mean there wouldn't be transition code except when the SC approves of a shorter deprecation. So a project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach as all versions of Python at that point will support the new approach as well.
That sounds like a reasonable minimum for minor cleanups -- breakage that doesn't block improvements.
The current 'two years' minimum (and SC exceptions) is, IMO, appropriate for changes that do block improvements -- e.g. if removing old Unicode APIs allows reorganizing the internals to get a x% speedup, it should be removed after the 2-years of warnings (*if* the speedup is also made in that version -- otherwise the removal can be postponed). Even better if there's some alternate API for the affected use cases which works on all supported Python versions.
I agree that the yearly releases make 2 releases with warnings a bit short. Remove when a distributed replacement works in all supported releases seems pretty sensible.
And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever.
This part I do not agree with. In 3.10, there are 15 fail* and assert* aliases with a messy overlap pattern. https://docs.python.org/3/library/unittest.html#deprecated-aliases This is 15 unneeded names that appear in the doc, the index, vars(), dir(), TestCase.__dict__ listings, completion lists, etc.
Well, dir(), vars(), __dict__ and similar are already unpleasant -- ever since they started listing dunder methods, quite a long time ago. But we could improve completion, docs and other cases that can filter the list. How about adding a __deprecated__ attribute with a list of names that tab completion should skip?
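Roughly what that could look like -- the attribute name and the completer are both hypothetical here, since this is only a proposal:

```python
# Sketch of the proposed __deprecated__ attribute: a completer that skips
# names a class marks as deprecated. Everything here is hypothetical.
class Case:
    """Stand-in for a TestCase-like class carrying deprecated aliases."""
    __deprecated__ = ("failUnless", "failIf")

    def assertTrue(self, expr): ...
    def assertFalse(self, expr): ...
    failUnless = assertTrue   # old alias, still works
    failIf = assertFalse      # old alias, still works


def complete(obj):
    """Completion candidates for obj, hiding deprecated and private names."""
    hidden = set(getattr(type(obj), "__deprecated__", ()))
    return sorted(
        name for name in dir(obj)
        if not name.startswith("_") and name not in hidden
    )
```

The aliases still resolve normally (Case().failUnless keeps working); they just stop showing up in completion lists and filtered docs.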
If not used, there is no need to keep them. If kept 'forever', they will be used, making unittest code harder to read.
There was a recent proposal to add permanent _ aliases for all stdlib camelCase names: assert_equal, assert_true, etc. After Guido gave a strong No, the proposal was reduced to doing so for logging and unittest only. If permanent aliases are blessed as normal, the proposal will recur and it would be harder to say no.
I expect that there would be disagreements as to what is trivial enough.
The only danger that API posed to users is that it might be removed in the future (and that will break their code), or that they'll get a warning or a linter nag.
Python is nearly 30 years old. I am really glad it is not burdened with 30 years of old names. I expect someone reading this may write some version of Python 50 years from now. I would not want they to have to read about names deprecated 60 years before such a time.
If the dedicated section is too distracting, they could be moved to a subpage, reachable mainly by people searching for a particular name. And the instructions on how to modernize code could be right next to them.
This is the point where the pricey support contract comes in. It would give options to those who need it and provide some revenue. Otherwise, the "there's no such thing as a free lunch" factor takes precedence.

-Mike
On 19. 11. 21 22:15, Mike Miller wrote:
This is the point where the pricey support contract comes in. Would give options to those who need it and provide some revenue.
Not really; a pricey support contract would need to freeze things for even longer -- *and* make it an actual contract :)

Changing working code just to make it continue to work with a newer Python version is boring. Companies might pay money to not have to do that. Or they might pay their employees to do the work. Either way it's money that could be spent on better things. (And hopefully, in some cases those things will be investing into Python and its ecosystem.)

But it's similar with volunteer authors and maintainers of various tools and libraries, who "pay" with their time that could be spent building something useful (or something fun). I believe that each time we force them to do pointless updates in their code, we sap some joy and enthusiasm from the ecosystem.

Of course, we need to balance that with the joy and enthusiasm (and yes, corporate money) that core devs pour into improving Python itself. But it's the users that we're making Python for.
Otherwise, the "there's no such thing as a free lunch," factor takes precedence.
That cuts both ways: deleting old ugly code is enjoyable, but it isn't free ;)

Full disclosure: I do work for Red Hat, which makes money on pricey support contracts. But Victor Stinner also works here. This thread was motivated by watching rebuilds of Fedora packages with Python 3.11 (https://bugzilla.redhat.com/show_bug.cgi?id=2016048), and asking myself if all the work we're expecting people to do is worth it.
On 11/18/2021 9:52 AM, Petr Viktorin wrote:
On Wed, Nov 17, 2021 at 12:49 AM Terry Reedy <tjreedy@udel.edu> wrote:
And then there are truly trivial removals like the "failUnless" or "SafeConfigParser" aliases. I don't see a good reason to remove those -- they could stay deprecated forever.
This part I do not agree with. In 3.10, there are 15 fail* and assert* aliases with a messy overlap pattern. https://docs.python.org/3/library/unittest.html#deprecated-aliases This is 15 unneeded names that appear in the doc, the index, vars(), dir(), TestCase.__dict__ listings, completion lists, etc.
Well, dir(), vars(), __dict__ and similar are already unpleasant -- ever since they started listing dunder methods, quite a long time ago. But we could improve completion, docs and other cases that can filter the list. How about adding a __deprecated__ attribute with a list of names that tab completion should skip?
No. There would be less, not more, reason to censor such names: names that remain forever so that they can be used forever need to be listed forever.

My background point is that there is a real cost to keeping obsolete names forever. I appear to see that cost as higher than you do.

-- Terry Jan Reedy
participants (12)
- Brett Cannon
- Christopher Barker
- Eric V. Smith
- Inada Naoki
- Jeremiah Paige
- Mike Miller
- Petr Viktorin
- Serhiy Storchaka
- Stephen J. Turnbull
- Terry Reedy
- Victor Stinner
- zhouwenbonwpu@mail.nwpu.edu.cn