Proposal: go back to enabling DeprecationWarning by default

On the 12-weeks-to-3.7-feature-freeze thread, Jose Bueno & I both mistakenly thought the async/await deprecation warnings were missing from 3.6.
They weren't missing, we'd just both forgotten those warnings were off by default (7 years after the change to the default settings in 2.7 & 3.2).
So my proposal is simple (and not really new): let's revert back to the way things were in 2.6 and earlier, with DeprecationWarning being visible by default, and app devs having to silence it explicitly during application startup (before they start importing third party modules) if they don't want their users seeing it when running on the latest Python version (e.g. this would be suitable for open source apps that get integrated into Linux distros and use the system Python there).
This will also restore the previously clear semantic and behavioural difference between PendingDeprecationWarning (hidden by default) and DeprecationWarning (visible by default).
As part of this though, I'd suggest amending the documentation for DeprecationWarning [1] to specifically cover how to turn it off programmatically (`warnings.simplefilter("ignore", DeprecationWarning)`), at the command line (`python -W ignore::DeprecationWarning ...`), and via the environment (`PYTHONWARNINGS=ignore::DeprecationWarning`).
(Structurally, I'd probably put that at the end of the warnings listing as a short introduction to warnings management, with links out to the relevant sections of the documentation, and just use DeprecationWarning as the specific example)
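To make that concrete, the documentation example could look something like the following minimal application-startup sketch (the module names here are just placeholders):

    # myapp/__main__.py -- silence DeprecationWarning for end users before importing
    # any third party modules, so deprecated APIs used further down the stack don't
    # write to stderr.
    import warnings
    warnings.simplefilter("ignore", DeprecationWarning)

    import thirdparty_lib  # placeholder for the application's real dependencies

    # The same effect from outside the code:
    #   python -W ignore::DeprecationWarning -m myapp
    #   PYTHONWARNINGS=ignore::DeprecationWarning python -m myapp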
Cheers, Nick.
[1] https://docs.python.org/3/library/exceptions.html#DeprecationWarning

Big +1 from me. The whole point of DeprecationWarnings is to be visible so that they are resolved faster. The current behaviour allows them to go unnoticed for the majority of Python users.
Yury

On Sun, Nov 05, 2017 at 09:20:12PM -0500, Yury Selivanov yselivanov.ml@gmail.com wrote:
Big +1 from me. The whole point of DeprecationWarnings is to be visible
The whole point of DeprecationWarnings is to be visible to developers, while in reality they will be visible to users -- and what would the users do with the warnings?
Oleg.

On 5 Nov, 2017, at 6:29 PM, Oleg Broytman phd@phdru.name wrote:
On Sun, Nov 05, 2017 at 09:20:12PM -0500, Yury Selivanov yselivanov.ml@gmail.com wrote:
Big +1 from me. The whole point of DeprecationWarnings is to be visible
The whole point of DeprecationWarnings is to be visible to developers, while in reality they will be visible to users -- and what would the users do with the warnings?
Complain to the authors and make them fix the issue.
https://github.com/requests/requests/issues/3954
https://github.com/scikit-learn-contrib/sklearn-pandas/issues/76
https://github.com/pandas-dev/pandas/issues/5824
https://github.com/pypa/setuptools/issues/472
+1 to re-enable this from me, too. At Facebook we are running unit tests and development builds with warnings enabled. Just be aware that for some applications that will spew a lot. Python 3.6's warnings on invalid escapes are a big source of this.
- Ł

On 6 November 2017 at 12:29, Oleg Broytman phd@phdru.name wrote:
On Sun, Nov 05, 2017 at 09:20:12PM -0500, Yury Selivanov yselivanov.ml@gmail.com wrote:
Big +1 from me. The whole point of DeprecationWarnings is to be visible
The whole point of DeprecationWarnings is to be visible to developers, while in reality they will be visible to users -- and what would the users do with the warnings?
Hence the proposed documentation change: the responsibility for silencing these warnings (for both their own code and for their dependencies) should rest with *application* developers, with our responsibility as providers of the underlying platform then being to make it completely obvious how to actually do that. (It's currently really unclear, with the relevant info scattered across the list of builtin warnings, different parts of the warnings module documentation, and the CPython command line usage documentation, with no explicit examples of exactly what you need to write anywhere.)
To put that another way:
- if we ever write "import foo" ourselves, then we're a Python developer, and it's our responsibility to work out how to manage DeprecationWarning when it gets raised by either our own code, or the libraries and frameworks that we use
- if we ship Python code in a "supply your own runtime" model, such that we have actual non-developer users and operators (rather than solely fellow Python developers) to worry about, then it's still our responsibility to decide whether or not we want to let deprecation warnings appear on stderr (based on what we think is most appropriate for our particular user base)
- if we want to categorically ensure our users don't get unexpected deprecation warnings on stderr, then we should be bundling a Python runtime as well (e.g. via PyInstaller or a Linux container image, or by operating a network service), rather than asking users and operators to handle the runtime+application integration step
We've been running the current experiment for 7 years, and the main observable outcome has been folks getting surprised by breaking changes in CPython releases, especially folks that primarily use Python interactively (e.g. for data analysis), or as a scripting engine (e.g. for systems administration).
That means the status quo is defeating the main purpose of DeprecationWarnings (providing hard-to-miss advance notice of upcoming breaking changes in the language definition and standard library), for the sake of letting app developers duck responsibility for managing what their own software writes to stderr.
Cheers, Nick.

Nick Coghlan writes:
Hence the proposed documentation change: the responsibility for silencing these warnings (for both their own code and for their dependencies) should rest with *application* developers,
How do you propose to handle users with legacy apps that they can't upgrade, that their organization won't upgrade, or that they simply don't want to upgrade? As I understand it, their only option would be something global, which they may not want to do.
We've been running the current experiment for 7 years, and the main observable outcome
Well, yeah. You can't observe something that doesn't happen, period.
Bottom line: this is NOT a simple proposal, because it inherently deals in counterfactual reasoning.
Steve

On 6 November 2017 at 16:26, Stephen J. Turnbull turnbull.stephen.fw@u.tsukuba.ac.jp wrote:
Nick Coghlan writes:
Hence the proposed documentation change: the responsibility for silencing these warnings (for both their own code and for their dependencies) should rest with *application* developers,
How do you propose to handle users with legacy apps that they can't upgrade, that their organization won't upgrade, or that they simply don't want to upgrade? As I understand it, their only option would be something global, which they may not want to do.
Put "PYTHONWARNINGS=ignore::DeprecationWarning" before whatever command is giving them the warnings.
Even on Windows, you can put that in a batch file with the actual command you want to run and silence the warnings that way.
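If even a batch file is awkward, a tiny Python wrapper does the same job (a hedged sketch; "legacy-app" is a placeholder for whatever console script is emitting the warnings):

    # silence_wrapper.py -- run an unmodifiable legacy tool with deprecation warnings hidden
    import os
    import subprocess
    import sys

    env = dict(os.environ, PYTHONWARNINGS="ignore::DeprecationWarning")
    # The child interpreter reads PYTHONWARNINGS at startup, so the legacy tool
    # itself doesn't need to change at all.
    sys.exit(subprocess.call(["legacy-app"] + sys.argv[1:], env=env))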
This is the same philosophy we applied in PEP 493 to provide a smoother transition to HTTPS verification (via selective application of PYTHONHTTPSVERIFY=0)
We've been running the current experiment for 7 years, and the main observable outcome
Well, yeah. You can't observe something that doesn't happen, period.
Bottom line: this is NOT a simple proposal, because it inherently deals in counterfactual reasoning.
No, it doesn't, as we've tried both approaches now: warning by default for the ~20 years leading up to Python 2.7, and not warning by default for the ~7 years since.
Prior to 2.7, the complaints we received mainly related to app developers wanting to pass responsibility for *their* UX problems to us, and ditto for poorly maintained institutional infrastructure.
So we've tried both ways now, and the status quo has led to *us* failing to provide adequate advance notice of breaking changes to *our* direct users. That's a far more important consideration for CPython's default behaviour than the secondary impact on users of applications that happen to be written in Python that are paying sufficient attention to stderr to be scared by DeprecationWarnings, but not paying sufficient attention to learn what those DeprecationWarnings actually mean.
Cheers, Nick.

I still find this unfriendly to users of Python scripts and small apps who are not the developers of those scripts. (Large apps tend to spit out so much logging it doesn't really matter.)
Isn't there a better heuristic we can come up with so that the warnings tend to be on for developers but off for end users?

06.11.17 09:09, Guido van Rossum wrote:
I still find this unfriendly to users of Python scripts and small apps who are not the developers of those scripts. (Large apps tend to spit out so much logging it doesn't really matter.)
Isn't there a better heuristic we can come up with so that the warnings tend to be on for developers but off for end users?
There was a proposition to make deprecation warnings visible by default in debug builds and in the interactive interpreter.

2017-11-06 8:47 GMT+01:00 Serhiy Storchaka storchaka@gmail.com:
06.11.17 09:09, Guido van Rossum wrote:
I still find this unfriendly to users of Python scripts and small apps who are not the developers of those scripts. (Large apps tend to spit out so much logging it doesn't really matter.)
Isn't there a better heuristic we can come up with so that the warnings tend to be on for developers but off for end users?
There was a proposition to make deprecation warnings visible by default in debug builds and in the interactive interpreter.
The problem is that outside CPython core developers, I expect that almost nobody runs a Python compiled in debug mode. We should provide debug features in the release build. For example, in Python 3.6, I added debug hooks on memory allocation in release mode using PYTHONMALLOC=debug. These hooks were already enabled by default in debug mode.
Moreover, applications are not developed nor tested in the REPL.
Last year, I proposed a global "developer mode". The idea is to provide the same experience as a Python debug build, but on a Python release build:
    python3 -X dev script.py
or
    PYTHONDEV=1 python3 script.py
behaves as
    PYTHONMALLOC=debug python3 -Wd -b -X faulthandler script.py
* Show DeprecationWarning and ResourceWarning warnings: python -Wd
* Show BytesWarning warning: python -b
* Enable Python assertions (assert) and set __debug__ to True: remove (or just ignore) -O or -OO command line arguments
* faulthandler to get a Python traceback on segfault and fatal errors: python -X faulthandler
* Debug hooks on Python memory allocators: PYTHONMALLOC=debug
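Until something like -X dev exists, a rough approximation is a small launcher that re-execs the target script with the equivalent flags and environment (a hedged sketch; "devrun.py" is just an illustrative name):

    # devrun.py -- approximate the proposed developer mode on a current release build
    # Usage: python3 devrun.py script.py [args...]
    import os
    import sys

    env = dict(os.environ, PYTHONMALLOC="debug")   # debug hooks on the memory allocators (3.6+)
    cmd = [sys.executable, "-Wd", "-b", "-X", "faulthandler"] + sys.argv[1:]
    os.execve(sys.executable, cmd, env)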
If you don't follow the CPython development, it's hard to be aware of "new" options like -X faulthandler (Python 3.3) or PYTHONMALLOC=debug (Python 3.6). And it's easy to forget an option like -b.
Maybe we even need a -X dev=strict which would be stricter:
* use -Werror instead of -Wd: raise an exception when a warning is emitted
* use -bb instead of -b: get BytesWarning exceptions
* Replace "inconsistent use of tabs and spaces in indentation" warning with an error in the Python parser
* etc.
https://mail.python.org/pipermail/python-ideas/2016-March/039314.html
Victor

On Mon, 6 Nov 2017 at 00:30 Victor Stinner victor.stinner@gmail.com wrote:
Maybe we even need a -X dev=strict which would be stricter:
I like this idea and would argue that `-X dev` should encompass what's proposed for `-X dev=strict` and just have it be strict to begin with. Then we can add an equivalent environment variable and push people who use CI to just blindly set the environment variable in their tests.

On 6 November 2017 at 17:09, Guido van Rossum guido@python.org wrote:
I still find this unfriendly to users of Python scripts and small apps who are not the developers of those scripts.
At a distro level, these warnings being off by default has actually turned out to be a problem, as it's precisely those users of Python scripts and small apps running in the system Python that don't find out they're at risk of a future distro upgrade breaking their tools until they hit the release where they actually break. They then go to the developers of either the distro or those tools saying "Everything is broken, now what do I do?", rather than less urgently asking "Hey, what's up with this weird warning I'm getting now?".
So compared to that current experience of "My distro upgrade broke my stuff", getting back to the occasional "After my distro upgrade, a bunch of my stuff is now emitting messages I don't understand on stderr" sounds like a positive improvement to me :)
Isn't there a better heuristic we can come up with so that the warnings tend to be on for developers but off for end users?
That depends on where you're drawing the line between "developer" and "end user". Someone working on a new version of Django, for example, would probably qualify as an end user from our perspective, while they'd be a framework developer from the point of view of someone building a website.
If we're talking about "Doesn't even know what Python is" end users, then the most reliable way for devs to ensure they never see a deprecation warning is to bundle Python with the application, instead of expecting end users to handle the task of integrating the two together.
If we're talking about library and frameworks developers, then the only reliable way to distinguish between deprecation warnings that are encountered because a dependency has a future compatibility problem and those that are internal to the application is to use module filtering:
warnings.filterwarnings("ignore", category=DeprecationWarning) warnings.filterwarnings("once", category=DeprecationWarning, module="myproject.*") warnings.filterwarnings("once", category=DeprecationWarning, module="__main__.*")
This model allows folks to more selectively opt-in to getting warnings from their direct dependencies, while ignoring warnings from further down their dependency stack.
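As a quick self-contained check of how those three filters interact, warnings.warn_explicit can be used to fake the reporting module ("somedependency" and "myproject" are placeholder names):

    import warnings

    warnings.filterwarnings("ignore", category=DeprecationWarning)
    warnings.filterwarnings("once", category=DeprecationWarning, module="myproject.*")
    warnings.filterwarnings("once", category=DeprecationWarning, module="__main__.*")

    # Silenced: only the blanket "ignore" filter matches this module name.
    warnings.warn_explicit("old dependency API", DeprecationWarning,
                           filename="somedependency/core.py", lineno=1,
                           module="somedependency.core")

    # Shown once: the "myproject.*" filter sits ahead of the blanket "ignore",
    # because filterwarnings() prepends each new filter.
    warnings.warn_explicit("old myproject API", DeprecationWarning,
                           filename="myproject/api.py", lineno=1,
                           module="myproject.api")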
As things currently stand though, there's little inherent incentive for new Python users to start learning how to do any of this - instead, the default behaviour for the last several years has been "Breaking API changes just happen sometimes without any prior warning", and you have to go find out how to say "Please tell me when breaking changes are coming" (and remember to opt in to that every time you start Python) before you get any prior notification.
I do like Barry's suggestion of introducing a gentler API specifically for filtering deprecations warnings, though, as I find the warnings filtering system to be a bit like logging, in that it's sufficiently powerful and flexible that getting started with it can be genuinely confusing and intimidating.
In relation to that, the "warn" module README at https://pypi.python.org/pypi/warn/ provides some additional examples of how it can currently be difficult to craft a good definition of which deprecation warnings someone actually wants to see.
Cheers, Nick.
P.S. That README also points out another problem with the status quo: DeprecationWarning still gets silenced by default when encountered in third party modules as well, meaning that also shows up as an abrupt compatibility break for anyone that didn't already know they needed to opt-in to get deprecation warnings.

On 6 November 2017 at 05:09, Guido van Rossum guido@python.org wrote:
I still find this unfriendly to users of Python scripts and small apps who are not the developers of those scripts. (Large apps tend to spit out so much logging it doesn't really matter.)
Isn't there a better heuristic we can come up with so that the warnings tend to be on for developers but off for end users?
So, I don't know who is the "Jose Bueno" mentioned in the first message - :-) I just conveyed a concern from the Brython developers, as I follow the project - and I'd rather have my terminal clean when using programs.
Making something happen when one does "python setup.py develop" to flip the switches so that the deprecation warnings get turned on might be a nice idea, and would overall help people, though.
joao bueno -><-

On 6 November 2017 at 03:38, Nick Coghlan ncoghlan@gmail.com wrote:
- if we ever write "import foo" ourselves, then we're a Python
developer, and it's our responsibility to work out how to manage DeprecationWarning when it gets raised by either our own code, or the libraries and frameworks that we use
As someone who was bitten by this when deprecation warnings were displayed by default, what's the process for suppressing deprecation warnings in modules that I import (and hence have no control over) *without* also suppressing them for my code (where I do want to fix them, so that my users don't have a problem)?
That's the complicated bit that needs to be in the docs - more so than a simple pointer to how to suppress the warning altogether.
On 6 November 2017 at 06:38, Nick Coghlan ncoghlan@gmail.com wrote:
Put "PYTHONWARNINGS=ignore::DeprecationWarning" before whatever command is giving them the warnings.
Even on Windows, you can put that in a batch file with the actual command you want to run and silence the warnings that way.
Batch files do not behave the same in Windows as standard executables. Having to wrap a "normal application" (for example, a script wrapper installed via "pip install package") in a bat file is (a) messy for inexperienced users, and (b) likely to cause weird errors (for example nesting bat files is broken, so you can't use a "wrapped" command transparently in another bat file without silent errors).
Paul

On 6 November 2017 at 20:21, Paul Moore p.f.moore@gmail.com wrote:
On 6 November 2017 at 03:38, Nick Coghlan ncoghlan@gmail.com wrote:
- if we ever write "import foo" ourselves, then we're a Python
developer, and it's our responsibility to work out how to manage DeprecationWarning when it gets raised by either our own code, or the libraries and frameworks that we use
As someone who was bitten by this when deprecation warnings were displayed by default, what's the process for suppressing deprecation warnings in modules that I import (and hence have no control over) *without* also suppressing them for my code (where I do want to fix them, so that my users don't have a problem)?
That's the complicated bit that needs to be in the docs - more so than a simple pointer to how to suppress the warning altogether.
For "top level" deprecation warnings in the libraries you use (i.e. those where the specific API you're calling from your code is either the one that calls warnings.warn, or else it adjusts the stack level argument so it acts that way), the warnings configuration looks like:
warnings.filterwarnings("ignore", category=DeprecationWarning) warnings.filterwarnings("once", category=DeprecationWarning, module="myproject.*") warnings.filterwarnings("once", category=DeprecationWarning, module="__main__")
So that could stand to be made cleaner in a few specific ways:
1. Provide a dedicated API for configuring the deprecation warnings filtering
2. When given a module name, also enable warnings for submodules of that module
Given those design guidelines, an improvement may look like:
warnings.ignoredeprecations(except_for=["myproject","__main__"])
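A hedged sketch of how that helper could be layered on the existing machinery (neither the function name nor the keyword exist today, they are taken from the line above):

    import re
    import warnings

    def ignoredeprecations(except_for=()):
        # Blanket ignore first; filterwarnings() prepends, so the per-module
        # re-enabling filters added below take precedence over it.
        warnings.filterwarnings("ignore", category=DeprecationWarning)
        for name in except_for:
            # Cover the named module and all of its submodules (design guideline 2).
            warnings.filterwarnings("once", category=DeprecationWarning,
                                    module=re.escape(name) + r"(\..*)?$")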
A middle ground between the status quo and full re-enablement of deprecation warnings would also be to add the following to the default filter set used when neither -W nor PYTHONWARNINGS is set:
warnings.filterwarnings("once", category=DeprecationWarning, module="__main__")
That way, warnings would be emitted by default for the REPL and top-level scripts, but getting them for imported libraries would continue to be opt-in.
Cheers, Nick.

On Sun, Nov 5, 2017 at 9:38 PM, Nick Coghlan ncoghlan@gmail.com wrote:
We've been running the current experiment for 7 years, and the main observable outcome has been folks getting surprised by breaking changes in CPython releases, especially folks that primarily use Python interactively (e.g. for data analysis), or as a scripting engine (e.g. for systems administration).
It's also caused lots of projects to switch to using their own ad hoc warning types for deprecations, e.g. off the top of my head:
https://github.com/matplotlib/matplotlib/blob/6c51037864f9a4ca816b68ede78207... https://github.com/numpy/numpy/blob/d75b86c0c49f7eb3ec60564c2e23b3ff237082a2... https://github.com/python-trio/trio/blob/f50aa8e00c29c7f2953b7bad38afc620772...
So in some ways the change has actually made it *harder* for end-user applications/scripts to hide all deprecation warnings, because for each package you use you have to somehow figure out which idiosyncratic type it uses, and filter them each separately.
(In any changes though please do keep in mind that Python itself is not the only one issuing deprecation warnings. I'm thinking in particular of the filter-based-on-Python-version idea. Maybe you could have subclasses like Py35DeprecationWarning and filter on those?)
-n

On 6 November 2017 at 13:05, Nick Coghlan ncoghlan@gmail.com wrote:
As part of this though, I'd suggest amending the documentation for DeprecationWarning [1] to specifically cover how to turn it off programmatically (`warnings.simplefilter("ignore", DeprecationWarning)`), at the command line (`python -W ignore::DeprecationWarning ...`), and via the environment (`PYTHONWARNINGS=ignore::DeprecationWarning`).
I'm wondering if it would be sensible to recommend only disabling the warnings if running with a known version of Python e.g.
if sys.version_info < (3, 8):
    with warnings.catch_warnings():
        warnings.simplefilter('ignore', DeprecationWarning)
        import module
The idea here is to prompt the developer to refactor to not use the deprecated functionality early enough that users aren't impacted.
Tim Delaney

On Nov 5, 2017, at 18:05, Nick Coghlan ncoghlan@gmail.com wrote:
So my proposal is simple (and not really new): let's revert back to the way things were in 2.6 and earlier, with DeprecationWarning being visible by default
+1
As part of this though, I'd suggest amending the documentation for DeprecationWarning [1] to specifically cover how to turn it off programmatically (`warnings.simplefilter("ignore", DeprecationWarning)`), at the command line (`python -W ignore::DeprecationWarning ...`), and via the environment (`PYTHONWARNINGS=ignore::DeprecationWarning`).
+1
I’d also consider adding convenient shortcuts for each of these. I think DeprecationWarning is special enough to warrant it. Possibly:
warnings.silence_deprecations()
python -X silence-deprecations
PYTHONSILENCEDEPRECATIONS=x
Cheers, -Barry

On 6 November 2017 at 14:14, Barry Warsaw barry@python.org wrote:
I’d also consider adding convenient shortcuts for each of these. I think DeprecationWarning is special enough to warrant it. Possibly:
warnings.silence_deprecations()
python -X silence-deprecations
PYTHONSILENCEDEPRECATIONS=x
It could be interesting to combine this with Tim's suggestion of putting an upper version limit on the silencing, so the above may look like:
warnings.ignore_deprecations((3, 7))
python -X ignore-deprecations=3.7
PYTHONIGNOREDEPRECATIONS=3.7
(Using "ignore" to match the existing action name so the intent is a bit more self-explanatory)
The ignore_deprecations function would then look like:
def ignore_deprecations(max_version):
    """Ignore DeprecationWarning on Python versions up to & including the given one

    *max_version* is an iterable suitable for ordered comparison against sys.version_info
    """
    if sys.version_info <= max_version:
        warnings.simplefilter('ignore', DeprecationWarning)
So the conventional usage would be that if you were regularly updating your application, by the time Python 3.8 actually existed, the check would have been bumped to say 3.8. But if you stopped updating (or the publisher stopped releasing updates), you'd eventually start getting deprecation warnings again as the underlying Python version changed.
I could definitely see that working well for the community Linux distro use case, where there isn't necessarily anyone closely monitoring old packages to ensure they're actively tracking upstream releases (and attempting to institute more ruthless pruning processes can lead to potentially undesirable community dynamics)
Cheers, Nick.

On Nov 5, 2017, at 20:47, Nick Coghlan ncoghlan@gmail.com wrote:
warnings.silence_deprecations()
python -X silence-deprecations
PYTHONSILENCEDEPRECATIONS=x
It could be interesting to combine this with Tim's suggestion of putting an upper version limit on the silencing, so the above may look like:
warnings.ignore_deprecations((3, 7))
python -X ignore-deprecations=3.7
PYTHONIGNOREDEPRECATIONS=3.7
That could be cool as long as we also support wildcards, e.g. defaults along the lines of my suggestions above to ignore everything.
-Barry

On 11/6/2017 1:12 PM, Barry Warsaw wrote:
On Nov 5, 2017, at 20:47, Nick Coghlan ncoghlan@gmail.com wrote:
warnings.silence_deprecations()
python -X silence-deprecations
PYTHONSILENCEDEPRECATIONS=x
It could be interesting to combine this with Tim's suggestion of putting an upper version limit on the silencing, so the above may look like:
warnings.ignore_deprecations((3, 7))
python -X ignore-deprecations=3.7
PYTHONIGNOREDEPRECATIONS=3.7
That could be cool as long as we also support wildcards, e.g. defaults along the lines of my suggestions above to ignore everything.
I'd like to see a command line or environment variable that says: "turn on deprecation warnings (and/or pending deprecation warnings), but do not show warnings for this list of modules (possibly regex's)".
Like: PYTHONDEPRECATIONWARNINGSEXCEPTFOR=PIL,requests.*
Then I'd just turn it on for all modules (empty string?), and when I got something that was flooding me with output I'd add it to the list.
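As a hedged sketch, a (purely hypothetical) variable like that could be honoured with a few lines in something like sitecustomize.py:

    import os
    import warnings

    # Turn deprecation warnings on everywhere...
    warnings.simplefilter("default", DeprecationWarning)

    # ...except for modules matching the listed regexes (e.g. "PIL,requests.*");
    # filterwarnings() prepends, so these "ignore" entries win over the line above.
    _skip = os.environ.get("PYTHONDEPRECATIONWARNINGSEXCEPTFOR", "")
    for pattern in filter(None, (p.strip() for p in _skip.split(","))):
        warnings.filterwarnings("ignore", category=DeprecationWarning, module=pattern)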
Eric.

06.11.17 04:05, Nick Coghlan wrote:
On the 12-weeks-to-3.7-feature-freeze thread, Jose Bueno & I both mistakenly though the async/await deprecation warnings were missing from 3.6.
They weren't missing, we'd just both forgotten those warnings were off by default (7 years after the change to the default settings in 2.7 & 3.2).
Following issues on GitHub related to new Python releases, I have found that many projects try to fix deprecation warnings, but there are projects that are surprised by the ending of deprecation periods and the removal of features.
So my proposal is simple (and not really new): let's revert back to the way things were in 2.6 and earlier, with DeprecationWarning being visible by default, and app devs having to silence it explicitly during application startup (before they start importing third party modules) if they don't want their users seeing it when running on the latest Python version (e.g. this would be suitable for open source apps that get integrated into Linux distros and use the system Python there).
This will also restore the previously clear semantic and behavioural different between PendingDeprecationWarning (hidden by default) and DeprecationWarning (visible by default).
There was a proposition to make DeprecationWarning visible by default in debug builds and in the interactive interpreter.
What if we first implement this idea in 3.7 and make DeprecationWarning visible by default in production scripts only in 3.8? This would cause less breakage.

On Nov 5, 2017, at 23:08, Serhiy Storchaka storchaka@gmail.com wrote:
Following issues on GitHub related to new Python releases I have found that many projects try to fix deprecation warning, but there are projects that are surprised by ending of deprecation periods and removing features.
Like others here, I’ve also been bitten by silently ignored DeprecationWarnings. We had some admittedly dodgy code in a corner of Mailman that we could have fixed earlier if we’d seen the warnings. But we never did, so the first indication of a problem was when code actually *broke* with the new version of Python. The problem was compounded because it wasn’t us that saw it first, it was a user, so now they had a broken installation and we had to issue a hot fix. If we’d seen the DeprecationWarnings in the previous version of Python, we would have fixed them and all would have been good.
It’s true that those warnings can cause problems though. There are certain build/CI environments, e.g. in Ubuntu, that fail when they see unexpected stderr output. So when we started seeing new deprecations, we got build(-ish) time failures. I still think that’s a minor price to pay for projects that *want* to do the right thing but don’t because those warnings are essentially hidden until they are breakages. We have tools to help, so let’s use them.
Staying current and code clean is hard and never ending. Welcome to software development!
-Barry

On Mon, 6 Nov 2017 12:05:07 +1000 Nick Coghlan ncoghlan@gmail.com wrote:
So my proposal is simple (and not really new): let's revert back to the way things were in 2.6 and earlier, with DeprecationWarning being visible by default, and app devs having to silence it explicitly during application startup (before they start importing third party modules) if they don't want their users seeing it when running on the latest Python version (e.g. this would be suitable for open source apps that get integrated into Linux distros and use the system Python there).
This will also restore the previously clear semantic and behavioural different between PendingDeprecationWarning (hidden by default) and DeprecationWarning (visible by default).
I'm on the fence on this.
I was part of the minority who opposed the original decision. So I really appreciate your sentiment. Since then, I had to deal with a lot of very diverse third-party libraries, and I learned that:
- most third-party libraries don't ever emit PendingDeprecationWarning; they only emit DeprecationWarning. So all their warnings would now be visible by default. (1)
- release cycles are much shorter on third-party libraries, so it's easier not to notice that one of your dependencies has started changing some of its APIs - maybe you'll notice in 3 months. Also, perhaps you need a compatibility fallback anyway instead of unconditionally switching to the new version of the API, which adds to the maintenance cost.
- depending on not-well-maintained third-party libraries is a fact of life; these libraries may induce a lot of DeprecationWarnings on their dependencies, and still work fine until some maintainer comes out from the grave (or steps briefly into it before returning to their normal non-programming life) to apply a proper fix and make a release.
The one part where I think your proposal is good (apart from making things a bit simpler for developers) is that I also noticed some authors of third-party libraries don't notice until late that their code emits DeprecationWarnings in dependencies. By knowing earlier (and having their users affected) they may be enticed to fix those issues earlier. But that's only true for third-party libraries with an active enough maintainer, and a tight enough release schedule.
As for why (1) happens, I think it's partly because changing from one warning to another is cumbersome; partly because many libraries don't want to be constrained by a long deprecation cycle.
Regards
Antoine.

On Mon, 6 Nov 2017 12:45:27 +0100 Antoine Pitrou solipsis@pitrou.net wrote:
I'm on the fence on this.
I was part of the minority who opposed the original decision. So I really appreciate your sentiment. Since then, I had to deal with a lot of very diverse third-party libraries, and I learned that:
most third-party libraries don't ever emit PendingDeprecationWarning; they only emit DeprecationWarning. So all their warnings would now be visible by default. (1)
release cycles are much shorter on third-party libraries, so it's easier not to notice that one of your dependencies has started changing some of its APIs - maybe you'll notice in 3 months. Also, perhaps you need a compatibility fallback anyway instead of unconditionally switching to the new version of the API, which adds to the maintenance cost.
Of course, there's also the case where it's one of your dependency's dependencies, or your dependency's dependency's dependencies, or your dependency's dependency's dependency's dependencies, that has started to emit such a warning because of how your dependency, or your dependency's dependency, or your dependency's dependency's dependency, calls into that library.
Now if you have several such dependencies (or dependencies' dependencies, etc.), it becomes both likely and annoying to solve / workaround.
I guess my takeaway point is that many situations are complicated, and many third-party library developers are much less disciplined than what some of us would idealistically expect them to be (those developers probably often have good reasons for that). For someone who takes care to only use selected third-party libraries of high maintenance quality, I'm very +1 on your proposal. For the more murky (but rather common) cases of relying on average quality third-party libraries, I'm +0.
Regards
Antoine.

On 6 November 2017 at 21:58, Antoine Pitrou solipsis@pitrou.net wrote:
I guess my takeaway point is that many situations are complicated, and many third-party library developers are much less disciplined than what some of us would idealistically expect them to be (those developers probably often have good reasons for that). For someone who takes care to only use selected third-party libraries of high maintenance quality, I'm very +1 on your proposal. For the more murky (but rather common) cases of relying on average quality third-party libraries, I'm +0.
Agreed, and I'm thinking there could be a lot of value in the variant of the idea that says:
- tweak the default warning filters to turn DeprecationWarning back on for __main__ only
- add a new warnings module API specifically for managing deprecation warnings
The first change would restore DeprecationWarning-by-default for:
- ad hoc single file scripts (potentially including Jupyter notebooks, depending on which execution namespace kernels use)
- ad hoc experimentation at the REPL
- working through outdated examples at the REPL
For installed applications using setuptools (or similar), "__main__" is the script wrapper, not any of the application code, and those have been getting more minimal over time (and when they do have non-trivial code in them, it's calling into setuptools/pkg_resources rather than the standard library).
The second change would be designed around making it easier for app developers to say "Always emit DeprecationWarnings for my own code, don't worry about anything else".
With DeprecationWarning still off by default, that might look like:
warnings.reportdeprecations("myproject")
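A hedged sketch of that variant (again, reportdeprecations is only a proposed name, and the filter call is just one possible implementation):

    import re
    import warnings

    def reportdeprecations(package):
        # With DeprecationWarning still hidden by default, re-enable it just for
        # the named package and its submodules.
        warnings.filterwarnings("once", category=DeprecationWarning,
                                module=re.escape(package) + r"(\..*)?$")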
Cheers, Nick.
P.S. For those interested, the issue where we switched to the current behaviour is https://bugs.python.org/issue7319
And the related stdlib-sig thread is https://mail.python.org/pipermail/stdlib-sig/2009-November/000789.html
That was apparently in the long gone era when I still read every python-checkins message, so there's also a very short thread on python-dev after it landed in SVN: https://mail.python.org/pipermail/python-dev/2010-January/097178.html
The primary argument for the change in the stdlib-sig thread is definitely "App devs don't hide deprecation warnings, and then their users complain about seeing them". Guido even goes so far as to describe app developers using the warnings filter interface as designed to manage what they emit on stderr as "a special hack".
Later in the thread Georg Brandl brought up a fairly compelling argument that Guido was right about that, which is that programmatic filters to manage warnings currently don't compose well with the command line and environment variable settings, since there's only one list of warning filters, which means there's no way to say "put this *before* the default filters, but *after* any filters that were specified explicitly with -W or PYTHONWARNINGS". Instead, your options are limited to prepending (the default behaviour) which overrides both the defaults and any specific settings, or appending, which means you can't even override the defaults.
When DeprecationWarnings were enabled by default, this meant there was no native way to override application level filters that ignored them in order to turn on DeprecationWarnings when running your test suite.
By contrast, having them be off by default with runtime programmatic filter manipulation being rare offers a simple way to turn them on globally, via "-Wd". Adding "-W once::DeprecationWarning:__main__" to the default filter list doesn't change that substantially.
That said, this does make me wonder whether the warnings module should either place a sentinel marker in its warning filter list to mark where the default filters start (and adjust the append mode to insert filters there), or else provide a new option for programmatic configuration that's "higher priority than the defaults, lower priority than the explicit configuration settings".

On Mon, 6 Nov 2017 23:23:25 +1000 Nick Coghlan ncoghlan@gmail.com wrote:
On 6 November 2017 at 21:58, Antoine Pitrou solipsis@pitrou.net wrote:
I guess my takeaway point is that many situations are complicated, and many third-party library developers are much less disciplined than what some of us would idealistically expect them to be (those developers probably often have good reasons for that). For someone who takes care to only use selected third-party libraries of high maintenance quality, I'm very +1 on your proposal. For the more murky (but rather common) cases of relying on average quality third-party libraries, I'm +0.
Agreed, and I'm thinking there could be a lot of value in the variant of the idea that says:
- tweak the default warning filters to turn DeprecationWarning back on
for __main__ only
That sounds error-prone. I'd rather have them on by default everywhere.
- add a new warnings module API specifically for managing deprecation warnings
+1
And I think we need to handle two different use cases:
- silencing warnings *emitted by* a certain module (e.g. a widely-used module which recently introduced major API changes)
- silencing warnings *reported in* a certain module (e.g. a sporadically-maintained library whose usage frequently emits deprecation warnings coming from other libraries)
Ideally, we also need a CLI switch (or environment variable) to override these settings, so that one can run in "dev mode" and see all problematic usage across their library, application and third-party dependencies.
Regards
Antoine.

Antoine Pitrou wrote:
On Mon, 6 Nov 2017 23:23:25 +1000
- tweak the default warning filters to turn DeprecationWarning back on
for __main__ only
That sounds error-prone. I'd rather have them on by default everywhere.
If DeprecationWarnings were on by default, and setuptools were modified to silence them in entry point generated mains, and we had a simple API to easily silence them for manually written mains, wouldn't that handle the majority of relevant use cases nicely?
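For illustration, a generated console-script wrapper might then look roughly like this (a hedged sketch; "myapp.cli" is a placeholder entry point and this is not the actual setuptools template):

    import sys
    import warnings

    # Installed-application users don't see DeprecationWarning; developers running
    # the code directly (rather than via this generated wrapper) still do.
    warnings.filterwarnings("ignore", category=DeprecationWarning)

    from myapp.cli import main

    if __name__ == "__main__":
        sys.exit(main())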
-Barry

To cut this thread short, I say we should use Nick's proposal to turn these warnings on for __main__ but off elsewhere. (See https://mail.python.org/pipermail/python-dev/2017-November/150364.html.)

Hi Guido,
As far as I can see, the general consensus seems to be to turn them back on in general: The last person to argue against it was Paul Moore, and he since said:
“OK, I overstated [that you’re ‘hosed’ by DeprecationWarnings appearing]. Apologies. My recollection is of a lot more end user complaints when deprecation warnings were previously switched on than others seem to remember, but I can't find hard facts, so I'll assume I'm misremembering.”
Besides, quite a few of the problems people mention would only be fixed by turning them on in general, not with the compromise.
So I don’t think we need a compromise, right?
Best, Philipp
Guido van Rossum guido@python.org schrieb am Mi., 8. Nov. 2017 um 03:46 Uhr:
To cut this thread short, I say we should use Nick's proposal to turn these warnings on for __main__ but off elsewhere. (See https://mail.python.org/pipermail/python-dev/2017-November/150364.html.)

Philipp,
You seem to have missed Nick's posts where he clearly accepts that a middle ground is necessary. R D Murray is also still unconvinced. (And obviously I myself am against reverting to the behavior from 7 years ago.) If we can't agree on some middle ground, the status quo will be maintained.

On Nov 8, 2017, at 08:47, Guido van Rossum guido@python.org wrote:
You seem to have missed Nick's posts where he clearly accepts that a middle ground is necessary. R D Murray is also still unconvinced. (And obviously I myself am against reverting to the behavior from 7 years ago.) If we can't agree on some middle ground, the status quo will be maintained.
I haven’t seen a response to my suggestion, so it’s possible that it got missed in the flurry. With coordination with setuptools, we could:
* Re-enable DeprecationWarning by default
* Add a simplified API for specifically silencing DeprecationWarnings
* Modify setuptools to call this API for generated entry point scripts
I think this would mean that most application users would still not see the warnings. The simplified API would be available for handcrafted scripts to call to accomplish the same thing the setuptools enhancement would provide. Developers would see DeprecationWarnings in their development and test environments.
The simplified API would be the equivalent of ignore::DeprecationWarning, so with some additional documentation even versions of applications running on versions of Python < 3.7 would still have an “out”. (Yes, the simplified API is just a convenience moving forward.)
Cheers, -Barry

I hadn't seen that, but it requires too much cooperation of library owners.

On Nov 8, 2017, at 12:02, Guido van Rossum guido@python.org wrote:
I hadn't seen that, but it requires too much cooperation of library owners.
Actually, mostly just setuptools and as Paul points out, pip.
Cheers, -Barry

On 8 November 2017 at 18:56, Barry Warsaw barry@python.org wrote:
- Modify setuptools to call this API for generated entry point scripts
pip uses distutils for its script wrappers, but uses its own script template, so it'd need a pip change too (which means it'd be in pip 10 but not earlier versions).
Paul

On 9 November 2017 at 04:56, Barry Warsaw barry@python.org wrote:
I haven’t seen a response to my suggestion, so it’s possible that it got missed in the flurry. With coordination with setuptools, we could:
- Re-enable DeprecationWarning by default
- Add a simplified API for specifically silencing DeprecationWarnings
- Modify setuptools to call this API for generated entry point scripts
I did see that, but I think a "once::DeprecationWarning:__main__" filter provides a comparable benefit in a simpler way, as the recommended idiom to turn off deprecation warnings at runtime becomes:
from elsewhere import main
if __name__ == "__main__":
    import sys
    sys.exit(main(sys.argv))
That same idiom will then work for:
* entry point wrapper scripts
* __main__ submodules in executable packages
* __main__.py files in executable directories and zip archives
And passing "-Wd" will still correctly override the default filter set.
It doesn't resolve the problem Nathaniel pointed out that "stacklevel" can be hard to set correctly when emitting a warning (especially at import time), but it also opens a new way of dealing with that: using warnings.warn_explicit to claim that the reporting module is "__main__" if you want to increase the chances of the warning being seen by default.
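For instance (a hedged illustration of that trick, with a made-up message and location):

    import warnings

    warnings.warn_explicit(
        "frobnicate() is deprecated, use frob() instead",  # hypothetical message
        DeprecationWarning,
        filename="mylib/frobnicate.py",                    # hypothetical location
        lineno=1,
        module="__main__",  # claim __main__ so a once::DeprecationWarning:__main__ filter shows it
    )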
Cheers, Nick.

Big +1 to turning warnings on by default again.
When this behaviour first started, I was first surprised, then annoyed that warnings were being suppressed. For a few years I learned to have `export PYTHONWARNINGS=default` in my .bashrc.
But eventually, the warnings in 3rd-party Python modules gradually increased because, since warnings are disabled by default, authors of command-line tools, or even python modules, don't even realise they are triggering the warning.
So developers stop fixing warnings because they are hidden. Things get worse and worse over the years. Eventually I got fed up and removed the PYTHONWARNINGS env var.
Showing warnings by default is good:
1. End users who don't understand what those warnings are are unlikely to see them, since they don't use command-line tools at all;
2. The users that do see them are sufficiently proficient to be able to submit a bug report;
3. If you file a bug report in a tool that uses a 3rd party module, the author of that tool should open a corresponding bug report on the 3rd party module that actually caused the warning;
4. Given time, all the bug reports trickle down and create pressure on module maintainers to fix warnings;
5. If a module is being used and has no maintainer, it's a good indication it is time to fork it or find an alternative.
Not fixing warnings is a form of technical debt that we should not encourage. It is not the Python way.

Hi! Just this minute I ran across a case where I’d want DeprecationWarnings on by default
(We want to rename a property in an API I’m co-developing. It has mainly scientists as its target audience, so end users, not developers.)

On 6 November 2017 at 14:16, Gustavo Carneiro gjcarneiro@gmail.com wrote:
But eventually, the warnings in 3rd-party Python modules gradually increased because, since warnings are disabled by default, authors of command-line tools, or even python modules, don't even realise they are triggering the warning.
So developers stop fixing warnings because they are hidden. Things get worse and worse over the years. Eventually I got fed up and removed the PYTHONWARNINGS env var.
Maybe it's worth running the test suites of a number of major packages like pip, requests, django, ... with warnings switched on, to see what the likely impact of making warnings display by default would be on those projects? Hopefully, it's zero, but hard data is always better than speculation :-)
Paul

I'm -1 on turning this on by default.
As a Python developer, I want to be aware of when deprecations are introduced, but I don't want the users of my library or application to care or know if I don't address those deprecation warnings for a few months or a year. The right solution for me here seems to be enabling the warnings in CI pipelines / tests.
As an end user, if I see deprecation warnings, there's nothing I can really do to make them go away straight away except run Python with warnings turned off, which seems to defeat the point of turning them on by default. The right solution here seems to be for authors to test their software before releasing.
I'm -2 on a complicated rule for when warnings are on because I'm going to forget the rule a week from now and probably no one I work with on a day to day basis will even know what the rule was to start with.
Maybe there are ways around these things, but I'm not really seeing what's wrong with the current situation that can't be fixed with slightly better CI setups (which are good for other reasons too).
Schiavo Simon

On 7 November 2017 at 03:42, Simon Cross hodgestar+pythondev@gmail.com wrote:
Maybe there are ways around these things, but I'm not really seeing what's wrong with the current situation that can't be fixed with slightly better CI setups (which are good for other reasons too).
Given the status quo, how do educators learn that the examples they're teaching to their students are using deprecated APIs?
Cheers, Nick.

On 7 November 2017 at 04:09, Nick Coghlan ncoghlan@gmail.com wrote:
Given the status quo, how do educators learn that the examples they're teaching to their students are using deprecated APIs?
By reading the documentation on what they are teaching, and by testing their examples with new versions with deprecation warnings turned on? Better than having warnings appear the first time they run a course with a new version of Python, surely?
I understand the "but no-one actually does this" argument. And I understand that breakage as a result is worse than a few warnings. But enabling deprecation warnings by default feels to me like favouring the developer over the end user. I remember before the current behaviour was enabled and it was *immensely* frustrating to try to use 3rd party code and get a load of warnings. The only options were:
1. Report the bug - usually not much help, as I want to run the program *now*, not when a new release is made.
2. Fix the code (and ideally submit a PR upstream) - I want to *use* the program, not debug it.
3. Find the right setting/environment variable, and tweak how I call the program to apply it - which doesn't fix the root cause, it's just a workaround.
I appreciate that this is open source, and using free programs comes with an obligation to contribute back or deal with issues like this, but even so, it's a pretty bad user experience.
I'd prefer it if rather than simply switching warnings on by default, we worked on making it easier for the people in a position to actually *fix* the issue (coders writing programs, educators developing training materials, etc) to see the warnings. For example, encourage the various testing frameworks (unittest, pytest, nose, tox, ...) to enable warnings by default, promote "test with warnings enabled" in things like the packaging guide, ensure that all new deprecations are documented in the "Porting to Python 3.x" notes, etc.
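(One sketch of the "test with warnings enabled" idea using only the standard library; the test package name is hypothetical, and pytest/tox have their own settings for this:

    import unittest

    if __name__ == "__main__":
        # warnings="error" turns any warning raised while the tests run into an
        # exception, so new DeprecationWarnings surface as test failures.
        unittest.main(module="mypackage.tests", warnings="error")

)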
Paul

On Tue, 7 Nov 2017 09:30:19 +0000 Paul Moore p.f.moore@gmail.com wrote:
I understand the "but no-one actually does this" argument. And I understand that breakage as a result is worse than a few warnings. But enabling deprecation warnings by default feels to me like favouring the developer over the end user.
I understand this characterization.
I'd prefer it if rather than simply switching warnings on by default, we worked on making it easier for the people in a position to actually *fix* the issue (coders writing programs, educators developing training materials, etc) to see the warnings. For example, encourage the various testing frameworks (unittest, pytest, nose, tox, ...) to enable warnings by default,
pytest does nowadays. That doesn't mean warnings get swiftly fixed, though. There are many reasons why (see my initial reply to Nick's proposal).
ensure that all new deprecations are documented in the "Porting to Python 3.x" notes, etc.
In my experience, Python deprecations are in the minority. Most often you have to deal with deprecations in third-party libraries rather than Python core/stdlib, because we (Python) are more reluctant to change and deprecate APIs than the average library maintainer is.
Regards
Antoine.

On 7 November 2017 at 19:30, Paul Moore p.f.moore@gmail.com wrote:
On 7 November 2017 at 04:09, Nick Coghlan ncoghlan@gmail.com wrote:
Given the status quo, how do educators learn that the examples they're teaching to their students are using deprecated APIs?
By reading the documentation on what they are teaching, and by testing their examples with new versions with deprecation warnings turned on? Better than having warnings appear the first time they run a course with a new version of Python, surely?
I understand the "but no-one actually does this" argument. And I understand that breakage as a result is worse than a few warnings. But enabling deprecation warnings by default feels to me like favouring the developer over the end user. I remember before the current behaviour was enabled and it was *immensely* frustrating to try to use 3rd party code and get a load of warnings. The only options were:
1. Report the bug - usually not much help, as I want to run the program *now*, not when a new release is made.
2. Fix the code (and ideally submit a PR upstream) - I want to *use* the program, not debug it.
3. Find the right setting/environment variable, and tweak how I call the program to apply it - which doesn't fix the root cause, it's just a workaround.
Yes, this is why I've come around to the view that we need to come up with a viable definition of "third party code" and leave deprecation warnings triggered by that code disabled by default.
My suggestion for that definition is to have the *default* meaning of "third party code" be "everything that isn't __main__".
That way, if you get a deprecation warning at the REPL, it's necessarily because of something *you* did, not because of something a library you called did. Ditto for single file scripts.
We'd then offer some straightforward interfaces for people to say "Please also report legacy calls from 'module' as warnings".
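(In terms of today's filter machinery, such an opt-in might boil down to something like this sketch; the package name is hypothetical, and the eventual API spelling is an open question:

    import warnings

    # "Please also report deprecated calls attributed to mypackage and its submodules."
    warnings.filterwarnings(
        "default",
        category=DeprecationWarning,
        module=r"mypackage(\.|$)",
    )

)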
You'd still get less-than-helpful warnings if you were running a single file script that someone *else* wrote (rather than one you wrote yourself), but that's an inherent flaw in that distribution model: as soon as you ask people to supply their own Python runtime, you're putting them in the position of acting as an application integrator (finding a combination of Python language runtime and your script that actually work together), rather than as a regular software user.
Cheers, Nick.

Sorry, I still don’t understand how any of this is a problem.
1. If you’re an application developer, google “python disable DeprecationWarning” and paste the code you found, so your users don’t see the warnings.
2. If you’re a library developer, and a library you depend on raises DeprecationWarnings without it being your fault, file an issue/bug there.
For super-increased convenience in case 2, we could also add a convenience API that blocks deprecation warnings raised from a certain module or its submodules.
Best, Philipp
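(A sketch of roughly what both of Philipp's cases look like with the existing filter machinery; the library name is made up:

    import warnings

    # Case 1: application startup - hide all DeprecationWarnings from end users.
    warnings.simplefilter("ignore", DeprecationWarning)

    # Case 2: only silence the warnings attributed to one noisy dependency.
    warnings.filterwarnings(
        "ignore",
        category=DeprecationWarning,
        module=r"noisylib(\.|$)",
    )

)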

On 7 November 2017 at 13:35, Philipp A. flying-sheep@web.de wrote:
Sorry, I still don’t understand how any of this is a problem.
1. If you’re an application developer, google “python disable DeprecationWarning” and paste the code you found, so your users don’t see the warnings.
2. If you’re a library developer, and a library you depend on raises DeprecationWarnings without it being your fault, file an issue/bug there.
For super-increased convenience in case 2, we could also add a convenience API that blocks deprecation warnings raised from a certain module or its submodules.
Best, Philipp
If you're a user and your application developer didn't do (1) or a library developer developing one of the libraries your application developer chose to use didn't do (2), you're hosed. If you're a user who works in an environment where moving to a new version of the application is administratively complex, you're hosed.
As I say, the proposal prioritises developer convenience over end user experience. Paul

On 7 November 2017 at 23:44, Paul Moore p.f.moore@gmail.com wrote:
As I say, the proposal prioritises developer convenience over end user experience.
Users of applications written in Python are not python-dev's users: they're the users of those applications, and hence the quality of that experience is up to the developers of those applications. This is no different from the user experience of Instagram being Facebook's problem, the user experience of RHEL being Red Hat's problem, the user experience of YouTube being Google's problem, etc.
*python-dev's* users are developers, data analysts, educators, and so forth that are actually writing Python code, and at the moment we're making it hard for them to be suitably forewarned of upcoming breaking changes - they have to know the secret knock that says "I'd like to be warned about future breaking changes, please". Sure, a lot of people do learn what that knock is, and they often even remember to ask for it, but the entire reason this thread started was because *I* forgot that I needed to run "python3 -Wd" in order to check for async/await deprecation warnings in 3.6, and incorrectly assumed that their absence meant we'd forgotten to include them.
Cheers, Nick.

Nick Coghlan ncoghlan@gmail.com wrote on Tue., 7 Nov. 2017 at 14:57:
Users of applications written in Python are not python-dev's users: they're the users of those applications, and hence the quality of that experience is up to the developers of those applications. […]
Thank you, that’s exactly what I’m talking about. Besides: Nobody is “hosed”… There will be one occurrence of every DeprecationWarning in the stderr of the application. Hardly the end of the world for CLI applications and even invisible for GUI applications.
If the devs care about the user not seeing any warnings in their CLI application, they’ll have a test set up for that, which will tell them that the newest python-dev would raise a new warning, once they turn on testing for that release. That’s completely fine!
Explicit is better than implicit! If I know lib X raises DeprecationWarnings I don’t care about, I want to explicitly silence them, instead of missing out on all the valuable information in other DeprecationWarnings.
Best, Philipp

On 2017-11-07 14:17, Philipp A. wrote:
Nick Coghlan <ncoghlan@gmail.com> wrote on Tue., 7 Nov. 2017 at 14:57:
Users of applications written in Python are not python-dev's users: they're the users of those applications, and hence the quality of that experience is up to the developers of those applications. […]
Thank you, that’s exactly what I’m talking about. Besides: Nobody is “hosed”… There will be one occurrence of every DeprecationWarning in the stderr of the application. Hardly the end of the world for CLI applications and even invisible for GUI applications.
If the devs care about the user not seeing any warnings in their CLI application, they’ll have a test set up for that, which will tell them that the newest python-dev would raise a new warning, once they turn on testing for that release. That’s completely fine!
Explicit is better than implicit! If I know lib X raises DeprecationWarnings I don’t care about, I want to explicitly silence them, instead of missing out on all the valuable information in other DeprecationWarnings.
Also, errors should never pass silently. Deprecation warnings are future errors.
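(Anyone who wants to hold their own project to that standard today can already opt in; a sketch, not a proposed default:

    import warnings

    # Treat deprecations as the errors they will eventually become.
    warnings.simplefilter("error", DeprecationWarning)

or, equivalently, run under `python -W error::DeprecationWarning`.)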

On 11/07/2017 05:44 AM, Paul Moore wrote:
If you're a user and your application developer didn't do (1) or a library developer developing one of the libraries your application developer chose to use didn't do (2), you're hosed. If you're a user who works in an environment where moving to a new version of the application is administratively complex, you're hosed.
Suffering from DeprecationWarnings is not "being hosed". Having your script/application/framework suddenly stop working because nobody noticed something was being deprecated is "being hosed".
+1 to turn them back on.
-- ~Ethan~

On 7 November 2017 at 14:21, Ethan Furman ethan@stoneleaf.us wrote:
On 11/07/2017 05:44 AM, Paul Moore wrote:
If you're a user and your application developer didn't do (1) or a library developer developing one of the libraries your application developer chose to use didn't do (2), you're hosed. If you're a user who works in an environment where moving to a new version of the application is administratively complex, you're hosed.
Suffering from DeprecationWarnings is not "being hosed". Having your script/application/framework suddenly stop working because nobody noticed something was being deprecated is "being hosed".
OK, I overstated. Apologies. My recollection is of a lot more end user complaints when deprecation warnings were previously switched on than others seem to remember, but I can't find hard facts, so I'll assume I'm misremembering.
Paul

Ethan Furman writes:
Suffering from DeprecationWarnings is not "being hosed". Having your script/application/framework suddenly stop working because nobody noticed something was being deprecated is "being hosed".
OK, so suffering from DeprecationWarnings is not "being hosed". Nevertheless, it's a far greater waste of my time (supervising students in business and economics with ~50% annual turnover) than is "suddenly stop working", even though it only takes 1-5 minutes each time to explain how to do whatever seems appropriate. "Suddenly stopped working", in fact, hasn't happened to me yet in that environment.
It's not hard to understand why: the student downloads Python, and doesn't upgrade within the life cycle of the software they've written. It becomes obsolete upon graduation, and is archived, never to be used again. I don't know how common this kind of environment is, so I can't say it's terribly important, but AIUI Python should be pleasant to use in this context.
Unfortunately I have no time to contribute code or even useful ideas to the end of making it more likely that Those Who Can Do Something (a) see the DeprecationWarning and (b) are made sufficiently itchy that they actually scratch, and that Those Who Cannot Do Anything, or are limited to suggesting that something be done, not see it.
So I'll shut up now, having contributed this user story.
Steve

On Nov 7, 2017, at 05:44, Paul Moore p.f.moore@gmail.com wrote:
If you're a user and your application developer didn't do (1) or a library developer developing one of the libraries your application developer chose to use didn't do (2), you're hosed. If you're a user who works in an environment where moving to a new version of the application is administratively complex, you're hosed.
“hosed” feels like too strong of a word here. DeprecationWarnings usually don’t break anything. Sure, they’re annoying but they can usually be easily ignored.
Yes, there are some situations where DWs do actively break things (as I’ve mentioned, some Debuntu build/test environments). But those are also relatively easier to silence, or at least the folks running those environments, or writing the code for those environments, are usually more advanced developers for whom setting an environment variable or flag isn’t that big of a deal.
Cheers, -Barry

On 8 November 2017 at 03:55, Barry Warsaw barry@python.org wrote:
On Nov 7, 2017, at 05:44, Paul Moore p.f.moore@gmail.com wrote:
If you're a user and your application developer didn't do (1) or a library developer developing one of the libraries your application developer chose to use didn't do (2), you're hosed. If you're a user who works in an environment where moving to a new version of the application is administratively complex, you're hosed.
“hosed” feels like too strong of a word here. DeprecationWarnings usually don’t break anything. Sure, they’re annoying but they can usually be easily ignored.
Yes, there are some situations where DWs do actively break things (as I’ve mentioned, some Debuntu build/test environments). But those are also relatively easier to silence, or at least the folks running those environments, or writing the code for those environments, are usually more advanced developers for whom setting an environment variable or flag isn’t that big of a deal.
One other case would be if you've got an application with no stderr (e.g. a GUI application) - with enough deprecation warnings the stderr buffer could become full and block, preventing the application from progressing. I've just had a similar issue where a process was running as a service and used subprocess.check_output() - stderr was written to the parent's stderr, which didn't exist and caused the program to hang.
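(A sketch of the workaround for that subprocess case, with a hypothetical command: capture or discard the child's stderr explicitly rather than letting it inherit a missing one:

    import subprocess

    # Discard the child's stderr (or pass stderr=subprocess.STDOUT to fold it
    # into the captured output) instead of inheriting the parent's stderr.
    output = subprocess.check_output(
        ["sometool", "--version"],
        stderr=subprocess.DEVNULL,
    )

)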
However, I'm definitely +1 on enabling DeprecationWarning by default, but with mechanisms or recommendations for the application developer to silence them selectively for the current release.
Tim Delaney

On Nov 7, 2017 06:24, "Nick Coghlan" ncoghlan@gmail.com wrote:
Yes, this is why I've come around to the view that we need to come up with a viable definition of "third party code" and leave deprecation warnings triggered by that code disabled by default.
My suggestion for that definition is to have the *default* meaning of "third party code" be "everything that isn't __main__".
That way, if you get a deprecation warning at the REPL, it's necessarily because of something *you* did, not because of something a library you called did. Ditto for single file scripts.
IPython actually made this change a few years ago; since 2015 I think it has shown DeprecationWarnings by default if they're triggered by __main__.
It's helpful but I haven't noticed it eliminating this problem. One limitation in particular is that it requires that the warnings are correctly attributed to the code that triggered them, which means that whoever is issuing the warning has to set the stacklevel= correctly, and most people don't. (The default of stacklevel=1 is always wrong for DeprecationWarning.) Also, IIRC it's actually impossible to set the stacklevel= correctly when you're deprecating a whole module and issue the warning at import time, because you need to know how many stack frames the import system uses.
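(For illustration, the usual pattern when deprecating a function, with hypothetical names, is to pass stacklevel=2 so the warning is attributed to the caller rather than to the module defining the deprecated API:

    import warnings

    def new_api():
        return 42

    def old_api():
        warnings.warn(
            "old_api() is deprecated; use new_api() instead",
            DeprecationWarning,
            stacklevel=2,  # attribute the warning to the caller's line, not this one
        )
        return new_api()

)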
-n

On Tue, Nov 7, 2017 at 8:45 AM, Nathaniel Smith njs@pobox.com wrote:
Also, IIRC it's actually impossible to set the stacklevel= correctly when you're deprecating a whole module and issue the warning at import time, because you need to know how many stack frames the import system uses.
Doh, I didn't remember correctly. Actually Brett fixed this in 3.5: https://bugs.python.org/issue24305
-n

On Tue, Nov 7, 2017, at 07:22, Nick Coghlan wrote:
My suggestion for that definition is to have the *default* meaning of "third party code" be "everything that isn't __main__".
What is __main__? Or, rather, how do you determine when it is to blame? For syntax it's easy, but any deprecated function necessarily belongs to its own module and not to main. Main may have called it, which can be detected from the stack trace, or it may have used it in some other way (pass to some builtin or e.g. itertools function that takes a callable argument, for example). Maybe the DeprecationWarning should be raised at the name lookup* rather than the call? What if "calling this function with some particular combination of arguments" is deprecated?
*i.e. something like:
import sys
from types import ModuleType

class deprecated:
    def __init__(self, obj):
        self.obj = obj

class DeprecatableModule(ModuleType):
    def __getattr__(self, name):
        obj = self.__dict__[name]
        if isinstance(obj, deprecated):
            if ...:  # placeholder: "detect somehow caller is __main__"
                raise DeprecationWarning
            return obj.obj
        return obj

    def __dir__(self):
        return [k for k in self.__dict__
                if not isinstance(self.__dict__[k], deprecated)]

# Swap in the subclass so the deprecation check runs on attribute lookup.
sys.modules[__name__].__class__ = DeprecatableModule

@deprecated
def some_deprecated_function(...): ...

SOME_DEPRECATED_CONSTANT = deprecated(42)

On 11 November 2017 at 02:02, Random832 random832@fastmail.com wrote:
On Tue, Nov 7, 2017, at 07:22, Nick Coghlan wrote:
My suggestion for that definition is to have the *default* meaning of "third party code" be "everything that isn't __main__".
What is __main__? Or, rather, how do you determine when it is to blame? For syntax it's easy, but any deprecated function necessarily belongs to its own module and not to main. Main may have called it, which can be detected from the stack trace, or it may have used it in some other way (pass to some builtin or e.g. itertools function that takes a callable argument, for example).
The warnings machinery already defines how this works (look for "stacklevel"). For callbacks defined as Python code, the deprecated call will be attributed to whichever module defined the callback, not the machinery that called the callback.
Cheers, Nick.

Nick Coghlan wrote:
On the 12-weeks-to-3.7-feature-freeze thread, Jose Bueno & I both mistakenly though the async/await deprecation warnings were missing from 3.6.
Sometimes the universe just throws synchronicity right in your face.
I'm working on building an internal tool against Python 3.7 to take advantage of the very cool -X importtime feature. It's been a fun challenge, but mostly because of our external dependencies. For example, PyThreadState renamed its structure members, so both Cython and lxml needed new releases to adjust for this. That's the "easy" part; they've done it and those fixes work great.
We also depend on ldap3 https://pypi.org/project/ldap3/. Suddenly we get a SyntaxError because ldap3 has a module ldap3/strategy/async.py. I say "suddenly" because of course *if* DeprecationWarnings had been enabled by default, I'm sure someone would have noticed that those imports were telling the developers about the impending problem in Python 3.6.
https://github.com/cannatag/ldap3/issues/428
This just reinforces my opinion that even though printing DeprecationWarning by default *can* be a hardship in some environments, it is on the whole a positive, beneficial indicator that gives developers some runway to fix such problems. These types of apparently sudden breakages are the worst of all worlds.
Cheers, -Barry

* Barry Warsaw barry@python.org, 2017-11-06, 15:56:
We also depend on ldap3 https://pypi.org/project/ldap3/. Suddenly we get a SyntaxError because ldap3 has a module ldap3/strategy/async.py. I say "suddenly" because of course *if* DeprecationWarnings had been enabled by default, I'm sure someone would have noticed that those imports were telling the developers about the impending problem in Python 3.6.
"import async" would indeed cause deprecation warning, but that's not what ldap3 does. The only uses of the now-keyword "async" in their codebase are like this:
from ..strategy.async import AsyncStrategy
from .async import AsyncStrategy
These do not provoke deprecation warnings from Python 3.6. (They probably should!)
I'm afraid that showing deprecation warnings by default wouldn't have helped in this particular case.

On Nov 7, 2017, at 13:34, Jakub Wilk jwilk@jwilk.net wrote:
"import async" would indeed cause deprecation warning, but that's not what ldap3 does. The only uses of the now-keyword "async" in their codebase are like this:
from ..strategy.async import AsyncStrategy
from .async import AsyncStrategy
These do not provoke deprecation warnings from Python 3.6. (They probably should!)
I'm afraid that showing deprecation warnings by default wouldn't have helped in this particular case.
Oh gosh, I should have tried that instead of assuming it would generate the same warning. Yes, that’s definitely a bug. I wonder if we should push back making async/await reserved words until Python 3.8?
https://bugs.python.org/issue31973
Cheers, -Barry
participants (23)
- Antoine Pitrou
- Barry Warsaw
- Brett Cannon
- Eric V. Smith
- Ethan Furman
- Guido van Rossum
- Gustavo Carneiro
- Jakub Wilk
- Joao S. O. Bueno
- Lukasz Langa
- MRAB
- Nathaniel Smith
- Nick Coghlan
- Oleg Broytman
- Paul Moore
- Philipp A.
- Random832
- Serhiy Storchaka
- Simon Cross
- Stephen J. Turnbull
- Tim Delaney
- Victor Stinner
- Yury Selivanov