Hi all,
It's finally time to schedule the last releases in Python 2's life. There will be two more releases of Python 2.7: Python 2.7.17 and Python 2.7.18.
Python 2.7.17 release candidate 1 will happen on October 5th followed by the final release on October 19th.
I'm going to time Python 2.7.18 to coincide with PyCon 2020 in April, so attendees can enjoy some collective catharsis. We'll still say January 1st is the official EOL date.
Thanks to Sumana Harihareswara, there's now a FAQ about the Python 2 sunset on the website: https://www.python.org/doc/sunset-python-2/
Regards,
Benjamin
Hi all,
I think that PEP 573 is ready to be accepted, to greatly improve the state
of extension modules in CPython 3.9.
https://www.python.org/dev/peps/pep-0573/
It has come a long way since the original proposal and went through several
iterations and discussions by various interested people, effectively
reducing its scope quite a bit. So this is the last call for comments on
the latest version of the PEP, before I pronounce on it. Please keep
the discussion in this thread.
Stefan
An unfortunate side-effect of making PyInterpreterState opaque in Python 3.8 is that it removed [PEP 523](https://www.python.org/dev/peps/pep-0523/) support. An issue was opened to try and fix this, but there seems to be a stalemate in the discussion.
A key question is at what API level setting the frame evaluation function should live. No one is suggesting the stable ABI, but there isn't agreement between the CPython API and the internal API (there also seems to be a suggestion to remove PEP 523 support entirely).
And regardless of which API level is chosen (if any), there also seems to be disagreement about providing getters and setters so that PyInterpreterState can continue to stay hidden.
If you have an opinion, please weigh in on the issue.
Hi,
Python 3.9 introduces many small incompatible changes which broke tons
of Python projects, including popular projects, some of them being
unmaintained but still widely used (like nose, last release in 2015).
Miro and I consider that Python 3.9 puts too much pressure on project
maintainers to either drop Python 2.7 support right now (which means
updating the CI, the documentation, warning users, etc.), or to
introduce a *new* compatibility layer to support Python 3.9: a layer
which would be dropped as soon as Python 2.7 support is dropped
(soon-ish).
Python 3.9 is too early to accumulate so many deliberate incompatible
changes. Some of them, like the removal of the collections aliases to
the ABC classes, should be reverted and reapplied in Python 3.10.
Python 3.9 would then be the last release which still contains
compatibility layers for Python 2.7.
Said differently, we request to keep the small compatibility layer in
Python for one more cycle, instead of asking every single project
maintainer to maintain it on their side. We consider that the overall
maintenance burden is lower if it stays in Python for now.
== Fedora COPR: getting notified of packages broken by Python 3.9 ==
In Python 3.9, Victor introduced tons of incompatible changes at the
beginning of the development cycle. His plan was to push as many as
possible and decide later what to do... That time has come :-) He
wrote PEP 606
"Python Compatibility Version" and we wrote PEP 608 "Coordinated
Python release", but both have been rejected. At least, it seems like
most people agreed that having a CI to get notified of broken projects
should help.
We are updating the future Fedora 33 to replace Python 3.8 with Python
3.9. We are using a tool called "COPR" which is like a sandbox and can
be seen as the CI discussed previously. It rebuilds Fedora using
Python 3.9 as /usr/bin/python3 (and /usr/bin/python !). We now have a
good overview of broken packages and which incompatible changes broke
most packages.
https://fedoraproject.org/wiki/Changes/Python3.9
- Describes the Fedora change.
https://copr.fedorainfracloud.org/coprs/g/python/python3.9/monitor/
- Has package failures. Some packages fail because of broken dependencies.
https://bugzilla.redhat.com/showdependencytree.cgi?id=PYTHON39
- Has open Python 3.9 bug reports for Fedora packages. Some problems
had already been fixed upstream before reaching Fedora; most are only
fixed once the Fedora maintainers report them back to upstream
maintainers.
Right now, there are 150+ packages broken by Python 3.9 incompatible changes.
== Maintenance burden ==
Many Python projects have not yet removed Python 2 support and Python
2 CI. It's not that they would be in the "we will not drop Python 2
support ever" camp, it's just that they have not done it yet. Removing
Python 2 compatibility code from the codebase, the documentation, the
metadata and the CI is a boring task; it doesn't bring anything to
users, and it might require a new major release of the library. At
this point in early 2020, it is too soon to expect most projects to
have already done this.
At the same time, we would like them to support Python 3.9 as early in
the release cycle as possible. By removing Python 2 compatibility
layers from the 3.9 standard library, we are forcing project
maintainers to re-invent their own compatibility layers and to
copy-paste code like this around. Example:
try:
    from collections.abc import Sequence
except ImportError:
    # Python 2.7 doesn't have collections.abc
    from collections import Sequence
If instead we remove collections.Sequence in 3.10, they will face this
decision in early 2021 and will simply fix their code by adding
".abc" in the proper places, without needing any more compatibility
layers. Of course, there will be projects that still declare Python 2
support in 2021, but arguably not that many.
While it's certainly tempting to have "more pure" code in the standard
library, maintaining the compatibility shims for one more release
isn't really that big of a maintenance burden, especially compared
with dozens (hundreds?) of third-party libraries each maintaining
their own.
A good example of a broken package is the nose project, which is no
longer maintained (according to its website): the last release was
in 2015. It remains a very popular test runner. According to
libraries.io, it has 3 million downloads per month, 41.7K
dependent repositories and 325 dependent packages. We patched nose in
Fedora to fix Python 3.5, 3.6, 3.8 and now 3.9 compatibility issues.
People installing nose from PyPI with "pip install" get the
unmaintained flavor, which is currently completely broken on Python
3.9.
Someone should take over the nose project and maintain it again, or
every single project using nose should pick another tool (unittest,
nose2, pytest, whatever else). Both options will take a lot of time.
== Request to revert some incompatible changes ==
See https://docs.python.org/dev/whatsnew/3.9.html#removed
Incompatible changes which require "if <python3>: (...) else: (...)"
or "try: <python3 code> except (...): <python2 code>" (a sketch of
such a shim follows the list):
* Removed tostring/fromstring methods in array.array and base64 modules
* Removed collections aliases to ABC classes
* Removed fractions.gcd() function (which is similar to math.gcd())
* Remove "U" mode of open(): having to use io.open() just for Python 2
makes the code uglier
* Removed old plistlib API: 2.7 doesn't have the new API
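As an illustration, here is the kind of shim a project supporting both
2.7 and 3.9 would need for the fractions.gcd() removal (a minimal
sketch, not taken from any particular project):

try:
    from math import gcd  # Python 3.5+
except ImportError:
    # Python 2.7 has no math.gcd(); fractions.gcd() is the closest equivalent
    from fractions import gcd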
== Kept incompatible changes ==
Ok-ish incompatible changes (mostly only affecting compatibility
libraries like six):
* _dummy_thread and dummy_threading modules removed: broke six, nine
and future projects. six and nine are already fixed for 3.9.
OK incompatible changes (can be replaced by the same code on 2.7 and
3.9; see the example after the list):
* isAlive() method of threading.Thread has been removed:
Thread.is_alive() method is available in 2.7 and 3.9
* xml.etree.ElementTree.getchildren() and
xml.etree.ElementTree.getiterator() methods are removed from 3.9, but
list() and iter() work in both 2.7 and 3.9
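For illustration, a minimal sketch (not from the original report) that
runs unchanged on both 2.7 and 3.9:

import threading
import xml.etree.ElementTree as ET

t = threading.Thread(target=lambda: None)
t.start()
t.is_alive()  # instead of t.isAlive(), which is removed in 3.9
t.join()

root = ET.fromstring("<root><a/><b/></root>")
children = list(root)      # instead of root.getchildren()
for elem in root.iter():   # instead of root.getiterator()
    pass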
== Call to check for DeprecationWarning in your own projects ==
You must pay attention to DeprecationWarning in Python 3.9: it will be
the last "compatibility layer" release, and the incompatible changes
will be reapplied in Python 3.10.
For example, you can use the development mode to see
DeprecationWarning and ResourceWarning: use the "-X dev" command line
option or set the PYTHONDEVMODE=1 environment variable. Or you can use
the PYTHONWARNINGS=default environment variable to see
DeprecationWarning.
You might even want to treat all warnings as errors to ensure that you
don't miss any when you run your test suite in your CI. You can use
PYTHONWARNINGS=error, and combine it with PYTHONDEVMODE=1.
Warnings filters can be used to ignore warnings in third-party code;
see the documentation (and the example after the link):
https://docs.python.org/dev/library/warnings.html#the-warnings-filter
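For example, a minimal sketch that turns warnings into errors for your
own code while ignoring DeprecationWarning raised from a third-party
package (the module name "thirdparty" is hypothetical):

import warnings

# Turn every warning into an error...
warnings.simplefilter("error")
# ...but keep ignoring DeprecationWarning raised from modules whose
# name starts with "thirdparty" (filterwarnings() prepends its filter,
# so it takes precedence over the "error" filter above).
warnings.filterwarnings("ignore", category=DeprecationWarning,
                        module="thirdparty")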
-- Victor Stinner and Miro Hrončok for Fedora
Hi,
First of all I want to say that I'm very much in favour of the general
idea behind PEP 558. Defined semantics is always better than undefined
semantics :)
However, I think there are a few changes needed:
1. Don't add anything to the C API, please.
Frame attributes can be accessed via `PyObject_GetAttr[String]`.
2. Don't make the behaviour dependent on whether "tracing" is turned on.
Doing so forces debuggers to use sys.settrace, which is horribly
slow. It also makes the implementation more complex, and has no benefit
AFAICT.
3. Don't store write-through proxies in the frame, but make proxies
retain a reference to the frame. This would reduce the size and
complexity of the code for handling frames. Cleanup of the frame would
occur naturally via reference counting once all proxies have been reclaimed.
The proposed implementation is hard to reason about and I am not
confident that it will not introduce some new subtle bugs to replace the
ones it seeks to remove.
Any implementation that has functions with "Borrow" and "BreakCycles" in
their names makes me nervous.
A simpler conceptual model, which I believe could be made reliable,
would be:
No change for non-function frames (as PEP 558 currently proposes).
Each access to `frame.f_locals` (for function frames) should return a
new proxy object.
This proxy object would have dict-like, write-through semantics for
variables in the frame. For keys that do not match variable names, an
exception would be raised. This means that all proxies for a single
frame will have value equivalence; object equivalence is not needed.
I.e. for a frame `f`, `f.f_locals == f.f_locals` would be True, even
though `f.f_locals is f.f_locals` would be False.
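A minimal sketch of those semantics (the comments describe the
proposed behaviour, not what current CPython does):

import sys

def demo():
    x = 1
    frame = sys._getframe()
    a = frame.f_locals
    b = frame.f_locals
    # Under the proposal (for function frames):
    #   a == b           -> True   (proxies of the same frame compare equal by value)
    #   a is b           -> False  (each f_locals access returns a new proxy)
    #   a["x"] = 2          would write through to the local variable x
    #   a["no_such"] = 0    would raise, since "no_such" is not a variable of the frame
    return x

demo()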
Cheers,
Mark.
(cross posting to python-committers and python-dev)
I'm happy to announce that signups for the Python Language Summit at
PyCon 2020 are now open.
Full details at: https://us.pycon.org/2020/events/languagesummit/
*TL;DR*
When: Wednesday, April 15, 2020, 9am–4pm (Note, we’re starting 1 hour
earlier than usual!)
Where: David L. Lawrence Convention Center, Pittsburgh, PA, room TBD
Sign up to attend: https://forms.gle/Fg7ayhYTaY75J1r7A (closes Feb 29th,
2020 AoE)
Sign up to discuss a topic: https://forms.gle/g4BXezH1Vcn7tLds5 (closes Feb
29th, 2020 AoE)
*Who can attend*
We welcome Python core developers, active core contributors to Python and
alternative Python implementations, and anyone else who has a topic to
discuss with core developers.
*Who can propose a discussion topic*
If you have discussion items, are seeking consensus, are awaiting a
decision on a PEP, need help with your core dev work, or have specific
questions that need answers from core developers, please submit a
proposal. According to last year's feedback, our audience prefers more
discussions and shorter talks.
To get an idea of past language summits, you can read past years' coverage:
2019:
http://pyfound.blogspot.com/2019/05/the-2019-python-language-summit.html
2018: https://lwn.net/Articles/754152/
2017: https://lwn.net/Articles/723251/
This year's event will be covered by A. Jesse Jiryu Davis again, and will
be posted on PSF's blog.
Some changes to note this year:
1) We plan to start 1 hour earlier (9AM)
2) The room will have U-shaped table layout
Thanks!
Mariatta & Łukasz
Hi! I'm forwarding this on behalf of Marina Moore https://github.com/mnm678 .
- Sumana Harihareswara
---
PEP 458 ( https://www.python.org/dev/peps/pep-0458/ ) proposes using The Update Framework (TUF) to allow users of PyPI to verify that the packages they install originate from PyPI. Implementing this PEP would provide protection in the event of an attack on PyPI, its mirrors, or the network used to install packages.
We started this PEP in 2013, and have recently revised it and restarted discussion.
Recent discussion and revision of the PEP has been taking place on Discourse ( https://discuss.python.org/t/pep-458-secure-pypi-downloads-with-package-sig… ).
The PEP is ready for review and I look forward to your feedback!
Thanks,
Marina Moore
PEP 458 coauthor
(Apologies. Not sure where to ask this, and I'm not much of a C++
programmer. Maybe I should have just added a comment to the still-open
issue.)
I just noticed that Nick migrated the guts of Include/frameobject.h to
Include/cpython/frameobject.h. It's not clear to me that the latter
should be #include'd directly from anywhere other than
Include/frameobject.h. If that's the case, does the extern "C" stuff
still need to be replicated in the lower level file? Won't the scope
of the extern "C" block in Include/frameobject.h be active at the
lower level?
Whatever the correct answer is, I suspect the same constraints should
apply to all Include/cpython/*.h files.
Skip