(cross posting to python-committers and python-dev)
I'm happy to announce that signups for the Python Language Summit at
PyCon 2020 are now open.
Full details at: https://us.pycon.org/2020/events/languagesummit/
When: Wednesday, April 15, 2020, 9am–4pm (Note, we’re starting 1 hour
earlier than usual!)
Where: David L. Lawrence Convention Center, Pittsburgh, PA, room TBD
Sign up to attend: https://forms.gle/Fg7ayhYTaY75J1r7A (closes Feb
29th, 2020 AoE)
Sign up to discuss a topic: https://forms.gle/g4BXezH1Vcn7tLds5 (closes Feb
29th, 2020 AoE)
*Who can attend*
We welcome Python core developers, active core contributors to Python and
alternative Python implementations, and anyone else who has a topic to
discuss with core developers.
*Who can propose a discussion topic*
If you have discussion items, are seeking consensus, are awaiting a
decision on a PEP, need help with your core dev work, or have specific
questions that need answers from core developers, please submit a
proposal. According to last year's feedback, our audience prefers more
discussions and shorter presentations.
To get an idea of past language summits, you can read past years' coverage:
This year's event will be covered by A. Jesse Jiryu Davis again, and
the coverage will be posted on the PSF's blog.
Some changes to note this year:
1) We plan to start 1 hour earlier (9AM)
2) The room will have U-shaped table layout
Mariatta & Łukasz
Python 3.9 introduces many small incompatible changes which broke
tons of Python projects, including popular ones, some of them
unmaintained but still widely used (like nose, whose last release was
in 2015).
Miro and I consider that Python 3.9 is putting too much pressure on
project maintainers to either abandon Python 2.7 right now (they need
to update the CI, the documentation, warn users, etc.), or to
introduce a *new* compatibility layer to support Python 3.9: a layer
which would be dropped as soon as Python 2.7 support is dropped
(soon-ish).
We consider Python 3.9 too early to accumulate so many deliberate
incompatible changes; some of them, like the removal of the
collections aliases to ABCs, should be reverted and reapplied in
Python 3.10. Python 3.9 would then be the last release which still
contains compatibility layers for Python 2.7.
Said differently, we request to maintain the small compatibility layer
in Python for one more cycle, instead of requesting every single
project maintainer to maintain it on their side. We consider that the
general maintenance burden is lower if it's kept in Python for now.
== Fedora COPR notifies us of packages broken by Python 3.9 ==
In Python 3.9, Victor introduced tons of incompatible changes at the
beginning of the dev cycle. His plan was to push as many as possible,
and later decide what to do... That time has come :-) He wrote PEP 606
"Python Compatibility Version" and we wrote PEP 608 "Coordinated
Python release", but both have been rejected. At least, it seems like
most people agreed that having a CI to get notified of broken projects
would be useful.
We are updating the future Fedora 33 to replace Python 3.8 with
Python 3.9. We are using a tool called "COPR" which is like a sandbox
and can be seen as the CI discussed previously. It rebuilds Fedora
using Python 3.9 as /usr/bin/python3 (and /usr/bin/python!). We now
have a good overview of broken packages and of which incompatible
changes broke them:
- Describes the Fedora change.
- Has package failures. Some packages fail because of broken dependencies.
- Has open Python 3.9 bug reports for Fedora packages. Some problems
have already been fixed upstream before reaching Fedora; most are only
fixed when the Fedora maintainers report the problems back to upstream.
Right now, there are 150+ packages broken by Python 3.9 incompatible changes.
== Maintenance burden ==
Many Python projects have not yet removed Python 2 support and Python
2 CI. It's not that they would be in the "we will not drop Python 2
support ever" camp, it's just that they have not done it yet. Removing
Python 2 compatibility code from the codebase and removing it from the
documentation and metadata and CI is a boring task, doesn't bring
anything to users, and it might take a new major release of the
library. At this point, we are very early in 2020 to expect most
projects to have already done this.
At the same time, we would like them to support Python 3.9 as soon in
the release cycle as possible. By removing Python 2 compatibility
layers from the 3.9 standard library, we are forcing project
maintainers to re-invent their own compatibility layers and copy-paste
stuff like this around. Example:

try:
    from collections.abc import Sequence
except ImportError:
    # Python 2.7 doesn't have collections.abc
    from collections import Sequence
If instead we remove collections.Sequence in 3.10, projects will face
this decision early in 2021 and will simply fix their code by adding
".abc" in the proper places, without needing any more compatibility
layers. Of course, there will be projects that still declare Python 2
support in 2021, but arguably not that many.
While it's certainly tempting to have "more pure" code in the standard
library, maintaining the compatibility shims for one more release
isn't really that big of a maintenance burden, especially when
comparing with dozens (hundreds?) of third party libraries essentially
maintaining their own.
A good example of a broken package is the nose project, which is no
longer maintained (according to its website): the last release was in
2015. It remains a very popular test runner. According to
libraries.io, it has 3 million downloads per month, 41.7K dependent
repositories and 325 dependent packages. We patched nose in Fedora to
fix Python 3.5, 3.6, 3.8 and now 3.9 compatibility issues. People
installing nose from PyPI with "pip install" get the unmaintained
flavor, which is currently completely broken on Python 3.9.
Someone should take over the nose project and maintain it again, or
every single project using nose should pick another tool (unittest,
nose2, pytest, whatever else). Both options will take a lot of time.
== Request to revert some incompatible changes ==
Incompatible changes which require "if <python3>: (...) else: (...)"
or "try: <python3 code> except (...): <python2 code>":
* Removed tostring/fromstring methods in array.array and base64 modules
* Removed collections aliases to ABC classes
* Removed fractions.gcd() function (which is similar to math.gcd())
* Removed the "U" mode of open(): having to use io.open() just for
Python 2 makes the code uglier
* Removed old plistlib API: 2.7 doesn't have the new API
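As an illustration of the kind of shim this forces, here is a minimal
sketch (not taken from any particular project) of how code supporting
both 2.7 and 3.9 has to handle the array.array rename:

```python
import array

a = array.array('b', b'abc')
try:
    data = a.tobytes()   # Python 3 name; tostring() is removed in 3.9
except AttributeError:
    data = a.tostring()  # Python 2.7 only has tostring()

assert data == b'abc'
```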
== Kept incompatible changes ==
Ok-ish incompatible changes (mostly only affect compatibility
libraries like six):
* _dummy_thread and dummy_threading modules removed: this broke the
six, nine and future projects. six and nine are already fixed for 3.9.
OK incompatible changes (can be replaced by the same code on 2.7 and 3.9):
* isAlive() method of threading.Thread has been removed:
Thread.is_alive() method is available in 2.7 and 3.9
* xml.etree.ElementTree.getchildren() and
xml.etree.ElementTree.getiterator() methods are removed from 3.9, but
list()/iter() works in 2.7 and 3.9
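For example, the thread-liveness check can be written once in a way
that runs unchanged on both versions:

```python
import threading

t = threading.Thread(target=lambda: None)
t.start()
t.join()

# Thread.is_alive() exists on both 2.7 and 3.9;
# the isAlive() alias removed in 3.9 is never needed.
assert t.is_alive() is False
```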
== Call to check for DeprecationWarning in your own projects ==
You must pay attention to DeprecationWarning in Python 3.9: it will
be the last "compatibility layer" release; the incompatible changes
will be reapplied in Python 3.10.
For example, you can use the development mode to see
DeprecationWarning and ResourceWarning: use the "-X dev" command line
option or set the PYTHONDEVMODE=1 environment variable. Or you can set
the PYTHONWARNINGS=default environment variable to display all
warnings.
You might even want to treat all warnings as errors to ensure that you
don't miss any when you run your test suite in your CI. You can use
PYTHONWARNINGS=error, and combine it with PYTHONDEVMODE=1.
Warnings filters can be used to ignore warnings in third party code;
see the warnings module documentation.
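A minimal sketch of this setup inside a test suite (the third-party
module name below is a placeholder):

```python
import warnings

# Escalate DeprecationWarning to an error so the test suite fails on it,
# but keep ignoring warnings raised from third-party code:
warnings.simplefilter("error", DeprecationWarning)
warnings.filterwarnings("ignore", category=DeprecationWarning,
                        module="some_third_party_pkg")  # placeholder name

try:
    warnings.warn("old API", DeprecationWarning)
    caught = None
except DeprecationWarning as exc:
    caught = exc

# The warning issued from our own code was escalated to an error:
assert str(caught) == "old API"
```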
-- Victor Stinner and Miro Hrončok for Fedora
First of all I want to say that I'm very much in favour of the
general idea behind PEP 558. Defined semantics are always better than
undefined behaviour.
However, I think there are a few changes needed:
1. Don't add anything to the C API, please.
Frame attributes can be accessed via `PyObject_GetAttr[String]`.
2. Don't make the behaviour dependent on whether "tracing" is turned on.
Doing so forces debuggers to use sys.settrace, which is horribly
slow. It also makes the implementation more complex, and has no benefit.
3. Don't store write-through proxies in the frame, but make proxies
retain a reference to the frame. This would reduce the size and
complexity of code for handling frames. Clean up of the frame would
occur naturally via reference count when all proxies have been reclaimed.
The proposed implementation is hard to reason about and I am not
confident that it will not introduce some new subtle bugs to replace the
ones it seeks to remove.
Any implementation that has functions with "Borrow" and "BreakCycles" in
their names makes me nervous.
A simpler conceptual model, which I believe could be made reliable, is this:
No change for non-function frames (as PEP 558 currently proposes).
Each access to `frame.f_locals` (for function frames) should return a
new proxy object.
This proxy object would have dict-like, write-through semantics for
variables in the frame. For keys that do not match variable names, an
exception would be raised. This means that all proxies for a single
frame will have value equivalence; object equivalence is not needed.
I.e. for a frame `f`, `f.f_locals == f.f_locals` would be True, even
though `f.f_locals is f.f_locals` would be False.
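A toy sketch of these semantics (the class name and the shared-dict
storage are hypothetical stand-ins, not the PEP's implementation):

```python
class FrameLocalsProxy:
    """Hypothetical write-through proxy over a frame's local variables."""

    def __init__(self, frame_vars):
        self._vars = frame_vars  # shared dict standing in for the frame

    def __getitem__(self, key):
        return self._vars[key]

    def __setitem__(self, key, value):
        # Keys that do not match variable names raise an exception:
        if key not in self._vars:
            raise KeyError("%r is not a local variable" % (key,))
        self._vars[key] = value

    def __eq__(self, other):
        # Value equivalence: two proxies over the same frame compare equal.
        return isinstance(other, FrameLocalsProxy) and self._vars == other._vars

shared = {"x": 1}
p1, p2 = FrameLocalsProxy(shared), FrameLocalsProxy(shared)
assert p1 == p2          # value equivalence
assert p1 is not p2      # distinct proxy objects per f_locals access
p1["x"] = 2
assert p2["x"] == 2      # writes go through to the shared frame storage
```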
Hi! I'm forwarding this on behalf of Marina Moore https://github.com/mnm678 .
- Sumana Harihareswara
PEP 458 ( https://www.python.org/dev/peps/pep-0458/ ) proposes using The Update Framework (TUF) to allow users of PyPI to verify that the packages they install originate from PyPI. Implementing this PEP would provide protection in the event of an attack on PyPI, its mirrors, or the network used to install packages.
We started this PEP in 2013, and have recently revised it and restarted discussion.
Recent discussion and revision of the PEP has been taking place on Discourse ( https://discuss.python.org/t/pep-458-secure-pypi-downloads-with-package-sig… ).
The PEP is ready for review and I look forward to your feedback!
PEP 458 coauthor
It's finally time to schedule the last releases in Python 2's life. There will be two more releases of Python 2.7: Python 2.7.17 and Python 2.7.18.
Python 2.7.17 release candidate 1 will happen on October 5th followed by the final release on October 19th.
I'm going to time Python 2.7.18 to coincide with PyCon 2020 in April, so attendees can enjoy some collective catharsis. We'll still say January 1st is the official EOL date.
Thanks to Sumana Harihareswara, there's now a FAQ about the Python 2 sunset on the website: https://www.python.org/doc/sunset-python-2/
(Apologies. Not sure where to ask this, and I'm not much of a C++
programmer. Maybe I should have just added a comment to the still-open
issue.)
I just noticed that Nick migrated the guts of Include/frameobject.h to
Include/cpython/frameobject.h. It's not clear to me that the latter
should be #include'd directly from anywhere other than
Include/frameobject.h. If that's the case, does the extern "C" stuff
still need to be replicated in the lower level file? Won't the scope
of the extern "C" block in Include/frameobject.h already be active at
the point of the nested #include?
Whatever the correct answer is, I suspect the same constraints should
apply to all Include/cpython/*.h files.
I may be thinking in the wrong direction, but right now
`PyType_Type.tp_new` resolves the `metaclass` from the bases and then
calls:

    type = (PyTypeObject *)metatype->tp_alloc(metatype, nslots);

where `metatype` is actually resolved from the metatype of the bases.
In contrast, `PyType_FromSpecWithBases` immediately calls:

    res = (PyHeapTypeObject*)PyType_GenericAlloc(&PyType_Type, 0);
So I am curious whether `PyType_FromSpecWithBases` should not do the
same thing, or as to why it does not?
I would also assume that it actually fails to inherit a possible
MetaClass completely, but I have not checked that.
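For comparison, the class statement (which goes through
`PyType_Type.tp_new`) does resolve the metaclass from the bases, as
this pure-Python illustration shows:

```python
class Meta(type):
    """A trivial metaclass."""

class Base(metaclass=Meta):
    pass

# Creating a subclass resolves the metaclass from the bases,
# so Derived is an instance of Meta without naming Meta again:
class Derived(Base):
    pass

assert type(Base) is Meta
assert type(Derived) is Meta
```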
That is the first question. The second, more important one, is whether
ExtensionMetaClasses are a thing at all? Is it reasonable to explore
them, or should I rather give up and use something more like ABCMeta,
which stores information in the `HeapType->tp_dict`, plus tag on
information to the actual instances as needed?
Right now it seems completely fine to me, except that the creation
itself is complicated (and confusing, but that's MetaClasses...).
I am exploring this for NumPy. PySIDE is using such things and has
wrangled them into the limited API (PySIDE additionally needs to
store a Qt pointer). When I looked at the code, I was fairly sure
that this only happened to work because Python allocates a slightly
larger space (effectively making it `nslots+1`, or 1 above).
As far as I can see, the only thing that happens if I use such an
ExtensionMetaClass is that I have a HeapType with a different
tp_basicsize. And I do not see why that should be any different than
a normal Python object.
The Steering Council has considered Guido’s request for pronouncement on PEP 613 - Explicit Type Aliases, written by Shannon Zhu and sponsored by Guido van Rossum.
The Council unanimously accepts PEP 613. Congratulations!
We’ll leave it to the sponsor and author to update the PEP accordingly.
-Barry (on behalf of the Python Steering Council)