[Python-Dev] Preserving the definition order of class namespaces.

Nick Coghlan ncoghlan at gmail.com
Wed May 27 14:40:53 CEST 2015

On 27 May 2015 at 19:02, Antoine Pitrou <solipsis at pitrou.net> wrote:
> At some point, we should recognize our pain is more important than
> others' when it comes to the fitness of *our* community. I don't see
> those other people caring about our pain, and proposing e.g. to offload
> some of the maintenance burden (for example the 2.7 LTS maintenance
> burden, the maintenance of security branches).

Sure we care; corporate politics and business cases are just complex
beasts to wrangle when it comes to justifying funding for upstream
contributions. The main problem I personally have in pushing for
increased direct upstream investment at the moment is that we're still
trying to get folks off Python *2.6*, provide a smoother transition
plan for the Python 2.7 network security fixes, and similarly get
ready to help customers handle the Python 3 migration, so it's hard
for me to make the case that upstream maintenance is the task most in
need of immediate investment. (We also don't have a well-established
culture of Python users reporting Python bugs to the commercial
redistributor providing their copy of Python, which breaks one of the
key metrics we redistributors rely on for getting upstream
contributions funded appropriately: "only files bugs and feature
requests directly with the upstream project" and "doesn't use the
project at all" unfortunately look identical from a vendor's
perspective.)

Even with those sorts of challenges, Red Hat still covered
implementing the extension module importing improvements in 3.5 (as
well as making it possible for me to take the time for reviewing the
async/await changes, amongst other things), and HP have now taken over
from Rackspace as the primary funder of pypi.python.org development
(and a lot of PyPA development as well). The Red Hat-sponsored CentOS
QA infrastructure is also the back end powering the pre-merge patch
testing Kushal set up. Red Hat and Canonical have also been major
drivers of Python 3 porting efforts for various projects as they've
aimed to migrate both Ubuntu and Fedora to using Python 3 as the
system Python.

Longer term, the best way I know of to get corporations to pick up the
tab for things like 2.7 maintenance is to measure it and publish the
results (as well as better publicising who is releasing products that
depend on it). Paying customers get nervous when open source
foundations are publishing contribution metrics that call into
question a vendor's ability to support their software (even if the
relevant foundation is too polite to point it out explicitly
themselves, just making this kind of data available means that
competing vendors inevitably use it to snipe at each other).

The OpenStack Foundation's stackalytics.openstack.com is a master
class in doing this well, with the Linux Foundation's annual kernel
development report a close second, so I'd actually like to get the PSF
to fund regular contribution metrics for CPython. Setting that up
unfortunately hasn't made it to the top of my todo list yet
(obviously, given it hasn't been done), but I'm cautiously optimistic
about being able to get to it at some point this year.
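Getting a first cut of such metrics doesn't require much
infrastructure either. As a rough illustration (not any specific tool
the PSF or OpenStack use), commit authorship can be tallied by email
domain as a crude proxy for organisational affiliation:

```python
from collections import Counter

def tally_by_domain(author_emails):
    """Tally commit author emails by domain as a crude proxy for
    organisational affiliation. (Personal domains like gmail.com
    will dominate, so this is only a starting point, not a report.)
    """
    counts = Counter()
    for email in author_emails:
        domain = email.rpartition("@")[2].lower()
        counts[domain] += 1
    return counts

# Emails as produced by: git log --format='%ae'
emails = [
    "alice@redhat.com",
    "bob@gmail.com",
    "carol@redhat.com",
    "dave@canonical.com",
]
print(tally_by_domain(emails).most_common(2))
```

The serious versions of these reports (Stackalytics, the kernel
report) layer affiliation databases and review/flag-day data on top of
exactly this kind of raw tally.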

> For some reason it
> sounds like we should be altruistic towards people who are not :-)

Nope, if we're doing things for the benefit of corporations, we should
demand they pay for it if they want it done (or at least be happy that
the value of the free software, services and contributions they're
making available to the community is sufficient to earn them a fair
hearing).
However, even if we personally happen to be sceptical of the
contributions particular corporations have made to the open source
community (or choose to discount any contributions that consist
specifically of bug fix commits to the CPython repo), we don't get to
ignore their *users* so cavalierly, and we've ended up in a situation
where the majority of Python users are 5-7 years behind the progress
of core development at this point.

Donald's PyPI statistics
(https://caremad.io/2015/04/a-year-of-pypi-downloads/) suggest to me
that Linux distributions may need to shoulder a lot of the blame for
that, and assuming I'm right about that, fixing it involves changing
the way Linux distributions work in general, rather than being a
Python-specific problem (hence the various proposals to rethink how
distributions assemble their software stacks, and new approaches to
building Linux distributions, like CoreOS and Project Atomic).
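Donald's numbers come from parsing the installer user-agent strings in
PyPI's download logs. As an illustration of how that kind of
per-version breakdown can be derived (the user-agent format below is
simplified for the example), a sketch:

```python
import re
from collections import Counter

# pip reports the running interpreter in its user agent; real PyPI
# log entries carry more fields, but the implementation/version pair
# is the part a version-share analysis keys on.
USER_AGENT_RE = re.compile(r"CPython/(\d+\.\d+)")

def version_share(user_agents):
    """Return each CPython minor version's share of downloads."""
    counts = Counter()
    for ua in user_agents:
        m = USER_AGENT_RE.search(ua)
        if m:
            counts[m.group(1)] += 1
    total = sum(counts.values())
    if not total:
        return {}
    return {ver: n / total for ver, n in counts.items()}

sample = [
    "pip/6.0.8 CPython/2.6.9",
    "pip/6.0.8 CPython/2.7.9",
    "pip/6.0.8 CPython/2.7.9",
    "pip/6.0.8 CPython/3.4.3",
]
print(version_share(sample))
```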

However, the fact that 2.6 currently still has a larger share of PyPI
downloads than Python 3, and that 2.7 is still by far the single most
popular version, is a problem that we need to be worried about
upstream as well.

That doesn't point towards "we should make the entire standard library
move faster" for me: it points me towards leveraging PyPI to further
decouple the rate of evolution of the ecosystem from the rate of
evolution of the core interpreter and standard library. If
attribute-level granularity sounds too complex (and I agree that such
a scheme would be awkward to work with), then it would also likely be
feasible to expand on the "bundled package" approach that was
pioneered with ensurepip: provide additional projects by default with
CPython, but have them be truly independently versioned, managed using
the standard Python package management tools. (This might even be an
appropriate approach to consider for some existing standard library
modules, such as ssl, unittest, idlelib or distutils.)
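ensurepip already demonstrates the mechanics such "bundled but
independently versioned" modules would need: CPython ships a bundled
wheel, bootstraps it into site-packages on demand, and from then on
the package is upgraded on its own release cadence rather than the
interpreter's. A minimal sketch of the existing pattern:

```python
# ensurepip (stdlib since 3.4) bundles a pip wheel with the
# interpreter release, but the installed copy lives in
# site-packages and evolves independently afterwards.
import ensurepip

# Version of the *bundled* wheel shipped with this interpreter;
# the installed pip may already be newer.
print(ensurepip.version())

# ensurepip.bootstrap()  # installs the bundled wheel if needed
# After bootstrapping, upgrades are decoupled from CPython releases:
#   python -m pip install --upgrade pip
```

A hypothetical independently versioned ssl or unittest would follow
the same shape: a bundled fallback for offline installs, with pip as
the upgrade path.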



Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
