Two researchers studied PyPI's sustainability to understand problems
that affect FOSS infrastructure sustainability in general. On Thursday
20 Aug (2:30 - 3:45 PM EDT), one of the study's authors will speak at
a Ford Foundation event about FOSS sustainability:
You can read the researchers' report at
. Key section of the summary:
> These results suggest that different strategies are needed for addressing the non-technical capacities of digital FOSS infrastructure projects. While capacity for things such as documentation, outreach, project management, design, legal work, etc. are often acknowledged as needs within FOSS communities at-large, they are rarely addressed proactively. Rather, they are often addressed only when a project is in crisis.
I also draw your attention to the table of tensions between "FOSS
culture" and "Infrastructure culture" on page 7, and to the
misperception that capacity-building slows a project down (pp. 6-7).
(Disclosure: I was a consultant on this analysis.)
Running

    pip install <package>

often results in compiling (using gcc, g++, or similar) to produce a
binary. Usually that proceeds without issue. However, there seems to
be no check that the libraries required to link that binary are
already present on the system. Or at least, the message produced when
they are missing is not at all clear about what is absent.
I discovered that today after wasting several hours figuring out why
scanpy-scripts was failing to build its dependency "louvain", which
would not install into a venv with pip. It had something to do with
"igraph", but pip had downloaded python-igraph before it got to
louvain. When louvain tried to build, there was a mysterious message
about pkg-config and igraph:

    Cannot find the C core of igraph on this system using pkg-config.
(Note that when python-igraph installs, it places an igraph directory
in site-packages, so which igraph the message refers to is fairly
ambiguous.)
Then it tried to install a different version number of igraph, failed,
and the whole install failed. This was very confusing because the
second igraph install was not (it turned out) a different version of
python-igraph but a system-level igraph library, which it could not
install either, because the process was unprivileged and could not
write to the target directories. Yet it tried to install anyway.
This is discussed in the louvain documentation here (it turns out):
but since I was actually trying to install a different package, of
course I had not read the louvain documentation.
In short form the problem was "cannot build a binary because required
library libigraph.so is not present in the operating system" but that
was less than obvious in the barrage of warnings and error messages.
Is it possible to tell pip or setup.py to fail immediately when a
required system library like this is not found, here presumably after
that "C core" message, rather than confusing the matter further with
a failed partial build and install of the same component?
More generally, is there anything in the Python installation machinery
that could list system libraries as dependencies and give a more
informative error message when they are missing?
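Lacking a built-in mechanism, one workaround today is for a package's
setup.py to probe for the system library before any compilation
starts, so the build fails with one clear message instead of a barrage
of compiler errors. A minimal sketch, assuming pkg-config is how the
library advertises itself (the library name and error wording here are
illustrative, not part of any standard pip/setuptools API):

```python
import shutil
import subprocess


def pkgconfig_lib_present(name):
    """Return True if pkg-config can locate the given C library."""
    if shutil.which("pkg-config") is None:
        return False  # pkg-config itself is not installed
    result = subprocess.run(
        ["pkg-config", "--exists", name],
        capture_output=True,
    )
    return result.returncode == 0


def require_system_lib(name):
    """Abort the build early with a clear message if `name` is absent."""
    if not pkgconfig_lib_present(name):
        raise SystemExit(
            f"error: system library {name!r} not found via pkg-config; "
            f"install its development package (e.g. libigraph-dev) and retry."
        )


# In setup.py, before setup() declares any Extension modules:
# require_system_lib("igraph")
```

This only helps packages that choose to do it, of course; it does not
make pip itself aware of system-level dependencies.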
Because of COVID-19 I am working a lot from home, so my network is
less robust than usual. I was wondering what happens if I upload to
PyPI with twine and my network fails halfway through. Is the uploaded
filename remembered, so that I cannot retry when my network comes
back? Or is the upload somehow atomic, so that it has to complete (and
be checked) before the file is entered into PyPI?
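For what it's worth, after a failed upload one can check whether the
file actually made it into the index by fetching PyPI's JSON API
(https://pypi.org/pypi/<name>/json) and looking for the filename under
the release in question. A sketch of that lookup on already-fetched
data (the response fragment below is made up for illustration):

```python
def filename_on_pypi(release_data, version, filename):
    """Check whether `filename` is listed for `version` in a parsed
    PyPI JSON API response (the dict returned by /pypi/<name>/json)."""
    files = release_data.get("releases", {}).get(version, [])
    return any(entry.get("filename") == filename for entry in files)


# Made-up response fragment for illustration:
data = {
    "releases": {
        "1.0": [{"filename": "example-1.0-py3-none-any.whl"}],
    }
}
print(filename_on_pypi(data, "1.0", "example-1.0-py3-none-any.whl"))  # True
print(filename_on_pypi(data, "1.0", "example-1.0.tar.gz"))            # False
```

If the filename is absent, the upload can simply be retried; twine's
--skip-existing flag also lets you re-run `twine upload dist/*`
without erroring on files that were already accepted.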
TL;DR: OK to archive this mailing list? Reply by Aug 30th.
Below: Context, Proposal, Reasoning, and timeline.
For multiple years now, we've been advising users to use setuptools and
to move away from using distutils. There has been a long-standing
plan [^1] to vendor distutils into setuptools to allow it to evolve
independently of the Python standard library and to allow for removal of
distutils from the Python standard library.
Recently, setuptools adopted the distutils codebase [^2] which, as far
as I can tell, starts the long process of removing distutils from the
Python
standard library. PEP 517 [^3] also removed the special status of
distutils/setuptools as the only pipeline that can be used for
generating distributions for Python projects.
For some time now, "distutils" has not been a "primary" tool in the
broader Python Packaging ecosystem, with setuptools being an overall
superior tool that also has a good interoperability story with the other
Python packaging tooling. This is acknowledged in the description of
this mailing list as:
> Now, it's better described as the "packaging interoperability SIG",
> where issues that cut across different parts of the Python packaging
> ecosystem get discussed and resolved.
However, this mailing list is no longer serving this stated role either,
with the Packaging category on discuss.python.org becoming the primary
location for packaging tool interoperability discussions.
Over the last year, the Packaging category on discuss.python.org had 841
active topics, with only 40 topics with 3 or fewer responses. [^5]
In the last 100 days, the Packaging category on discuss.python.org has
had 91 active topics. More than 10 PEPs have been discussed in the
Packaging category on discuss.python.org in the last 100 days.
Over the last year, distutils-sig had ~109 active threads, with
(based on a quick skim) most having 3 or fewer responses/posters. [^4]
In the last 100 days, distutils-sig has had 32 active threads (at least
7 of these have the same subject as another thread with Re:/Fwd: added).
There has been only 1 PEP-related feedback discussion on distutils-sig
in the last year. Most of the other threads are user support requests
or announcements.
I suggest that, one month from now, we stop posting to this list
(distutils-sig(a)python.org) and archive it.
I think we do not use this mailing list for its dedicated purpose, and
it does not serve any secondary function that isn't better served by a
different communication channel already.
(1) this mailing list is no longer the primary location for
packaging interoperability discussions,
(2) we have better channels for user support requests (such as
issue trackers of various projects, packaging-problems etc.)
(3) we have other channels for making announcements (such as the
pypi-announce mailing list, the PyPA Twitter account, etc.)
Here's what I suggest, and what I will carry out if there is no
sustained objection:
In one month, on August 30th, I would verify that no one has argued here
for why this mailing list should not be closed/archived.
Or, even if a few people have objected to closing the list, I would
check for rough consensus, especially of people who are doing SOMETHING
productive having to do with Python Packaging (such as maintainers of
Python packages, maintainers of Python Packaging tooling, folks running
key infrastructure, etc.).
Then, I would ask the list administrators to post a final message
to this list (marking its closure and suggesting that people use
discuss.python.org instead) and to archive this mailing list. This
would leave the archives available at their current URLs, so links,
browsing and search would still work.
And finally, I would look through relevant documentation within PyPA
repositories to see what needs updating (READMEs and so on pointing to
the old list), and submit pull requests.
I appreciate the work folks here have done to carry forward Python
packaging over the past 21 years of this mailing list. I don't mean to
diminish that or to insult anyone here. I want to help us out, and I
think closing this list will help focus our energy better. But I am
open to hearing that I am wrong.
While packaging psycopg3, I'm wondering what the best way is to
provide an optional optimisation module.
1) provide a separate psycopg3-c distribution
2) provide an extra psycopg3[c]
3) try building the extension and fail quietly.
1) seems the cleanest approach: the psycopg3 distribution would have
no build-time external dependency (Cython, -dev packages, a compiler)
and psycopg3-c can fail hard if some of these dependencies are
missing. I am currently trying this approach, and finding some
problems in working out a good file layout to have two setup.py files
in the same git repository.
2) would be nice, but I don't see a way to identify, at build time,
which extra was requested, in order to implement a build_ext command
that does nothing unless "c" is among the extras. It seems that extras
are designed for a different use case, not for optional build-time
components.
3) would give me endless headaches: working out why something failed,
differentiating failures due to missing dependencies from real errors,
and dealing with user reports.
What would be your advice? Press on with 1 or a different approach?
Examples are welcome (the only one I have in mind is PyYAML doing 3).
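For reference, the runtime side of approach 1 (a separate psycopg3-c
distribution) usually pairs with an import-time fallback in the
pure-Python package, which is essentially the PyYAML pattern mentioned
above. A sketch, where the module names are hypothetical:

```python
import importlib


def load_impl(accelerated, fallback):
    """Import the C-accelerated module if it is installed, otherwise
    fall back to the pure-Python implementation."""
    try:
        return importlib.import_module(accelerated)
    except ImportError:
        return importlib.import_module(fallback)


# E.g., in psycopg3/__init__.py (names are illustrative):
# _impl = load_impl("psycopg3_c", "psycopg3.pure")
```

This keeps the failure mode explicit: psycopg3-c either installs and
is picked up, or it is simply absent and the pure-Python path is used,
with no half-built extension to debug.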
On behalf of the PyPA, I am pleased to announce that we have just released
pip 20.2, a new version of pip. You can install it by running python -m pip
install --upgrade pip.
The highlights for this release are:
- The beta of the next-generation dependency resolver is available
- Faster installations from wheel files
- Improved handling of wheels containing non-ASCII file contents
- Faster pip list using parallelized network operations
- Installed packages now contain metadata about whether they were
requested by the user (PEP 376’s REQUESTED file)
The new dependency resolver is *off by default* because it is *not yet
ready for everyday use*. The new dependency resolver is significantly
stricter and more consistent when it receives incompatible instructions,
and reduces support for certain kinds of constraints files, so some
workarounds and workflows may break. Please test it with the
--use-feature=2020-resolver flag. Please see our guide on how to test
and migrate, and how to report issues.
We are preparing to change the default dependency resolution behavior and
make the new resolver the default in pip 20.3 (in October 2020).
This release also partially optimizes pip's network usage during
installation (as part of a Google Summer of Code project by McSinyx).
Please test it with pip install --use-feature=fast-deps ... and report
bugs to the pip issue tracker. This functionality is *still
experimental* and *not ready for everyday use*.
You can find more details (including deprecations and removals) in the
changelog.
As with all pip releases, a significant amount of the work was contributed
by pip’s user community. Huge thanks to all who have contributed, whether
through code, documentation, issue reports and/or discussion. Your help
keeps pip improving, and is hugely appreciated.
Specific thanks go to Mozilla (through its Mozilla Open Source Support
<https://www.mozilla.org/en-US/moss/> Awards) and to the Chan Zuckerberg
Initiative <https://chanzuckerberg.com/eoss/> DAF, an advised fund of
Silicon Valley Community Foundation, for their funding that enabled
substantial work on the new resolver.