As a new Twine maintainer I've been running into questions like:
* Now that Warehouse doesn't use "register" anymore, can we deprecate it from distutils, setuptools, and twine? Are any other package indexes or upload tools using it? https://github.com/pypa/twine/issues/311
* It would be nice if Twine could depend on a package index providing an HTTP 201 response in response to a successful upload, and fail on 200 (a response some non-package-index servers will give to an arbitrary POST request).
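To make the 201-vs-200 distinction concrete, here is a minimal, hypothetical sketch of the check an upload tool could perform after POSTing a file (the function name and messages are mine, not Twine's):

```python
def check_upload_status(status_code):
    """Treat only 201 Created as a confirmed upload.

    A bare 200 OK is what many non-index servers return to an
    arbitrary POST, so flag it as suspicious rather than reporting
    success.
    """
    if status_code == 201:
        return "uploaded"
    if status_code == 200:
        raise RuntimeError(
            "server answered 200 OK; this may not be a package index")
    raise RuntimeError("upload failed with HTTP %d" % status_code)
```

With a rule like this, an upload tool pointed at a misconfigured URL fails loudly instead of silently reporting success.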
I do not see specifications to guide me here, e.g., in the official guidance on hosting one's own package index https://packaging.python.org/guides/hosting-your-own-index/ . PEP 301 was long enough ago that it's due an update, and PEP 503 only concerns browsing and download, not upload.
I suggest that I write a PEP specifying an API for uploading to a Python package index. This PEP would partially supersede PEP 301 and would document the Warehouse reference implementation. I would write it in collaboration with the Warehouse maintainers who will develop the reference implementation per pypa/warehouse/issues/284 and maybe add a header referring to compliance with this new standard. And I would consult with the maintainers of packaging and distribution tools such as zest.releaser, flit, poetry, devpi, pypiserver, etc.
Per Nick Coghlan's formulation, my specific goal here would be close to:
> Documenting what the current upload API between twine & warehouse actually is, similar to the way PEP 503 focused on describing the status quo, without making any changes to it. That way, other servers (like devpi) and other upload clients have the info they need to help ensure interoperability.
Since Warehouse is trying to redo its various APIs in the next several months, I think it might be more useful to document and work with the new upload API, but I'm open to feedback on this.
After a little conversation here on distutils-sig, I believe my steps would be:
1. start a very early PEP draft with lots of To Be Determined blanks, submit as a PR to the python/peps repo, and share it with distutils-sig
2. ping maintainers of related tools
3. discuss with others at the packaging sprints https://wiki.python.org/psf/PackagingSprints next week
4. revise and get consensus, preferably mostly on this list
5. finalize the PEP and get it accepted by the BDFL-Delegate
6. coordinate with PyPA, maintainers of `distutils`, maintainers of packaging and distribution tools, and documentation maintainers to implement PEP compliance
Thoughts are welcome. I originally posted this at https://github.com/pypa/packaging-problems/issues/128 .
I am fairly sure that if you give the PyPA that suggestion, they will just deflate at the thought of the workload. Besides, we already offer private repos for free, in several ways ranging from devpi to python -m SimpleHTTPServer in a specially created directory.
From: Python-ideas <python-ideas-bounces+tritium-list=sdamon.com(a)python.org> On Behalf Of Nick Humrich
Sent: Wednesday, April 4, 2018 12:26 PM
Subject: [Python-ideas] Pypi private repo's
I am sure this has been discussed before, and this might not even be the best place for this discussion, but I just wanted to make sure this has been thought about.
What if pypi.org supported private repos at a cost, similar to npm?
This would be able to help support the cost of pypi, and hopefully make it better/more reliable, thus in turn improving the python community.
If this discussion should happen somewhere else, let me know.
The weekend of October 27-28, simultaneously in London, UK and New York
City, USA, Bloomberg will host a Python packaging and distribution tools
event. Please mark your calendars!
If you live in North America or Europe and would need assistance to
attend this as a mentor/helper, watch for more details in July.
If you live outside of the US or UK and would need an invitation letter
to get a visa to travel to one of these sprints, please write to Kevin
P. Fleming at Bloomberg, kpfleming AT bloomberg DOT net, and he'll start
setting you up.
Thanks to Bloomberg for their generosity. They're already a Platinum PSF
sponsor, and they'll host this, pay for a maintainers'/mentors' dinner
the night before, provide clusters of cloud virtual machines for the
attendees to use, and book and pay for some contributors' lodging.
This'll be an opportunity to advance Python packaging/distro tools,
teach new contributors (including many Bloomberg employees), and yeah,
if you want to get to know Bloomberg for career reasons, that too. :)
We hope mentors can arrive Thursday night 25 Oct, do prep, setup, and
dinner on Friday, then participate Sat-Sun, then leave Sunday evening or
Monday.
We'll be putting more details on these lists (distutils-sig and
pypa-dev) and at https://wiki.python.org/psf/PackagingSprints .
Thanks to Bloomberg folks Mario Corchero and Henry Kleynhans in London
and Kevin P. Fleming in New York City for coordinating this, and thanks
especially to Mario and to Paul Ganssle for suggesting it!
Today, LWN published my new article "A new package index for Python".
https://lwn.net/Articles/751458/ In it, I discuss security, policy, UX
and developer experience changes in the 15+ years since PyPI's founding,
new features (and deprecated old features) in Warehouse, and future
plans. Plus: screenshots!
If you aren't already an LWN subscriber, you can use this subscriber
link for the next week to read the article despite the LWN paywall.
This summary should help occasional Python programmers -- and frequent
Pythonists who don't follow packaging/distro discussions closely --
understand why a new application is necessary, what's new, what features
are going away, and what to expect in the near future. I also hope it
catches the attention of downstreams that ought to migrate.
Warehouse project manager
In PEP 518, it is not clearly specified how a project that has a
pyproject.toml file but no build-system.requires key should be treated.
In pip 10, such a pyproject.toml file was allowed and built with setuptools
and wheel, which has resulted in a lot of projects making releases that
assume such a pyproject.toml file is valid and that setuptools and wheel
will be used. I understand that at least pytest, towncrier and Twisted
might have done so. This happened since these projects have included
configuration for some tools in pyproject.toml (some of which use only
pyproject.toml for configuration).
There's a little bit of subtlety here in pip 10's implementation: adding a
pyproject.toml file enables a new code path that does the build in isolation
(in preparation for PEP 517; it's a good idea on its own too) with only the
build-system.requires packages available. When build-system.requires
is missing, pip falls back to assuming it should be ["setuptools", "wheel"].
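That fallback can be sketched as follows (a simplification for illustration, not pip's actual code; the function name is mine):

```python
def effective_build_requires(pyproject):
    """Return the requirements used for an isolated build.

    pyproject: the parsed pyproject.toml as a dict, or None when the
    file is absent entirely.  Returns None for the legacy
    (non-isolated) code path.
    """
    if pyproject is None:
        return None  # no pyproject.toml: legacy behavior, no isolation
    build_system = pyproject.get("build-system", {})
    # Missing build-system.requires: assume setuptools + wheel.
    return build_system.get("requires", ["setuptools", "wheel"])
```

The middle branch, where the file exists but the key does not, is exactly the case the PEP leaves unspecified.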
The in-development version of pip currently prints warnings when the key is
not specified -- along the lines of "build-system.requires is missing" +
"A future version of pip will reject pyproject.toml files that do not comply
with PEP 518." and falls back to legacy behavior.
Basically, pip 10 has a distinction between a missing pyproject.toml and
build-system.requires = ["setuptools", "wheel"], and the PEP doesn't.
Clarifying the PEP's precise wording here would help inform the debate about
how pip should behave in this edge case.
I can think of at least 2 options for behavior when build-system.requires is
missing:
1. Consider a missing build-system.requires equivalent to either a missing
pyproject.toml or build-system.requires = ["setuptools", "wheel"].
2. Make the build-system table mandatory in pyproject.toml.
I personally think (2) would be fine -- "Explicit is better than implicit."
It'll be easy to detect and error out in this case, in a way that makes it
possible to provide meaningful information to the user about what to do here.
However, this does mean that some existing releases of projects become
non-compliant, which is concerning; I do think the benefits outweigh the
costs though.
Thoughts on this?
I had a thought for something that might be a simple way to improve
dev experience with custom build backends.
A PEP 517 build backend is a Python object that has some special
methods on it. And the way a project picks which object to use, is via
build-backend = "module1.module2:object"
Currently, this means that the build backend is the Python object
module1.module2.object.
Here's my idea: what if we change it, so that the above config is
interpreted as meaning that the build backend is the Python object
module1.module2.object.__build_backend__?
(I.e., we tack a "__build_backend__" on the end before looking it up.)
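A rough sketch of the proposed lookup rule (my own illustration; `load_backend` is not an existing pip or PEP 517 function):

```python
import importlib


def load_backend(spec):
    """Resolve a 'module.path' or 'module.path:object.path' backend
    string, then tack __build_backend__ on the end, per the proposal."""
    module_path, _, object_path = spec.partition(":")
    obj = importlib.import_module(module_path)
    for name in filter(None, object_path.split(".")):
        obj = getattr(obj, name)
    return obj.__build_backend__
```

Under this rule, `build-backend = "flit"` would resolve to `flit.__build_backend__`, so the project controls where the hooks actually live.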
Why does this matter? Well, with the current system, if you want to
use flit as your build backend, you have to write:
build-backend = "flit.buildapi"
And if you want to use intreehooks, you have to write:
build-backend = "intreehooks:loader"
These names are slightly awkward, because these projects don't want to
just jam all the PEP 517 methods directly onto the top-level module
object, so they each have to invent some ad hoc sub-object to put the
methods on. And then that's exposed to all their users as a bit of
random cruft you have to copy-paste.
The idea of __build_backend__ is that these projects could rename the
'buildapi' and 'loader' objects to be '__build_backend__' instead, and
then users could write:
build-backend = "flit"
build-backend = "intreehooks"
build-backend = "setuptools"
and it just feels nicer.
Right now PEP 517 is still marked provisional, and pip hasn't shipped
support yet, so I think changing this is still pretty easy. (It would
mean a small amount of work for projects like flit that have already
published a backend object.)
What do you think? (Thomas, I'd love your thoughts in particular :-).)
Nathaniel J. Smith -- https://vorpus.org
I recently stumbled into a worrying problem with pip. I found out that
doing "pip install pusher requests" installs urllib3 v1.23 as a
dependency even though requests specifically restricts the version to
lower than 1.23. Then if instead I do "pip install requests pusher" it
installs urllib3 v1.22 as expected. As I recall, pip has long had a
problem with combining version specifiers and extras when the same
target has been required from multiple sources. What I wanted to ask
was, is this a simple bug, or a larger unresolved design problem?
Should pip also take into consideration the requirements of already-
installed packages, so it won't end up installing upgrades that conflict
with them?
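The order dependence described above is easy to reproduce with a toy model of a non-backtracking resolver (this illustrates the failure mode only; it is not pip's actual code):

```python
def naive_resolve(install_order, pins):
    """Toy first-come-wins resolver.

    install_order: top-level packages in command-line order.
    pins: package -> {dependency: version it would pick}.
    The first version chosen for a dependency wins; later, conflicting
    choices are silently ignored -- the shape of the reported bug.
    """
    resolved = {}
    for pkg in install_order:
        for dep, version in pins.get(pkg, {}).items():
            resolved.setdefault(dep, version)
    return resolved


pins = {"pusher": {"urllib3": "1.23"}, "requests": {"urllib3": "1.22"}}
# "pip install pusher requests" vs "pip install requests pusher":
naive_resolve(["pusher", "requests"], pins)  # urllib3 pinned to 1.23
naive_resolve(["requests", "pusher"], pins)  # urllib3 pinned to 1.22
```

A real resolver would have to intersect all specifiers for urllib3 before choosing a version, regardless of argument order.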
Are custom installation commands in setup.py no longer respected by setuptools? For example, the pybind11 project has a custom InstallHeaders command class in its setup.py, which is passed to the setup() call.
When setup is imported from setuptools, the custom command class never gets invoked. When setup is imported from distutils.core, the custom command class is invoked.
What's the reason for the disparity - can someone please enlighten me?
I just uploaded python-gnupg 0.4.3 to PyPI using Twine. Search still shows the previous version:
https://pypi.org/search/?q=python-gnupg => 0.4.2
However, clicking on the link brings up the page for the latest version:
https://pypi.org/project/python-gnupg/ => 0.4.3
But pip install is also wrongly picking up 0.4.2. What's the expected delay between uploading a new version and having it be available via pip? I would have expected more or less immediately. All systems are showing as operational.