PyPI does not allow duplicate file names -- this makes a lot of sense,
because you really don't want people to go to PyPI one day and grab a file,
and then go there another day, grab a file with exactly the same name, and
have it be a different file.
We are all too human, and we make mistakes when doing a release. All too
often someone pushes a broken file up to PyPI and realizes it pretty quickly
-- before anyone has a chance to even download it (or only the dev team has).
In fact, I was in a sprint last summer, and we decided to push our package
up to PyPI -- granted, we were all careless amateurish noobs, but we ended
up making, I think, four minor version bumps because we had done something
stupid in the sdist.
Also -- the latest numpy release got caught by this, too:
* We ran into a problem with PyPI not allowing reuse of filenames and a
resulting proliferation of *.*.*.postN releases. Not only were the names
getting out of hand, some packages were unable to work with the postN
suffix.
So -- I propose that PyPI allow projects to replace existing files if they
REALLY REALLY want to.
You should have to jump through all sorts of hoops, and make it really
clear that it is a BAD IDEA in the general case, but it'd be good to have
it be possible.
After all -- PyPI does not take on responsibility for anything else about
what's in those packages, and Python itself is all about "we're all
consenting adults here".
I suppose we could even put in some heuristics about how long the file has
been there, how many times it's been downloaded, etc.
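A heuristic like that could be sketched as follows. Everything here (the function name, the thresholds, the whole policy) is invented for illustration; PyPI has no such feature today:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical thresholds: a file may only be replaced while it is
# very young and has barely been downloaded.
GRACE_PERIOD = timedelta(hours=1)
MAX_DOWNLOADS = 10

def may_replace_file(uploaded_at, download_count, now=None):
    """Return True if a just-uploaded file could still be replaced."""
    now = now or datetime.now(timezone.utc)
    young = now - uploaded_at <= GRACE_PERIOD
    barely_seen = download_count <= MAX_DOWNLOADS
    return young and barely_seen
```

The point is just that "roll back a mistake you caught immediately" and "silently swap a year-old file" are easy to distinguish mechanically.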
Just a thought.....I really hate systems that don't let me roll back
mistakes, even when I discover them almost immediately...
Christopher Barker, Ph.D.
Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception
I've got two projects: mynamespace.myprojectA and mynamespace.myprojectB
myprojectB depends on myprojectA. I'm using setuptools 0.6c8 to manage both
projects. Both projects are registered using 'setup.py develop'. Both projects are
accessible from an interactive interpreter:
PS C:\Users\me\projects> python
Python 2.5.2 (r252:60911, Feb 21 2008, 13:11:45) [MSC v.1310 32 bit (Intel)]
Type "help", "copyright", "credits" or "license" for more information.
>>> import mynamespace.myprojectA
>>> import mynamespace.myprojectB
>>> from mynamespace.myprojectA import mymoduleZ
However, when I run 'setup.py test' in myprojectB, the tests fail with
File ".mymoduleZ.py", line NNN, in [some context]
from mynamespace.myprojectA.mymoduleZ import MyClassC
ImportError: No module named myprojectA.mymoduleZ
In setup.py, the test_suite is nose.collector.
I searched and couldn't find anyone else with this problem. Is this a
supported configuration? Is there something I can do to make tests work
with interdependent projects with the same root namespace?
If there's not something obvious I should be doing differently, I'm happy to
put together a minimal test case that reproduces the problem. Any
suggestions are appreciated.
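For reference, the same-namespace layout can be reproduced in a standalone sketch (all paths and names below are invented; this uses pkgutil-style namespace packages, which behave much like the pkg_resources variant setuptools 0.6 used):

```python
import importlib
import os
import sys
import tempfile

# Each project ships its own copy of mynamespace/__init__.py containing
# only the namespace declaration, so the two halves can merge at import time.
NS_INIT = "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n"

root = tempfile.mkdtemp()
for project, module in [("projectA", "mymoduleZ"), ("projectB", "mymoduleY")]:
    pkg = os.path.join(root, project, "mynamespace")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write(NS_INIT)
    with open(os.path.join(pkg, module + ".py"), "w") as f:
        f.write("MARKER = %r\n" % project)
    sys.path.insert(0, os.path.join(root, project))

importlib.invalidate_caches()

# Both submodules import, even though they live in separate directories.
a = importlib.import_module("mynamespace.mymoduleZ")
b = importlib.import_module("mynamespace.mymoduleY")
```

If this works interactively but fails under the test runner, the suspect is usually which copies of the namespace `__init__.py` end up on sys.path during the test run.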
Jason R. Coombs
I am a research programmer at the NYU School of Engineering. My colleagues
(Trishank Kuppusamy and Justin Cappos) and I are requesting community
feedback on our proposal, "Surviving a Compromise of PyPI." The two-stage
proposal can be reviewed online at:
Summary of the Proposal:
"Surviving a Compromise of PyPI" proposes how the Python Package Index
(PyPI) can be amended to better protect end users from altered or malicious
packages, and to minimize the extent of PyPI compromises against affected
users. The proposed integration allows package managers such as pip to be
more secure against various types of security attacks on PyPI and defend
end users from attackers responding to package requests. Specifically,
these PEPs describe how PyPI processes should be adapted to generate and
incorporate repository metadata, which are signed text files that describe
the packages and metadata available on PyPI. Package managers request
(along with the packages) the metadata on PyPI to verify the authenticity
of packages before they are installed. The changes to PyPI and tools will
be minimal by leveraging a library, The Update Framework
<https://github.com/theupdateframework/tuf>, that generates and
transparently validates the relevant metadata.
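As a rough illustration of the client-side check being described, here is a heavily simplified sketch. Real TUF metadata also carries signatures, role delegations, version numbers, and expiry dates, and the dictionary layout below is invented for illustration only:

```python
import hashlib

# Invented stand-in for already signature-verified repository metadata:
# it maps target filenames to their expected length and hashes.
metadata = {
    "targets": {
        "example-1.0.tar.gz": {
            "length": 11,
            "hashes": {"sha256": hashlib.sha256(b"hello world").hexdigest()},
        }
    }
}

def verify_target(metadata, name, data):
    """Check downloaded bytes against the trusted metadata entry."""
    info = metadata["targets"].get(name)
    if info is None:
        raise ValueError("unknown target: %s" % name)
    if len(data) != info["length"]:
        raise ValueError("length mismatch for %s" % name)
    if hashlib.sha256(data).hexdigest() != info["hashes"]["sha256"]:
        raise ValueError("hash mismatch for %s" % name)
    return True
```

A mirror or CDN can serve the bytes, but it cannot alter them without failing this check, because it cannot re-sign the metadata.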
The first stage of the proposal (PEP 458
<http://legacy.python.org/dev/peps/pep-0458/>) uses a basic security model
that supports verification of PyPI packages signed with cryptographic keys
stored on PyPI, requires no action from developers and end users, and
protects against malicious CDNs and public mirrors. To support continuous
delivery of uploaded packages, PyPI administrators sign for uploaded
packages with an online key stored on PyPI infrastructure. This level of
security prevents packages from being accidentally or deliberately tampered
with by a mirror or a CDN because the mirror or CDN will not have any of
the keys required to sign for projects.
The second stage of the proposal (PEP 480
<http://legacy.python.org/dev/peps/pep-0480/>) is an extension to the basic
security model (discussed in PEP 458) that supports end-to-end verification
of signed packages. End-to-end signing allows both PyPI and developers to
sign for the packages that are downloaded by end users. If the PyPI
infrastructure were to be compromised, attackers would be unable to serve
malicious versions of these packages without access to the project's
developer key. As in PEP 458, no additional action is required by end
users. However, PyPI administrators will need to periodically (perhaps
every few months) sign metadata with an offline key. PEP 480 also proposes
an easy-to-use key management solution for developers, how to interface
with a potential build farm on PyPI infrastructure, and discusses the
security benefits of end-to-end signing. The second stage of the proposal
simultaneously supports real-time project registration and developer
signatures, and when configured to maximize security on PyPI, less than 1%
of end users will be at risk even if an attacker controls PyPI and goes
undetected for a month.
We thank Nick Coghlan and Donald Stufft for their valuable contributions,
and Giovanni Bajo and Anatoly Techtonik for their feedback.
PEP 458 & 480 authors.
As a new Twine maintainer I've been running into questions like:
* Now that Warehouse doesn't use "register" anymore, can we deprecate it from distutils, setuptools, and twine? Are any other package indexes or upload tools using it? https://github.com/pypa/twine/issues/311
* It would be nice if Twine could depend on a package index providing an HTTP 201 response in response to a successful upload, and fail on 200 (a response some non-package-index servers will give to an arbitrary POST request).
I do not see specifications to guide me here, e.g., in the official guidance on hosting one's own package index https://packaging.python.org/guides/hosting-your-own-index/ . PEP 301 was long enough ago that it's due an update, and PEP 503 only concerns browsing and download, not upload.
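To make the 201-vs-200 point concrete, here is a sketch of the client-side classification an upload tool could apply. The status-code policy is the proposal here, not documented PyPI behavior, and the function name is invented:

```python
def classify_upload_response(status: int) -> str:
    """Classify an index server's HTTP response to an upload POST."""
    if status == 201:
        return "success"       # explicit Created, per the proposed spec
    if status == 200:
        # Ambiguous: many non-index servers answer 200 to any POST,
        # so a client cannot safely treat this as a completed upload.
        return "ambiguous"
    if 400 <= status < 500:
        return "client-error"  # e.g. duplicate filename, bad credentials
    return "server-error"
```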
I suggest that I write a PEP specifying an API for uploading to a Python package index. This PEP would partially supersede PEP 301 and would document the Warehouse reference implementation. I would write it in collaboration with the Warehouse maintainers who will develop the reference implementation per pypa/warehouse/issues/284 and maybe add a header referring to compliance with this new standard. And I would consult with the maintainers of packaging and distribution tools such as zest.releaser, flit, poetry, devpi, pypiserver, etc.
Per Nick Coghlan's formulation, my specific goal here would be close to:
> Documenting what the current upload API between twine & warehouse actually is, similar to the way PEP 503 focused on describing the status quo, without making any changes to it. That way, other servers (like devpi) and other upload clients have the info they need to help ensure interoperability.
Since Warehouse is trying to redo its various APIs in the next several months, I think it might be more useful to document and work with the new upload API, but I'm open to feedback on this.
After a little conversation here on distutils-sig, I believe my steps would be:
1. start a very early PEP draft with lots of To Be Determined blanks, submit as a PR to the python/peps repo, and share it with distutils-sig
2. ping maintainers of related tools
3. discuss with others at the packaging sprints https://wiki.python.org/psf/PackagingSprints next week
4. revise and get consensus, preferably mostly on this list
5. finalize PEP and get PEP accepted by BDFL-Delegate
6. coordinate with PyPA, maintainers of `distutils`, maintainers of packaging and distribution tools, and documentation maintainers to implement PEP compliance
Thoughts are welcome. I originally posted this at https://github.com/pypa/packaging-problems/issues/128 .
I am fairly sure if you give the PyPA that suggestion, they will just deflate at the thought of the workload. Besides, we already offer private repos for free, several ways ranging from devpi to python -m SimpleHTTPServer in a specially created directory.
From: Python-ideas <python-ideas-bounces+tritium-list=sdamon.com(a)python.org> On Behalf Of Nick Humrich
Sent: Wednesday, April 4, 2018 12:26 PM
Subject: [Python-ideas] Pypi private repo's
I am sure this has been discussed before, and this might not even be the best place for this discussion, but I just wanted to make sure this has been thought about.
What if pypi.org <http://pypi.org> supported private repos at a cost, similar to npm?
This would be able to help support the cost of pypi, and hopefully make it better/more reliable, thus in turn improving the python community.
If this discussion should happen somewhere else, let me know.
The weekend of October 27-28, simultaneously in London, UK and New York
City, USA, Bloomberg will host a Python packaging and distribution tools
event. Please mark your calendars!
If you live in North America or Europe and would need assistance to
attend this as a mentor/helper, watch for more details in July.
If you live outside of the US or UK and would need an invitation letter
to get a visa to travel to one of these sprints, please write to Kevin
P. Fleming at Bloomberg, kpfleming AT bloomberg DOT net, and he'll start
setting you up.
Thanks to Bloomberg for their generosity. They're already a Platinum PSF
sponsor, and they'll host this, pay for a maintainers'/mentors' dinner
the night before, provide clusters of cloud virtual machines for the
attendees to use, and book and pay for some contributors' lodging and travel.
This'll be an opportunity to advance Python packaging/distro tools,
teach new contributors (including many Bloomberg employees), and yeah,
if you want to get to know Bloomberg for career reasons, that too. :)
We hope mentors can arrive Thursday night 25 Oct, do prep, setup, and
dinner on Friday, then participate Sat-Sun, then leave Sunday evening or
Monday.
We'll be putting more details on these lists (distutils-sig and
pypa-dev) and at https://wiki.python.org/psf/PackagingSprints .
Thanks to Bloomberg folks Mario Corchero and Henry Kleynhans in London
and Kevin P. Fleming in New York City for coordinating this, and thanks
especially to Mario and to Paul Ganssle for suggesting it!
Today, LWN published my new article "A new package index for Python".
https://lwn.net/Articles/751458/ In it, I discuss security, policy, UX
and developer experience changes in the 15+ years since PyPI's founding,
new features (and deprecated old features) in Warehouse, and future
plans. Plus: screenshots!
If you aren't already an LWN subscriber, you can use this subscriber
link for the next week to read the article despite the LWN paywall.
This summary should help occasional Python programmers -- and frequent
Pythonists who don't follow packaging/distro discussions closely --
understand why a new application is necessary, what's new, what features
are going away, and what to expect in the near future. I also hope it
catches the attention of downstreams that ought to migrate.
Warehouse project manager
In PEP 518, it is not clearly specified how a project that has a pyproject.toml
file but has no build-system.requires should be treated (i.e. whether the
build-system table may be omitted).
In pip 10, such a pyproject.toml file was allowed and built with setuptools
and wheel, which has resulted in a lot of projects making releases that assume
that such a pyproject.toml file is valid and that setuptools and wheel will be
used. I understand that at least pytest, towncrier and Twisted might have done so.
This happened since these projects have included configuration for some other
tools in pyproject.toml (some of which use only pyproject.toml for configuration).
There's a little bit of subtlety here, in pip 10's implementation: adding a
pyproject.toml file enables a new code path that does the build in isolation
(in preparation for PEP 517; it's a good idea on its own too) with only the
build-system.requires packages available. When the build-system.requires
is missing, pip falls back to assuming it should be ["setuptools", "wheel"].
The in-development version of pip currently prints warnings when the key is
not specified -- along the lines of "build-system.requires is missing" +
"A future version of pip will reject pyproject.toml files that do not comply
with PEP 518." and falls back to legacy behavior.
Basically, pip 10 has a distinction between a missing pyproject.toml and
build-system.requires = ["setuptools", "wheel"], and the PEP doesn't. Clarifying
the PEP's precise wording here would help inform the debate about how pip
should behave in this edge case.
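The pip 10 fallback described above can be sketched roughly like this (simplified; the real logic lives in pip's source, and the function name is invented):

```python
import warnings

def build_requires(pyproject):
    """Decide build requirements from a parsed pyproject.toml dict.

    pyproject is the parsed TOML as a dict, or None when the file
    is absent entirely.
    """
    if pyproject is None:
        return None  # no pyproject.toml: legacy, non-isolated build path
    build_system = pyproject.get("build-system", {})
    if "requires" not in build_system:
        # The edge case under discussion: file present, key missing.
        warnings.warn("build-system.requires is missing; "
                      "falling back to ['setuptools', 'wheel']")
        return ["setuptools", "wheel"]
    return build_system["requires"]
```

Option (2) below would replace the warning branch with a hard error.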
I can think of at least 2 options for behavior when build-system.requires is
missing:
1. Consider a missing build-system.requires equivalent to either a missing
pyproject.toml or build-system.requires = ["setuptools", "wheel"].
2. Make the build-system table mandatory in pyproject.toml.
I personally think (2) would be fine -- "Explicit is better than implicit."
It'll be easy to detect and error out in this case, in a way that makes it
possible to provide meaningful information to the user about what to do here.
However, this does mean that some existing releases of projects become
uninstallable, which is concerning; I do think the benefits outweigh the costs though.
Thoughts on this?
I had a thought for something that might be a simple way to improve
dev experience with custom build backends.
A PEP 517 build backend is a Python object that has some special
methods on it. And the way a project picks which object to use, is via
build-backend = "module1.module2:object"
Currently, this means that the build backend is the Python object:
module1.module2.object
Here's my idea: what if we change it, so that the above config is
interpreted as meaning that the build backend is the Python object:
module1.module2.object.__build_backend__
(I.e., we tack a "__build_backend__" on the end before looking it up.)
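The two lookup rules can be sketched as follows (the function names are invented; resolve_current follows PEP 517's existing "module:object" rule, and resolve_proposed appends the hypothetical __build_backend__):

```python
import importlib

def resolve_current(spec):
    """PEP 517 as specified today: 'module.path:object.path'."""
    module_path, _, object_path = spec.partition(":")
    obj = importlib.import_module(module_path)
    for attr in filter(None, object_path.split(".")):
        obj = getattr(obj, attr)
    return obj

def resolve_proposed(spec):
    """Same lookup, but tack '__build_backend__' onto the object path."""
    module_path, _, object_path = spec.partition(":")
    obj = importlib.import_module(module_path)
    parts = [p for p in object_path.split(".") if p] + ["__build_backend__"]
    for attr in parts:
        obj = getattr(obj, attr)
    return obj
```

With the proposed rule, build-backend = "flit" would simply resolve to flit.__build_backend__.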
Why does this matter? Well, with the current system, if you want to
use flit as your build backend, you have to write:
build-backend = "flit.buildapi"
And if you want to use intreehooks, you have to write:
build-backend = "intreehooks:loader"
These names are slightly awkward, because these projects don't want to
just jam all the PEP 517 methods directly onto the top-level module
object, so they each have to invent some ad hoc sub-object to put the
methods on. And then that's exposed to all their users as a bit of
random cruft you have to copy-paste.
The idea of __build_backend__ is that these projects could rename the
'buildapi' and 'loader' objects to be '__build_backend__' instead, and
then users could write:
build-backend = "flit"
build-backend = "intreehooks"
build-backend = "setuptools"
and it just feels nicer.
Right now PEP 517 is still marked provisional, and pip hasn't shipped
support yet, so I think changing this is still pretty easy. (It would
mean a small amount of work for projects like flit that have already
shipped a backend object under the old name.)
What do you think? (Thomas, I'd love your thoughts in particular :-).)
Nathaniel J. Smith -- https://vorpus.org