On behalf of the PyPA, I am pleased to announce the release of pip 20.1b1, a beta release of pip.
The highlights for this release are:
* Significant speedups when building local directories, by changing behavior to perform in-place builds, instead of copying to temporary directories.
* Significant speedups in `pip list --outdated`, by parallelizing network access. This is the first instance of parallel code within pip's codebase.
* A new `pip cache` command, which makes it possible to introspect and manage pip's cache directory.
* Better `pip freeze` for packages installed from direct URLs, enabled by the implementation of PEP 610.
We would be grateful for all the testing that users could do to ensure that, when pip 20.1 is released, it's as solid as we can make it. You can upgrade to this beta with `python -m pip install -U --pre pip`.
This release also contains an alpha version of pip's next-generation resolver. It is **off by default** because it is **unstable and not ready for everyday use**. If you're curious, please see the GitHub issue covering the resolver, what doesn't work yet, and what kind of testing would help us out: https://github.com/pypa/pip/issues/8099
The full changelog is available at https://pip.pypa.io/en/latest/news/
As with all pip releases, a significant amount of the work was contributed by pip's user community. Huge thanks to all who have contributed, whether through code, documentation, issue reports and/or discussion. Your help keeps pip improving, and is hugely appreciated.
Specific thanks go to Mozilla (through its Mozilla Open Source Support Awards) and to the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation, for their support that enabled the work on the new resolver.
pip project manager under contract with Python Software Foundation
Hi, devpi folks! I figure you might want to take a look at the PyPI security PEP currently being discussed, since I could imagine devpi wanting to also add TUF metadata handling for packages, and in case there are interoperability concerns/questions.
The PEP authors are revising the proposed summary, title, etc., per https://github.com/secure-systems-lab/peps/blob/c13384a4fac6822626abb7e09ab… :
> Attacks on software repositories are common, even in organizations with very
> good security practices__. The resulting repository compromise allows an
> attacker to edit all files stored on the repository and sign these files using
> any keys stored on the repository (online keys). In many signing schemes (like
> TLS), this access allows the attacker to replace files on the repository and
> make it look like these files are coming from PyPI. Without a way to revoke and
> replace the trusted private key, it is very challenging to recover from a
> repository compromise. In addition to the dangers of repository compromise,
> software repositories are vulnerable to an attacker on the network (MITM)
> intercepting and changing files. These and other attacks on software
> repositories are detailed here__. This PEP aims to protect users of PyPI from
> compromises of the integrity, consistency and freshness properties of PyPI
> packages, and enhances compromise resilience, by mitigating key risk and
> providing mechanisms to recover from a compromise of PyPI or its signing keys.
> In addition to protecting direct users of PyPI, this PEP aims to provide similar
> protection for users of PyPI mirrors.
> To provide compromise resilient protection of PyPI, this PEP proposes the use of
> The Update Framework__ (TUF). .....
> This PEP describes changes to the PyPI infrastructure that are needed to ensure
> that users get valid packages from PyPI. ...
> __ https://github.com/theupdateframework/pip/wiki/Attacks-on-software-reposito…
> __ https://theupdateframework.github.io/security.html
Discussion should probably be directed to the Discourse thread at discuss.python.org; this is just a heads-up.
I have a devpi server where we have recently deleted a lot of old package versions, using devpi-client. The data directory now looks like:
7.6 GiB [##########] /.indices
956.5 MiB [# ] .sqlite
461.4 MiB [ ] /+files
Is this expected? Is there any command to run to rebuild/purge the index?
We're starting to roll out our new Devpi installation to projects now that
we've tested it extensively. We have one master and three mirrors. For the
purposes of this email, the mirrors are load-balanced behind the domain
name "devpi.example.com," and the master is available at "
We configured our Pipenv and Pip installations to use this index URL:
That URL will use the mirrors. The "org_name/stable" index extends
root/pypi and also includes our internal projects. Everything appears to be
working on the client. No errors, installs are going off without a hitch.
However, the Devpi server on the master machine (not the mirrors) logs a
ton of errors like the following:
[req78237] [Rtx14602] while handling
The project internal_project_name does not exist.
What is the cause of these errors? Is this something that should concern
us, or something that we can ignore? "internal_project_name" installs just
fine, but for some reason yields those errors on the master server.
I'm having trouble with a reverse-proxy config and could use some help.
We have the following setup:
Accepts requests via the official hostname and HTTPS (port 443)
Forwards requests to Devpi Nginx server
Sends headers X-Forwarded-Port, X-Forwarded-Host, X-Forwarded-Proto matching
the official hostname, HTTPS, and 443.
DEVPI NGINX SERVER
Accepts requests via port 80
Serves requests for +f files directly (works perfectly)
Forwards remaining requests to Devpi Python server
DOES NOT override headers X-Forwarded-Port, X-Forwarded-Host,
X-Forwarded-Proto (I have those proxy_set_header values from the
recommended nginx.conf commented out so that Nginx doesn't override them)
DOES NOT send header X-Outside-Url (I also have that commented out)
The problem: URLs generated by Devpi come out as
http://localhost/... instead of https://the.correct.domain.name/...
What do I need to change to make Devpi properly use the X-Forwarded-Port,
X-Forwarded-Host, X-Forwarded-Proto headers coming from the load balancer?
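For reference, a sketch of the Devpi-side nginx server block as I understand it, with the header lines commented out; paths, server names, and the devpi-server port are placeholders here, and the real recommended nginx.conf has more detail:

```nginx
# Sketch only -- names, paths, and port 3141 are assumptions.
server {
    listen 80;
    server_name localhost;

    # Serve +f files directly from disk (this part already works)
    location ~ \+f/ {
        root /srv/devpi/server;
    }

    # Forward everything else to the devpi-server process
    location / {
        proxy_pass http://localhost:3141;
        proxy_set_header Host $host;
        # Commented out so the load balancer's values pass through
        # instead of being overridden here:
        # proxy_set_header X-Forwarded-Proto $scheme;
        # proxy_set_header X-Forwarded-Host $host;
        # proxy_set_header X-outside-url $scheme://$host;
    }
}
```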
I uploaded development releases for testing:
These releases are drop-in updates. For devpi-web you might have to run
``--recreate-search-index`` in case you want to go back to a previous
version. Besides that there are no changes preventing a downgrade. But
it's best to test devpi-web on a replica which can be rebuilt if needed.
Speaking of rebuilding replicas: With the ``--replica-file-search-path``
option and the ``--hard-links`` option it is possible to vastly cut down
the replication time when you already have the files on disk. Either
create the new replica in a new folder, or rename the old server folder
to create the new replica in the same place. Then run the new replica
with ``--replica-file-search-path old_location/+files --hard-links`` and
while replicating, the new replica will look for existing files in the
specified location and create hard links when it finds a match.
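As a concrete sketch of the rename-in-place variant (all paths and the master URL below are placeholders; adjust to your deployment):

```shell
# Keep the old replica's files around under a new name...
mv /srv/devpi/replica /srv/devpi/replica-old

# ...then start a fresh replica that hard-links matching files from the
# old +files tree instead of downloading them again from the master.
devpi-server --serverdir /srv/devpi/replica \
    --master-url https://devpi-master.example.com \
    --replica-file-search-path /srv/devpi/replica-old/+files \
    --hard-links
```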
As the version number already indicates, devpi-web has the biggest
changes, namely dropping Python 2.7 support, a different way to index
projects and several changes to templates which might require updates in
customized themes.
Deprecations and Removals
- Dropped support for Python 2.7.
- project.pt: implement #319 - add link to latest version and summary of
latest version to project view
- doc.pt: implement #339: when viewing older documentation, show link to
latest and stable documentation.
- implement #346: add support for 'latest' and 'stable' versions for
project version URLs
- version.pt, style.css: implement #347 - show navigation link when
newer version is available
- The search indexing is now performed in a separate thread. Indexing
prioritizes private indexes, then projects downloaded from mirrors, and
lastly mirror projects that have never been downloaded.
- On server start a refresh run for the search index is started
automatically. In addition, the indexing is done in smaller batches,
which allows indexing of the large PyPI project list to be resumed.
- In many cases re-indexing is now avoided.
- version.pt, common.js: all list-like metadata is now collapsed to ~3
lines, not just classifiers.
- The ``--recreate-search-index`` option now just removes the whole
index. Upon server restart the index is rebuilt automatically in the
background.
- With devpi-server 5.1.1 or newer, indexing of mirrors will not trigger
a refresh of the simple links for projects which have downloaded
releases. This allows quicker indexing for new replicas and in case the
search index is rebuilt using ``--recreate-search-index``.
- root.pt: added new view for user
- Pass query string on to documentation iframe, this fixes Sphinx search
among other things.
- macros.pt, version.pt, style.css: moved separators to css.
- The minimum supported version of devpi-server is now 5.1.0.
- The interface of ``update_projects`` for search indexers has changed.
The list of projects is now a list of ``ProjectIndexingInfo`` objects
instead of ``dict``. To get the dictionary as before, call
``preprocess_project`` on the object.
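For plugin authors, a sketch of the adaptation. The ``ProjectIndexingInfo`` and ``preprocess_project`` stand-ins below only model the change described here; the real class and helper live in devpi-web, and the attribute names are assumptions:

```python
# Stand-ins modelling the devpi-web 4.0 interface change; the real
# ProjectIndexingInfo and preprocess_project are provided by devpi-web.
from dataclasses import dataclass

@dataclass
class ProjectIndexingInfo:
    indexname: str   # assumed attribute name
    name: str        # assumed attribute name

def preprocess_project(info):
    # The real helper returns the dict the old interface passed directly.
    return {"index": info.indexname, "name": info.name}

def update_projects(projects):
    """Indexer hook: 'projects' is now a list of ProjectIndexingInfo
    objects instead of dicts; call preprocess_project() on each object
    to recover the dictionary shape older plugin code expects."""
    return [preprocess_project(info) for info in projects]
```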
- Metrics for the sqlite storage cache are exposed in the JSON output of
the +status view.
- fix #545: provide proper error message when there is an exception
during push to an external repository.
- Fix possible race condition when writing files.
- Fix possible assertion error if importing multiple changes in a
replica fails in the middle and fetching a single change set is tried as
a fallback.
- For plugins the ``offline`` attribute of mirror stage instances now
works independently of the ``--offline-mode`` command line option. This
only applies to ``_perstage`` methods.
- Files created in a transaction are written directly to temporary files
instead of being kept in memory until commit.
- Unnecessary database writes where the final value didn't change are
now avoided.
- The timeout when fetching the list of remote projects for a mirror
index is set to a minimum of 30s by default and to 60s when running as
replica. Other fetches of mirrors still use the timeout specified via
There currently is an issue about mirror_whitelist that needs input from
anyone who uses it together with inheritance (more than just root/pypi
as a base).
My organization is exploring replacing our current, patched-together
"wheelhouse" with a proper Devpi server mirroring PyPI and hosting our
internal packages. Because this Devpi architecture will be accessed by
hundreds of developers and over 100 build server workers, it needs to be
highly available (able to handle a lot of traffic). So this leads me to
several questions (five major questions and a few sub-questions, to be
more precise):
- Is anyone familiar with a threshold of diminishing returns when it comes
to the "threads" server setting? At what point, if any, does increasing
this number do no good? I assume that # of "threads" roughly equates to #
of simultaneous connections that single server is capable of, perhaps minus
one or two. Is this a fair assumption?
- How many "devpi-server" instances can access a single "serverdir"? Is it
just one, and bad things will happen if a second "devpi-server" tries to
access it (and, if I need multiple servers, I should use master-replica
replication)? Or can multiple "devpi-server" instances access a single
"serverdir" to serve as multiple hot masters?
- When employing master-replica replication, I'm wondering if the following
configuration is sensible, or if there are better approaches: 1) One (or
possibly more, if allowed) master, 2) Two or more replicas replicating off
the master, 3) An HTTPS load balancer that sends all traffic to the
replicas (not the master? or include the master in the load balancing?). In
this case, what happens if someone uses the https://load-balanced-devpi-url/
URL to publish a new package? Does the replica it lands on just forward that
up to the master? Or does the user get an error? Or do worse things happen?
- When employing master-replica replication, I noticed in the documentation
this word of caution:
"You probably want to make sure that before starting the replica and/or
master that you also have the Web interface and search plugin installed.
You cannot install it later."
To be clear, I already have "devpi-web" installed (and a theme, too), but
this note confuses (and slightly concerns) me, so I'd like to understand it
better. Why would you be unable to install "devpi-web" after starting a
master or replica? What would it break? Why wouldn't this work? Or do you
just mean that you would have to shut down the servers to install
"devpi-web" (which is understandable)? Is this different from
non-replication environments (can you install "devpi-web" after starting a
non-replicating server), and if so, why? I'm asking for details about this
because I'm curious if it has potential consequences (stability, etc.) that
extend beyond just the "devpi-web" plugin.
- When creating a new index, could someone elaborate a bit more on the
"mirror_whitelist" setting? What is the difference between not setting it
at all (default/implicit) and explicitly setting it to "*"? What does "*"
actually mean? When you upload a new package to the index that conflicts
with a package in its base index, is the resulting behavior that the
uploaded package (or perhaps just the versions you upload) ALWAYS overrides
the one in the base index, regardless of the "mirror_whitelist" setting? My
goal here is to have an index that uses "root/pypi" as its base and also
hosts our internal packages, so that we can point Pipenv to one URL and
install everything from there. Is there a more sensible approach than that
which I am planning on taking?
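For concreteness, the setup I have in mind would be created with devpi-client roughly like this (the server URL, user, and index names are placeholders):

```shell
devpi use https://load-balanced-devpi-url/
devpi login admin --password=...

# Create the combined index: root/pypi as base plus internal packages
devpi index -c org_name/stable bases=root/pypi

# Possibly also needed -- this is part of what I'm asking about:
# devpi index org_name/stable "mirror_whitelist=*"
```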
Thanks in advance for helping me make the right decisions in our upcoming
rollout!