I uploaded development releases for testing:
These releases are drop-in updates. For devpi-web you might have to run
``--recreate-search-index`` in case you want to go back to a previous
version. Besides that, there are no changes preventing a downgrade. Still,
it's best to test devpi-web on a replica which can be rebuilt if anything
goes wrong.
Speaking of rebuilding replicas: with the ``--replica-file-search-path``
and ``--hard-links`` options it is possible to vastly cut down the
replication time when you already have the files on disk. Either create
the new replica in a new folder, or rename the old server folder and
create the new replica in the original location. Then run the new replica
with ``--replica-file-search-path old_location/+files --hard-links``;
while replicating, the new replica will look for existing files in the
specified location and hard-link any matching ones instead of downloading
them again.
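To illustrate the effect (directory names below are made up, and the
hard-link behavior is sketched in plain shell rather than by running
devpi-server itself):

```shell
# Hypothetical invocation after renaming the old server directory:
#   devpi-server --serverdir new-replica \
#       --replica-file-search-path old-replica/+files --hard-links ...
# For each release file that is already present, the effect is roughly:
mkdir -p old-replica/+files new-replica/+files
echo "release file bytes" > old-replica/+files/example-1.0.tar.gz
# Hard-link instead of downloading again; both names now share one inode,
# so no additional disk space or network transfer is needed:
ln old-replica/+files/example-1.0.tar.gz new-replica/+files/example-1.0.tar.gz
stat -c %h new-replica/+files/example-1.0.tar.gz   # link count is now 2
```

The savings scale with how much of the old file store is still valid, which
is why this mainly helps when rebuilding an existing replica.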
As the version number already indicates, devpi-web has the biggest
changes, namely dropping Python 2.7 support, a different way to index
projects, and several changes to templates which might require updates in
custom themes.
Deprecations and Removals
- Dropped support for Python 2.7.
- project.pt: implement #319 - add link to latest version and summary of
latest version to project view
- doc.pt: implement #339: when viewing older documentation, show link to
latest and stable documentation.
- implement #346: add support for 'latest' and 'stable' versions for
project version URLs
- version.pt, style.css: implement #347 - show navigation link when
newer version is available
- The search indexing is now performed in a separate thread. Indexing
prioritizes private indexes, then projects downloaded from mirrors, and
lastly mirror projects that have never been downloaded.
- On server start a refresh run for the search index is started
automatically. In addition, the indexing is done in smaller batches,
which allows indexing of the large PyPI project list to be resumed.
- In many cases re-indexing is now avoided.
- version.pt, common.js: all list-like metadata, not only classifiers, is
now collapsed to ~3 lines.
- The ``--recreate-search-index`` option now just removes the whole
index. Upon server restart the index is rebuilt automatically in the
background.
- With devpi-server 5.1.1 or newer, indexing of mirrors will not trigger
a refresh of the simple links for projects which have downloaded
releases. This allows quicker indexing for new replicas and in case the
search index is rebuilt using ``--recreate-search-index``.
- root.pt: added new view for user
- Pass the query string on to the documentation iframe; this fixes Sphinx
search, among other things.
- macros.pt, version.pt, style.css: moved separators to css.
- The minimum supported version of devpi-server is now 5.1.0.
- The interface of ``update_projects`` for search indexers has changed.
The list of projects is now a list of ``ProjectIndexingInfo`` objects
instead of ``dict``. To get the dictionary as before, call
``preprocess_project`` on the object.
- Metrics for the sqlite storage cache are exposed in the JSON output of
the +status view.
- fix #545: provide proper error message when there is an exception
during push to an external repository.
- Fix possible race condition when writing files.
- Fix possible assertion error if importing multiple changes in a
replica fails in the middle and fetching a single change set is tried as
a fallback.
- For plugins the ``offline`` attribute of mirror stage instances now
works independently of the ``--offline-mode`` command line option. This
only applies to ``_perstage`` methods.
- Files created in a transaction are written directly to temporary files
instead of being kept in memory until commit.
- Unnecessary database writes where the final value didn't change are
now avoided.
- The timeout when fetching the list of remote projects for a mirror
index is set to a minimum of 30s by default and to 60s when running as a
replica. Other fetches from mirrors still use the timeout specified on
the command line.
There currently is an issue about mirror_whitelist that needs input from
anyone who uses it together with inheritance (more than just root/pypi).
My organization is exploring replacing our current, patched-together
"wheelhouse" with a proper Devpi server mirroring PyPI and hosting our
internal packages. Because this Devpi architecture will be accessed by
hundreds of developers and over 100 build server workers, it needs to be
highly available (able to handle a lot of traffic). So this leads me to
several questions (five major questions and a few sub-questions, to be
more precise):
- Is anyone familiar with a threshold of diminishing returns when it comes
to the "threads" server setting? At what point, if any, does increasing
this number do no good? I assume that # of "threads" roughly equates to #
of simultaneous connections that single server is capable of, perhaps minus
one or two. Is this a fair assumption?
- How many "devpi-server" instances can access a single "serverdir?" Is it
just one, and bad things will happen if a second "devpi-server" tries to
access it (and, if I need multiple servers, I should use master-replica
replication)? Or can multiple "devpi-server" instances access a single
"serverdir" to serve as multiple hot masters?
- When employing master-replica replication, I'm wondering if the following
configuration is sensible, or if there are better approaches: 1) One (or
possibly more, if allowed) master, 2) Two or more replicas replicating off
the master, 3) An HTTPS load balancer that sends all traffic to the
replicas (not the master? or include the master in the load balancing?). In
this case, what happens if someone uses the https://load-balanced-devpi-url/
URL to publish a new package? Does a replica it lands on just forward that
up to the master? Or does the user get an error? Or do worse things happen?
- When employing master-replica replication, I noticed in the documentation
this word of caution:
"You probably want to make sure that before starting the replica and/or
master that you also have the Web interface and search plugin installed.
You cannot install it later."
To be clear, I already have "devpi-web" installed (and a theme, too), but
this note confuses (and slightly concerns) me, so I'd like to understand it
better. Why would you be unable to install "devpi-web" after starting a
master or replica? What would it break? Why wouldn't this work? Or do you
just mean that you would have to shut down the servers to install
"devpi-web" (which is understandable)? Is this different from
non-replication environments (can you install "devpi-web" after starting a
non-replicating server), and if so, why? I'm asking for details about this
because I'm curious if it has potential consequences (stability, etc.) that
extend beyond just the "devpi-web" plugin.
- When creating a new index, could someone elaborate a bit more on the
"mirror_whitelist" setting? What is the difference between not setting it
at all (default/implicit) and explicitly setting it to "*"? What does "*"
actually mean? When you upload a new package to the index that conflicts
with a package in its base index, is the resulting behavior that the
uploaded package (or perhaps just the versions you upload) ALWAYS overrides
the one in the base index, regardless of the "mirror_whitelist" setting? My
goal here is to have an index that uses "root/pypi" as its base and also
hosts our internal packages, so that we can point Pipenv to one URL and
install everything from there. Is there a more sensible approach than
the one I am planning to take?
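For context, here is a sketch of how I would create such an index with the
devpi client (the user, index, and host names are made up, and the exact
option set should be double-checked against the devpi-client docs):

```shell
# Point the devpi client at the (hypothetical) server and log in:
devpi use https://devpi.example.com/
devpi login someuser

# Create an index that inherits from the PyPI mirror and hosts our
# internal packages; volatile=False prevents overwriting released files:
devpi index -c someuser/prod bases=root/pypi volatile=False

# Pipenv (or pip) could then install internal and PyPI packages through
# a single URL:
#   https://devpi.example.com/someuser/prod/+simple/
```

Whether mirror_whitelist needs to be set on top of this, and what ``*``
means exactly, is what the question above is about.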
Thanks in advance for helping me make the right decisions in our upcoming
deployment!