I uploaded development releases for testing:
These releases are drop-in updates. For devpi-web you might have to run
with ``--recreate-search-index`` in case you want to go back to a previous
version. Besides that, there are no changes preventing a downgrade. But
it's best to test devpi-web on a replica which can be rebuilt if
something goes wrong.
Speaking of rebuilding replicas: With the ``--replica-file-search-path``
option and the ``--hard-links`` option it is possible to vastly cut down
the replication time when you already have the files on disk. Either
create the new replica in a new folder, or rename the old server folder
to create the new replica in the same place. Then run the new replica
with ``--replica-file-search-path old_location/+files --hard-links``.
While replicating, the new replica will look for existing files in the
specified location and create hard links to any matching files it finds.
As the version number already indicates, devpi-web has the biggest
changes, namely dropping Python 2.7 support, a different way to index
projects and several changes to templates which might require updates in
custom themes.
Deprecations and Removals
- Dropped support for Python 2.7.
- project.pt: implement #319 - add link to latest version and summary of
latest version to project view
- doc.pt: implement #339: when viewing older documentation, show link to
latest and stable documentation.
- implement #346: add support for 'latest' and 'stable' versions for
project version URLs
- version.pt, style.css: implement #347 - show navigation link when
newer version is available
- The search indexing is now performed in a separate thread. Indexing
prioritizes private indexes, then projects downloaded from mirrors, and
lastly mirror projects that were never downloaded.
- On server start a refresh run for the search index is started
automatically. In addition, the indexing is done in smaller batches,
which allows indexing of the large PyPI project list to be resumed.
- In many cases re-indexing is now avoided.
- version.pt, common.js: all list-like metadata is now collapsed to ~3
lines, not only classifiers.
- The ``--recreate-search-index`` option now just removes the whole
index. Upon server restart the index is rebuilt automatically in the
background.
- With devpi-server 5.1.1 or newer, indexing of mirrors will not trigger
a refresh of the simple links for projects which have downloaded
releases. This allows quicker indexing for new replicas and in case the
search index is rebuilt using ``--recreate-search-index``.
- root.pt: added new view for user
- Pass query string on to documentation iframe, this fixes Sphinx search
among other things.
- macros.pt, version.pt, style.css: moved separators to css.
- The minimum supported version of devpi-server is now 5.1.0.
- The interface of ``update_projects`` for search indexers has changed.
The list of projects is now a list of ``ProjectIndexingInfo`` objects
instead of ``dict``. To get the dictionary as before, call
``preprocess_project`` on the object.
- Metrics for the sqlite storage cache are exposed in the JSON output of
the +status view.
- fix #545: provide proper error message when there is an exception
during push to an external repository.
- Fix possible race condition when writing files.
- Fix possible assertion error if importing multiple changes in a
replica fails in the middle and fetching a single change set is tried as
a fallback.
- For plugins the ``offline`` attribute of mirror stage instances now
works independently of the ``--offline-mode`` command line option. This
only applies to ``_perstage`` methods.
- Files created in a transaction are written directly to temporary files
instead of being kept in memory until commit.
- Unnecessary database writes where the final value didn't change are
now avoided.
- The timeout when fetching the list of remote projects for a mirror
index is set to a minimum of 30s by default and to 60s when running as a
replica. Other fetches from mirrors still use the configured timeout.
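The ``update_projects`` change for indexer plugins can be handled with a small shim. This is a sketch only: ``ProjectIndexingInfo`` is stubbed here with made-up attributes, and the ``preprocess_project`` signature is illustrative — only the "call ``preprocess_project`` to get the old dict back" part comes from the changelog:

```python
# Stub standing in for devpi-web's ProjectIndexingInfo; the real class
# lives in devpi-web and its attributes may differ.
class ProjectIndexingInfo:
    def __init__(self, indexname, projectname):
        self.indexname = indexname
        self.projectname = projectname

def preprocess_project(info):
    # Illustrative stand-in: the real function returns the dict the
    # old interface used to pass directly.
    return {"index": info.indexname, "name": info.projectname}

def update_projects(projects, index_document):
    """Indexer hook adapted to the new interface: convert each
    ProjectIndexingInfo back to the dict the old code expected,
    then hand it to the existing per-project indexing code."""
    for info in projects:
        data = preprocess_project(info)
        index_document(data)
```

Existing plugins can keep their dict-based indexing code and only add the conversion step at the boundary.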
There currently is an issue about mirror_whitelist that needs input from
anyone who uses it together with inheritance (more than just inheriting
from root/pypi).
My organization is exploring replacing our current, patched-together
"wheelhouse" with a proper Devpi server mirroring PyPI and hosting our
internal packages. Because this Devpi architecture will be accessed by
hundreds of developers and over 100 build server workers, it needs to be
highly available and able to handle a lot of traffic. So this leads me to
several questions (five major questions and a few sub-questions, to be
more precise):
- Is anyone familiar with a threshold of diminishing returns when it comes
to the "threads" server setting? At what point, if any, does increasing
this number do no good? I assume that the number of "threads" roughly
equates to the number of simultaneous connections a single server can
handle, perhaps minus one or two. Is this a fair assumption?
- How many "devpi-server" instances can access a single "serverdir"? Is it
just one, and bad things will happen if a second "devpi-server" tries to
access it (and, if I need multiple servers, I should use master-replica
replication)? Or can multiple "devpi-server" instances access a single
"serverdir" to serve as multiple hot masters?
- When employing master-replica replication, I'm wondering if the following
configuration is sensible, or if there are better approaches: 1) One (or
possibly more, if allowed) master, 2) Two or more replicas replicating off
the master, 3) An HTTPS load balancer that sends all traffic to the
replicas (not the master? or include the master in the load balancing?). In
this case, what happens if someone uses the https://load-balanced-devpi-url/
URL to publish a new package? Does a replica it lands on just forward that
up to the master? Or does the user get an error? Or do worse things happen?
- When employing master-replica replication, I noticed in the documentation
this word of caution:
"You probably want to make sure that before starting the replica and/or
master that you also have the Web interface and search plugin installed.
You cannot install it later."
To be clear, I already have "devpi-web" installed (and a theme, too), but
this note confuses (and slightly concerns) me, so I'd like to understand it
better. Why would you be unable to install "devpi-web" after starting a
master or replica? What would it break? Why wouldn't this work? Or do you
just mean that you would have to shut down the servers to install
"devpi-web" (which is understandable)? Is this different from
non-replication environments (can you install "devpi-web" after starting a
non-replicating server), and if so, why? I'm asking for details about this
because I'm curious if it has potential consequences (stability, etc.) that
extend beyond just the "devpi-web" plugin.
- When creating a new index, could someone elaborate a bit more on the
"mirror_whitelist" setting? What is the difference between not setting it
at all (default/implicit) and explicitly setting it to "*"? What does "*"
actually mean? When you upload a new package to the index that conflicts
with a package in its base index, is the resulting behavior that the
uploaded package (or perhaps just the versions you upload) ALWAYS overrides
the one in the base index, regardless of the "mirror_whitelist" setting? My
goal here is to have an index that uses "root/pypi" as its base and also
hosts our internal packages, so that we can point Pipenv to one URL and
install everything from there. Is there a more sensible approach than
the one I am planning to take?
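Assuming mirror_whitelist patterns are matched glob-style (as with Python's ``fnmatch``; check the devpi documentation for the authoritative rules), the difference between an empty whitelist and ``*`` can be illustrated:

```python
from fnmatch import fnmatch

def is_whitelisted(project, whitelist):
    """Glob-style whitelist check. Illustration only -- devpi's
    actual matching rules may differ in detail."""
    return any(fnmatch(project, pattern) for pattern in whitelist)

# With no whitelist entries, no project name matches, so nothing
# from the mirror base would be let through for a name that also
# exists privately:
assert not is_whitelisted("requests", [])

# "*" matches every project name:
assert is_whitelisted("requests", ["*"])
assert is_whitelisted("pip", ["*"])
```
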
Thanks in advance for helping me make the right decisions in our
upcoming migration.
Another quick question about tox testing results:
Given I have `results.json` file from a tox run, can I upload that to a devpi index without running the tests again?
Background: we build our packages in our CI system, test them without uploading them to an index, and only upload the wheels to our devpi index if the tests succeed. The way I read the devpi documentation, displaying test results on the index always requires uploading a source distribution to that index (or optionally a "testing" index), then running `devpi test` for that source distribution on that index, then (in case of the "testing" index) pushing the tested package to a production index if the tests were successful and deleting the package from the testing index. The two issues I have with that are that it requires a source distribution on an index, and that faulty packages end up on an index and need to be deleted, or passing packages need to be pushed. So can I skip the testing part of `devpi test` and just upload the result from somewhere else along with the tested wheel?
quick question: If I want to whitelist everything but a few packages, is that possible with devpi indexes and if so what is the syntax?
E.g. I'd expect:
`mirror_whitelist=['*', '!somepack', '!mypackage']`
would whitelist things like `pip` and `setuptools` but not the two explicitly forbidden packages named somepack and mypackage.
Cheers and thx,
Apologies for this being last minute, but I'm currently attending the PyConAU dev sprints taking place today and tomorrow, 05Aug-06Aug UTC+10.
I plan on working on the following:
* My existing PR for the ldap auth plugin (https://github.com/devpi/devpi-ldap/pull/43), it needs refinement.
* Adding info on bootstrapping an LDAP server.
* Adding contributing information to the documentation.
If there's time, I might try cherry-picking a few other things from the issues list.
I'll be lurking in the devpi IRC channel throughout the event. I might be able to wrangle some others to assist as well.
I created a PR for a new request events hook at
The problem with the current hooks devpiserver_stage_created,
devpiserver_on_changed_versiondata and devpiserver_on_upload is that
they are database events. That means when data is imported or
replicated, they are called again. That is also why they don't have
access to the request and thus to information like the logged-in user.
The devpiserver_on_upload_sync hook is called from within the request,
but it may be called prematurely, because the current transaction wasn't
committed yet and there could still be an error. There is also a
possible race condition because of that: the file may not be accessible
yet when the triggered plugin does something. Again, the information
this hook gets is limited as well.
In the proposed implementation the event handlers are called after the
data was sent to the client. This was done to ensure a plugin won't
block normal operation. The client always gets the result of the request
first. The downside is that when the client needs a long time to fetch
the response, the event handlers are delayed. It might be useful to call
the handlers in a separate thread while the client is served. Comments
on this are welcome. For now I wanted to avoid the added complexity.
Another change from the old hooks is that an event handler may not be
called at all. This only happens when devpi-server is stopped after the
commit, but before the event handlers were called. I consider this a
minor issue, but again, comments on this are welcome.
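The proposed ordering (commit, send the response, then fire the handlers) can be sketched with stand-in names — none of these are devpi's real internals:

```python
def handle_request(request, transaction, send_response, handlers):
    """Sketch of the proposed hook timing: the client gets its
    response before any plugin handler runs, so a slow or broken
    handler cannot block normal operation."""
    result = transaction.execute(request)
    transaction.commit()          # data is made durable first
    send_response(result)         # the client is served next
    for handler in handlers:      # plugins run last; if the server
        handler(request, result)  # stops here, handlers are skipped
```

This also makes the trade-off visible: a client that is slow to consume the response delays every handler, which is why running the handlers in a separate thread is mentioned as a possible refinement.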
Thank you for the project, quick question:
I see that restrict-modify actually only allows changes made by the principals listed there, which kind of means you either are root or you aren't. Should restrict-modify also allow index_create/user_modify for authenticated users?