devpi-server-2.1.2 and devpi-web-2.2.1 bring a host of fixes to
the private pypi server system. You can upgrade without migrating
your data if you are already running devpi-server-2.1.X.
Find docs as usual at http://doc.devpi.net.
Many thanks to Florian Schulze who did most of the changes in devpi-web.
holger krekel, merlinux GmbH
- fix issue172: avoid traceback when user/index/name/version is accessed.
- fix issue170: ensure that we parse the prospective pip-6.0 user agent
string properly so that using the username/index url works with pip.
Thanks Donald Stufft and Florian Schulze.
- fix issue158: redirect to normalized projectname for all GET views.
- fix issue169: change /+status to expose "event_serial" as "the last
event serial that was processed". Document "serial" and
"event-serial" and refine internals with respect to "event-serial" so that
it means the "last serial for which events have been processed".
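To illustrate the refined "event-serial" semantics from the entry above,
here is a minimal sketch (not part of the release) of how a monitoring
script might compare the two values; the exact JSON layout of the
/+status response is an assumption here:

    import requests

    def unprocessed_serials(base_url):
        # assumed layout: {"result": {"serial": ..., "event-serial": ...}}
        status = requests.get(base_url + "/+status").json()["result"]
        # "serial": the last serial the server knows about
        # "event-serial": the last serial for which events have been processed
        return int(status["serial"]) - int(status["event-serial"])

    print("serials not yet processed:",
          unprocessed_serials("http://localhost:3141"))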
- require devpi-server>=2.1.2
- fix issue175: use normalized names for projects, so redirects from
unnormalized names work (a sketch of the normalization rule follows this
changelog). NOTE that if you had issues with documentation uploads
not appearing because of normalization issues ("-" or "_" appearing
in the name for example) you need to re-upload the docs or
do a full export/import cycle.
- fix view when tox results cannot be parsed.
- version.pt: removed "code" tag around overwrite count.
- macros.pt: added "footer" tag around the whole footer part.
- version.pt: moved file type, python version and size info from their own
columns into the file column.
- version.pt: moved history column from before the tox results column to behind
the tox results.
- version.pt: removed "last modified" from history column
- version.pt: removed timestamp from "replaced" action in history column
- version.pt: add link to PyPI page if applicable.
- fix project page view if there are downloads with filenames which can't be
parsed as packages with a version number.
- fix notfound-redirect when serving under an outside URL with a sub path.
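A side note on the issue175 entry above: the normalization follows the
usual PyPI convention; a sketch of the rule (devpi's internal helper may
differ in detail):

    import re

    def normalize(name):
        # runs of "-", "_" and "." collapse to a single "-", lowercased
        return re.sub(r"[-_.]+", "-", name).lower()

    assert normalize("My_Package") == "my-package"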
On Thu, Oct 16, 2014 at 13:21 +0200, Florian Schulze wrote:
> I guess the dependencies would be stored with the same info as the
> test results, like platform, python version etc?
Indeed, enriching the tox result file and then parsing/reading it on the
server side is probably the way to go. One related question is how to
store the tox result files. Currently they are just tied to the sdist link
(wheels cannot have tox.ini files to begin with) which was obtained from
"devpi test" as the best matching link. But i think server-side we need
to detach it (again) from the sdist link and rather store them indexed
by the MD5 of the sdist content and also by the MD5s of all dependencies.
The idea is to be able to efficiently:
- query all test results where this release file (or release version)
was a part of the (test) dependencies
- query all "testing tasks" caused by changed dependencies (this could
be polled from a new "devpi test-server" subcommand which executes
"pending testing tasks" for the platform it runs on)
- show and record dependency/test information even for root/pypi
projects (like we used to)
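To make the two lookup directions concrete, a rough sketch; none of these
names exist in devpi, they are purely illustrative:

    import hashlib

    def content_md5(data):
        return hashlib.md5(data).hexdigest()

    class ToxResultStore:
        # illustrative only: results detached from the sdist link and
        # indexed both by sdist content and by each dependency file
        def __init__(self):
            self.by_sdist = {}  # md5(sdist) -> [toxresult, ...]
            self.by_dep = {}    # md5(dep file) -> [toxresult, ...]

        def record(self, sdist_md5, dep_md5s, toxresult):
            self.by_sdist.setdefault(sdist_md5, []).append(toxresult)
            for dep_md5 in dep_md5s:
                self.by_dep.setdefault(dep_md5, []).append(toxresult)

        def results_using_dependency(self, dep_md5):
            # first query above: all test runs where this release file
            # was part of the (test) dependencies
            return self.by_dep.get(dep_md5, [])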
Changing the server storage and access methods for test results does not
necessarily change the client-facing UI i think. We can still provide the
same json but we produce it differently.
> The only "unsafe" part I can think of is a general issue, malicious
> archives which exhaust server resources.
We can have an upper bound because 640MB archives should be enough
for anyone! But i think for wheels the reading of the metadata does not
require full unpacking but just some lightweight iteration, anyway :)
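For wheels that lightweight iteration really is just zip access; a minimal
sketch (file locations per the wheel spec):

    import zipfile

    def wheel_requires(path):
        # a wheel is a zip; METADATA lives in {name}-{version}.dist-info/
        with zipfile.ZipFile(path) as zf:
            meta = next(n for n in zf.namelist()
                        if n.endswith(".dist-info/METADATA"))
            text = zf.read(meta).decode("utf-8")
        return [line.split(":", 1)[1].strip()
                for line in text.splitlines()
                if line.startswith("Requires-Dist:")]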
FYI i thought about how we make devpi-server know about which project
depends on which other projects. With that information we could do
all kinds of good things:
- display dependency info on the per-project web page or along with release files
(this project depends on ProjectY and ProjectZ)
- display if all recent versions of deps are properly working
and tested with a project's latest release
- could trigger server-side "dependency changed" events so that for example
a tox run could be triggered for the new test configuration
- create pin-versioned requirement files that could be used
with "pip install -r tested-requirements.txt", and/or possibly a UI
like "devpi rinstall X" where it would query the latest set of dependencies
for which tests passed, download all according files and then run
"pip install --no-index FILE1 FILE2 [...]" which wouldn't require
any more network access.
The question is how best to get the (closure) set of dependencies for a
project. I am currently pondering the following possibilities for
obtaining the information on the server side:
- if the project has release files as wheels, look at wheel metadata
which lists deps (requires just virtually unzipping a wheel and looking
at safe metadata files)
- "devpi test" could run "setup.py egg_info" and send the requirements
it finds to the server (requires login), additionally it should probably
"pip list" all test dependencies in the respective tox environments
and add them as well because if test dependencies change, tests should
be re-run as well.
These two methods would not require any change in client-facing UI and
allow us to get and display the dependency information.
Any comments or thoughts on the matter welcome.
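For the second method, a hypothetical client-side helper (no such code
exists in devpi-client) that runs egg_info in an unpacked sdist and reads
the generated requires.txt:

    import glob
    import os
    import subprocess
    import sys

    def egg_info_requires(sdist_dir):
        subprocess.check_call([sys.executable, "setup.py", "egg_info"],
                              cwd=sdist_dir)
        paths = glob.glob(os.path.join(sdist_dir, "*.egg-info", "requires.txt"))
        if not paths:
            return []
        requirements = []
        with open(paths[0]) as f:
            for line in f:
                line = line.strip()
                if line.startswith("["):
                    break  # sections like [extra] list optional extras
                if line:
                    requirements.append(line)
        return requirements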
The caching and private pypi server, devpi-server-2.1.1, is out and
fixes some bugs; see the changelog below.
It is fully backward compatible, no export/import cycle required.
For more info, see http://doc.devpi.net.
holger krekel, merlinux GmbH
- fix replication issue reported by a customer: if a replica lags
behind a master and a file was created and then deleted meanwhile,
the replica could get stuck with a FileReplicationError. We now
let the master report a 410 GONE code so that the replica knows
it can safely proceed because the file was deleted later anyway
(the replica-side logic is sketched after this changelog).
- generate "systemd" configuration example when "--gen-config" is issued.
Thanks Pavel Sedlak.
- fix issue109: fix relative URLs in simple index pages and 404 errors on
uploading toxresults and downloading files when serving under an outside URL
with a sub path. Thanks to Joe Holloway for detailed infos.
- drop limitation on maximum documentation size. Body size is now only
controlled by frontends such as nginx. Thanks Stephan Erb.
- use newer version of virtualenv for jenkins trigger. Thanks brunsgaard.
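The replica-side logic of the replication fix above, as a sketch
(illustrative only, not the actual devpi-server code):

    import requests

    def fetch_replica_file(master_url, relpath):
        r = requests.get(master_url + relpath)
        if r.status_code == 410:
            # the master reports GONE: the file was deleted later on,
            # so the replica can safely skip it instead of getting
            # stuck with a FileReplicationError
            return None
        r.raise_for_status()
        return r.content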
I'm running devpi in docker, and I wrote a little project to manage the
docker containers. It serves devpi through nginx with https by default, and
my idea was to simplify the installation and maintenance of devpi. Maybe
it is useful to someone else. A link to the project:
José Luis Lafuente
I submitted this pull request
to devpi, which would cause devpi to "shadow" package versions across
indexes. I will quote my description of the pull request here:
Say I have package 'a' on index 'x' which lists index 'y' as a base index.
That is, 'x' inherits from 'y'. A common use case is to want to install
package 'a' from 'x' and 'y'. If I wanted the version of 'a' listed in
the 'y' index, I would have listed that index in my pip configuration. But
I want the version of 'a' listed in index 'x', not 'y'.
Currently, if I as a user point pip to index 'x', pip will only install 'a'
from 'x' if the package 'a' is of a later version than the package 'a'
listed in 'y'. But in our scenario, pip should always install the package
'a' listed in index 'x', not 'y', else there would be no need for multiple
repositories at all.
This patch fixes that. With this patch, if devpi is queried for a package
'a' from index 'x', it will get package 'a' from 'x' every time, even if a
newer version of 'a' is listed in index 'y'.
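A toy model of the lookup the patch implements (illustrative only, not
devpi's actual code): return versions from the first index in the
inheritance chain that carries the project at all, instead of merging
versions across bases; for simplicity only the first base is followed:

    def resolve(project, index_name, indexes):
        # indexes: name -> {"bases": [names], "projects": {name: [versions]}}
        current = index_name
        while current is not None:
            index = indexes[current]
            if project in index["projects"]:
                # "shadowing": the first index that has the project wins,
                # even if a base lists a newer version
                return current, index["projects"][project]
            bases = index["bases"]
            current = bases[0] if bases else None
        return None, []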
The pull request in its current form would have changed the way devpi pulls
packages, and would have broken important workflows. However, for other
workflows, I feel this feature is relevant and handy. I therefore request that:
1. The "shadow indexes" feature be added to devpi, although by default
it be *turned off*.
2. The feature should/could be enabled either by command line option or
I think that index shadowing isn't something for everyone, but I think a
lot of people would find it useful. We certainly would in our shop.
Is it possible (or desirable) that devpi be able to trigger builds of
wheels for packages that it caches? I realise it'd be possible as part of a
Jenkins trigger for an upload, but this is for cached packages from PyPI.
The actual building could be complex - in the case of current Linux wheels,
we'd need to have separate indexes for the wheels for Debian, RHEL, Ubuntu,
SUSE, etc. since the wheels are typically not compatible. That complexity
could be external to devpi - it's just a case of getting the trigger to
indicate that a build is desired.
Quite how this integrates with a user just doing a "pip install package" is
a little beyond me for the moment - the delay between the cache event and
build completing could be significant. In light of that it might be
desirable to have a separate process which does all this work (fetches the
latest package and performs the build). Has anyone done this? How would it
keep up to date?
I'm having an issue where devpi-web doesn't know when I'm connecting
over https, and serves the links as http.
This breaks the stylesheets and iframes for documentation.
How do I get it to convert the links to https?
Currently you can't use "devpi test" with wheel files because they
don't contain the tests unless you put all tests inside the software
archive. To get the tests we could require that there is an "sdist"
archive file containing the "tox.ini" file and the tests. Thus
"devpi test X" would potentially download two files, a platform
specific one and the sdist to configure and perform testing.
Apart from running the tests against the sdist we probably also
want to run the tests against the installed sdist. Not sure
i grasp all implications yet (also UI-wise) but do any of you
have preliminary thoughts on the issue?
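To make the two-file idea concrete, a rough sketch of the link selection
"devpi test" might do (hypothetical, not current devpi-client behavior):

    def pick_test_links(links):
        # install from the best-matching platform-specific file if any,
        # but always fetch the sdist too for tox.ini and the tests
        sdist = next((l for l in links
                      if l.endswith((".tar.gz", ".zip"))), None)
        wheel = next((l for l in links if l.endswith(".whl")), None)
        return [l for l in (wheel, sdist) if l is not None]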