Dan and I have been doing most of the maintenance work for Pipenv recently, and as Dan mentioned,
we have been working on some related projects that poke into pip internals significantly, so I feel I
should voice some opinions. I have significantly less experience messing with pip than Dan, and might
be able to offer a slightly different perspective.
Pipenv mainly interacts with pip for two things: installing/uninstalling/upgrading packages, and gaining
information about a package (what versions are available, what dependencies a particular version has, etc.).
For the former, we currently invoke pip in a subprocess, which is likely the intended way of
interaction. I have to say, however, that the experience is not flawless. pip has a significant startup time,
and offers no chance for interaction once it starts running, so we really don’t have a good
way to, for example, provide an installation progress bar for the user, unless we parse pip’s stdout directly.
These are not essential to Pipenv’s functionality, however, so they are more of an annoyance than
glaring problems.
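To illustrate the subprocess interaction: a minimal sketch of how a tool like Pipenv can stream pip’s output (the helper names here are ours, not pip’s):

```python
import subprocess
import sys

def run_streaming(cmd, on_line):
    """Run a command and feed each line of its output to a callback.

    Since pip offers no interaction channel once it starts running,
    parsing its stdout line by line is the only way to surface
    progress information to the user.
    """
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    for line in proc.stdout:
        on_line(line.rstrip("\n"))
    return proc.wait()

def run_pip_streaming(pip_args, on_line):
    # Always address pip through the running interpreter, so the right
    # environment's pip is used.
    return run_streaming([sys.executable, "-m", "pip", *pip_args], on_line)
```

Usage would look like `run_pip_streaming(["install", "requests"], print)` (the package name is illustrative).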
The other thing Pipenv uses pip for—getting package information—is more troubling (to me, personally).
Pipenv has a slightly different need from pip regarding dependency resolution. pip can (and does) freely
drop dependencies that do not match the current environment, but Pipenv needs to generate a lock file
for an abstract platform that works for, say, both macOS and Windows. This means pip’s resolver is not
useful for us, and we need to implement our own. Our own resolver, however, still needs to know about
the packages it gets, and we are left with two choices: (a) try to re-implement the same logic, or (b) use
pip internals to cobble something together.
We tried to go with (a) for a while, but as you can easily imagine, our own implementation was buggy, could
not handle edge cases nearly as well, and fielded a lot of complaints along the lines of “I can do this in pip,
why can’t I do the same in Pipenv?” One example is how package artifacts are discovered. At first glance, I
thought to myself this wouldn’t be that hard—we have the Simple API, and the naming conventions are
there, so as long as we specify sources in Pipfile (we do), we should be able to discover them no problem.
I couldn’t have been more wrong. There are find_links, dependency_links, pip.conf for the user, for the
machine, all sorts of things, and for every quirk in pip we don’t replicate 100%, issues are filed urging us
to fix it. In the end we gave up and used pip’s internal PackageFinder instead.
This is a big problem going forward, and we are fully aware of that. The strategy we are taking at the
moment is to try to limit the surface area of pip internals usage. Dan mentioned we have been building a
resolver for Pipenv, and we took the chance to work toward centralising things interfacing with pip
internals. Those are still internals, of course, but we now have a relatively good idea what we actually need
from pip, and I’d be extremely happy if some parts of pip can come out as standalone with official blessing.
The things I am particularly interested in (since they would be beneficial for Pipenv) are:
* WheelBuilder (and everything that comes with it like the wheel cache, preparer, unpack_url, etc.)
Sorry for the very long post, but I want to get everything out so it might be easier to paint a complete picture
of the state we are currently in.
https://github.com/sarugaku/passa
Tzu-ping Chung (@uranusjr)
> On 21/8, 2018, at 00:00, distutils-sig-request(a)python.org wrote:
> Today's Topics:
> 1. Re: pipenv and pip (Dan Ryan)
> 2. Re: pipenv and pip (Dan Ryan)
> From: Dan Ryan <dan(a)danryan.co>
> Subject: [Distutils] Re: pipenv and pip
> Date: 20 August 2018 at 22:04:11 GMT+8
> To: Chris Jerdonek <chris.jerdonek(a)gmail.com>
> Cc: distutils sig <distutils-sig(a)python.org>
> The truth is that it’s basically impossible to gauge whether a bug is in pip or in our patches to it, and the latter are often a lot more likely. Reproductions of edge cases can be impossible, but there are specific things I know we broke (like parsing certain kinds of extras, previously). Mostly, bugs land in pip's issue tracker before we report them, or we will direct people there. We have around 2 active maintainers and we are maintaining around 15 Pipenv-related projects, so we normally just point people at pip rather than file an issue. I am usually on IRC as well if needed, and I often ask for clarification there.
> Dan Ryan // pipenv maintainer
> gh: @techalchemy
>> On Aug 20, 2018, at 4:32 AM, Chris Jerdonek <chris.jerdonek(a)gmail.com> wrote:
>> Thanks. Is the state of affairs as you described them what you're
>> planning for the future as well, or do you anticipate any changes
>> worthy of note?
>> Also, are any of the bugs filed in pipenv's tracker due to bugs or
>> rough spots in pip -- is there a way to find those, like by using a
>> label? It would be good to be able to know about those so pip can
>> improve and become more useful. It doesn't seem like any bugs have
>> been filed in pip's tracker in the past year by any of pipenv's top
>> contributors. That seems a bit surprising to me given pipenv's heavy
>> reliance on pip (together with the fact that I know pip has its share
>> of issues), or is there another way you have of communicating
>> regarding things that interconnect with pip?
>>> On Mon, Aug 20, 2018 at 12:51 AM, Dan Ryan <dan(a)danryan.co> wrote:
>>> Sure, I can grab that. We patch pip because we use some internals to handle resolution, and we have some bugs around that currently. They aren’t upstreamed because they aren’t actually present in pip, only in Pipenv; Pipenv crosses back and forth across the virtualenv boundary during the process. Pipenv relies on pip-tools and vendors a patched version of pip to ensure consistency, as well as to provide a few hacks around querying the index. We do have a bit of reimplementation of some kinds of logic, with the largest overlap being in the parsing of requirements.
>>> As we handle some resolution, which isn’t really something pip does, there is no CLI interface to achieve this. I maintain a library (as of last week) which provides compatibility shims between pip versions 8 through current. It is a good idea to use the CLI, but we already spend enough resources forking subprocesses into the background that it is a lot more efficient to use the internals, which I track quite closely. The preference toward CLI interaction is largely to allow internal API breakage, which we don’t mind.
>>> For the most part, we have open channels of communication as necessary. We rely as heavily as we can on pip, packaging, and setuptools to connect the dots, retrieve package info, etc.
>>> Dan Ryan // pipenv maintainer
>>> gh: @techalchemy
>>>> On Aug 20, 2018, at 2:41 AM, Chris Jerdonek <chris.jerdonek(a)gmail.com> wrote:
>>>> Can someone explain to me the relationship between pipenv and pip,
>>>> from the perspective of pipenv's maintainers?
>>>> For example, does pipenv currently reimplement anything that pip tries
>>>> to do, or does it simply call out to pip through the CLI or through
>>>> its internal API's? Does it have any preferences or future plans in
>>>> this regard? How about upstreaming to pip fixes or things that would
>>>> help pipenv?
>>>> I've been contributing to pip more lately, and I had a look at
>>>> pipenv's repository for the first time today.
>>>> Given that pip's code was recently made internal, I was a bit
>>>> surprised to see that pipenv vendors and patches pip:
>>>> Before I had always assumed that pipenv used pip's CLI (because that's
>>>> what pip says you should do).
>>>> I also noticed that some bugs in pipenv's tracker seem closely related
>>>> to pip's behavior, but I don't recall seeing any bugs or PR's in pip's
>>>> tracker reported from pipenv maintainers.
>>>> Without knowing a whole lot more than what I've stated, one concern I
>>>> have is around fragmentation, duplication of work, and repeating
>>>> mistakes (or introducing new ones) if a lot of work is going into
>>>> pipenv without coordinating with pip. Is this in any way similar to
>>>> the beginning of what happened with distutils, setuptools, and
>>>> distribute that we are still recovering from?
>>>> Distutils-SIG mailing list -- distutils-sig(a)python.org
>>>> To unsubscribe send an email to distutils-sig-leave(a)python.org
>>>> Message archived at https://firstname.lastname@example.org/message/…
> From: Dan Ryan <dan(a)danryan.co>
> Subject: [Distutils] Re: pipenv and pip
> Date: 20 August 2018 at 22:09:23 GMT+8
> To: Paul Moore <p.f.moore(a)gmail.com>
> Cc: Distutils <distutils-sig(a)python.org>
> How would I (said library) maintain compatibility? I’m pretty clever. The shim library doesn’t actually do anything besides provide import paths. If I shim something that doesn’t exist, it shims None for any pip versions where it doesn’t exist. So, for example, if you are running pip 9 and you import RequirementTracker from the shims library, you just import None.
> Dan Ryan // pipenv maintainer
> gh: @techalchemy
>> On Aug 20, 2018, at 8:15 AM, Paul Moore <p.f.moore(a)gmail.com> wrote:
>>> On Mon, 20 Aug 2018 at 12:25, Wes Turner <wes.turner(a)gmail.com> wrote:
>>> On Monday, August 20, 2018, Paul Moore <p.f.moore(a)gmail.com> wrote:
>>>> I know "security by obscurity" doesn't work, but I'm happier if
>>>> details of this library *aren't* widely known - it's not something I'd
>>>> want to encourage people using, nor is it supported by pip, as it's
>>>> basically a direct interface into pip's internal functions, papering
>>>> over the name changes that we did in pip 10 specifically to dissuade
>>>> people from doing this.
>>> If someone was committing to identifying useful API methods, parameters, and return values;
>>> writing a ~PEP;
>>> implementing said API;
>>> and maintaining backwards compatible shims for some reason;
>>> would something like `pip.api` be an appropriate namespace?
>>> (now that we're on version 18 with a faster release cycle)?
>> I'm not quite sure I know what you mean here. The key point is that
>> pip 18.0 might have an internal function pip._internal.xxx, and in pip
>> 18.1 there's no such function, and the functionality doesn't even
>> exist any more. How would a 3rd party project maintain backwards
>> compatible shims in the face of that? Agreed it's not likely in
>> practice - but we're not going to guarantee it.
>> To be honest I don't see the point of discussing pip's internal API.
>> It's just that - internal. I'd rather discuss useful (general)
>> packaging libraries, that tools can build on - pip can vendor those
>> and act as (just) another consumer, rather than getting into debates
>> about support and internal APIs.
I'd like to run different external programs during the install command
which are provided by (potentially) installed extras, but I cannot find
where the currently selected extras are stored.
The only thing I quickly found by going through the distutils and
setuptools codebases is an old TODO from easy_install indicating it might
not have been implemented yet:
I can hack around this with a try/except on ImportError, but is there a more
elegant (and intended) way?
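For reference, the try/except hack mentioned above looks roughly like this (a minimal sketch; as far as I know, nothing records which extras were selected at install time, so probing for the extra's dependencies is the usual workaround):

```python
import importlib

def extra_available(module_name):
    """Return True if an optional dependency (one that an extra would
    pull in) is importable in the current environment.

    This only detects whether the dependency is installed, not whether
    the extra itself was requested -- the two can differ if the module
    was installed by some other package.
    """
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:
        return False
```

A caller would then branch on, e.g., `extra_available("socks")` (the module name is illustrative).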
According to the documentation, TestPyPI is a completely independent instance from PyPI, and they don’t share databases at all. You need to register a separate account there.
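Once the separate account exists, a `~/.pypirc` entry saves retyping the repository URL (the username below is a placeholder):

```ini
[distutils]
index-servers =
    pypi
    testpypi

[testpypi]
repository = https://test.pypi.org/legacy/
username = your-testpypi-username
```

With that in place, `twine upload --repository testpypi dist/*` targets TestPyPI directly.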
Tzu-ping Chung (@uranusjr)
Sent from my iPhone
> On 16 Aug 2018, at 23:55, tom(a)tombaker.org wrote:
> When I follow the instructions at
> to upload my distribution archives with
> $ twine upload --repository-url https://test.pypi.org/legacy/ dist/*
> I get prompted for my username and password, but the upload aborts with
> `HTTPError: 403 Client Error` (see below) even though I can successfully log
> into https://pypi.org using the same username and password.
> Can anyone advise?
> Uploading distributions to https://test.pypi.org/legacy/
> Enter your username: tombaker
> Enter your password:
> Uploading mklists-0.1.1-py3-none-any.whl
> 100%|*************************************************************************| 10.6k/10.6k [00:00<00:00, 15.8kB/s]
> HTTPError: 403 Client Error: Invalid or non-existent authentication information. for url: https://test.pypi.org/legacy/
I'm updating some instructions for my students, in which the first thing I
do is have them run ensurepip:
$ python3 -m ensurepip --upgrade
which resulted in:
$ python3 -m ensurepip --upgrade
Looking in links:
Requirement already up-to-date: setuptools in
Requirement already up-to-date: pip in
(this is after a brand-new python 3.7 install on OS-X from python.org)
All good. But then I use pip, and get (after a successful install):
You are using pip version 10.0.1, however version 18.0 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Huh? Shouldn't ensurepip have updated it for me already?
Or should I simply suggest the
pip install --upgrade pip
command and not bother with ensurepip anymore?
BTW -- shouldn't that be:
python3 -m pip install --upgrade pip
to make sure they get the "right" pip?
Kinda hard to keep up with the latest ...
Christopher Barker, Ph.D.
Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception
>> On Jul 24, 2018, at 4:36 AM, Nick Coghlan <ncoghlan(a)gmail.com> wrote:
>> However, there *are* folks that have been working on allowing
>> applications to be defined primarily as Python projects, and then have
>> the creation of wrapper native installers be a pushbutton exercise,
>> rather than requiring careful human handholding.
> But it sounds like they also want to be able to install/remove/upgrade
> *parts* of the Python project, for their plugin support.
That’s correct. We currently have 18 official plugins for Certbot with plans to add more and a few dozen third-party plugins.
> Do any of these tools allow that?
This is a good question. If we went with something like dh-virtualenv or packaged virtualenvs with fpm, would we be able to have separate packages for our plugins that installed things in the same virtualenv? I haven’t looked into this yet, but I wouldn’t expect this to work.
> That's the thing that really made me think about conda.
My biggest concern with conda right now is I believe we (or our users) would be on their own for setting up a cron job or systemd timer for renewal. Is this correct?
> On Jul 24, 2018, at 11:20 PM, Chris Jerdonek <chris.jerdonek(a)gmail.com> wrote:
> Just to be clear, I wasn't meaning to promote or recommend the Docker
> option I described.
Sure! After reading your 2nd email describing how this would work in more detail, I think this would require a pretty major rewrite to how Certbot currently works. Given the other downsides, I’m not sure this is the best approach for us, but I appreciate you throwing out the idea regardless just in case it was!
> On Jul 26, 2018, at 3:20 AM, Ben Finney via Distutils-SIG <distutils-sig(a)python.org> wrote:
> Just focus on Certbot, and cheer from the sidelines as the OS distributions
> do the work of third party packages.
> Yes, that's a different set of problems (for example, keeping Certbot
> compatible with those versions of the libraries the OS repositories
We’ve done the work to maintain compatibility with the older versions of our dependencies available in the official OS repos where we are packaged. The source of our problems with official OS packaging is described below and in the Google Doc linked in my initial email.
> On Jul 28, 2018, at 8:53 AM, matthew(a)woodcraft.me.uk wrote:
> I wouldn't be too put off by the idea of Debian politics. Certbot should be a good fit for stable-updates:
We thought so too and getting updates like this was our main packaging plan when we launched in 2015. Unfortunately, it hasn’t gone well and is the main reason we’re seeking our own solution. Perhaps we did something incorrectly, but as Nathaniel pointed out, Certbot is broken in Debian Stretch and has been since January. The same and many other problems affect the packages in Ubuntu Xenial. We’ve also struggled to find people to help maintain our PPA for older, non-EOL’d versions of Ubuntu.
If anyone reading this would like to help us solve these problems or know someone who would, please reach out off-list. While these packages exist in OS repos, some users will continue to use them regardless of the alternative packaging approach we take. Unless the current issues are resolved and we’re confident new ones in the future will be fixed quickly as well, I think we need to offer alternative packaging so we can provide our users some means of getting a working version of Certbot.
> On Jul 28, 2018, at 12:00 PM, Wes Turner <wes.turner(a)gmail.com> wrote:
> Took a minute more to read the gdoc link.
Wes Turner, while I don’t have any specific questions or comments right now, thank you very much for all the ideas and links.
On 2018-08-06 19:49, Chris Jerdonek wrote:
> Does this proposal mean that .pyc files would need to be regenerated
> after installing from a wheel (because the timestamp inside the .pyc
> file would no longer match the .py file from which it came)?
No, I specifically mentioned that the timestamps of .py files would be preserved.
Proposal: change timestamps of files installed from a wheel to the time
of installation, except for .py files.
Let's consider two ways of installing a Python package: installation
from a source tarball and installation from a wheel, in both cases using pip.
The installation roughly works as follows:
(A) Create a temporary directory for pip to work in.
(B) Extract the archive (sources or wheel) somewhere in the temporary directory.
(C) Build the package in the temporary directory (in both cases, this
involves creating .pyc files for the .py files)
(D) Copy the installation from the temporary directory to the final
installation directory (typically somewhere in site-packages)
(E) Delete the temporary directory
Both (B) and (D) involve a copy operation of some sort, so the question
is: what to do with timestamps? In both cases, (D) currently preserves
the timestamp of the files. For (B), timestamps are preserved when
extracting a tarball but not when extracting a wheel.
Change step (D) for wheel installs to NOT preserve timestamps. It might
be safer to make sure that all installed files have the same timestamp,
so this would mean: give all installed files the timestamp of a fixed
time, namely the time when step (D) was started in the installation process.
Important exception: for .py files, do preserve the timestamp.
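The proposed step (D) behaviour can be sketched as follows (a minimal illustration, not pip's actual code; `paths` stands in for whatever list of installed files the installer produces):

```python
import os
import time

def stamp_installed_files(paths, install_time=None):
    """Give every installed file one fixed timestamp (the time step (D)
    started), except .py files, whose original timestamps must survive
    so that .pyc validation against the source mtime keeps working."""
    if install_time is None:
        install_time = time.time()
    for path in paths:
        if path.endswith(".py"):
            continue  # the proposal's exception: preserve .py timestamps
        os.utime(path, (install_time, install_time))
```

Using a single fixed time for all files, rather than calling `time.time()` per file, matches the "same timestamp for everything" safety suggestion above.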
I think that (B) should preserve timestamps. However, that is a
different issue which should be fixed independently.
For a source installation, the timestamps are preserved in both (B) and
(D), so the timestamp of the installed files can be much older than the
time of installation.
This is a problem for dependency checking: suppose that I am the
developer of a package MYPKG which has a dependency on the C API of a
package OTHERPKG. The C API of OTHERPKG is defined by .h files which are
installed by OTHERPKG. The package MYPKG correctly declares this
dependency using the "depends" keyword of the Extension objects. Now,
when I install a different version of OTHERPKG, I want distutils to
notice that the dependency has changed and automatically rebuild
the extensions with that dependency.
This requires that the timestamp of the installed .h files is more
recent than the timestamp of the compiled extension. In other words, the
timestamp of installed files should be the time of installation.
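The dependency declaration referred to above looks like this in a setup script (a hypothetical fragment: all package names and paths are made up, but the `depends` keyword is the real mechanism):

```python
# Hypothetical setup.py fragment for MYPKG.
try:
    from setuptools import Extension
except ImportError:
    from distutils.core import Extension  # very old environments

ext = Extension(
    "mypkg._core",
    sources=["mypkg/_core.c"],
    include_dirs=["/usr/include/otherpkg"],
    # Rebuild _core whenever OTHERPKG's installed header changes --
    # which only works if the header's mtime reflects install time.
    depends=["/usr/include/otherpkg/api.h"],
)
```

The build system compares the mtimes of the `depends` files against the built objects, which is precisely why install-time timestamps matter here.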
A similar situation happens for a Cython API exposed by installed .pxd
(and possibly also .h) files.
**Comparison with automake**
Automake (which is probably the most widely used build system for open
source/free software) installs files with their timestamps set to the time
of installation. So, my proposal is doing what automake already does.
**The strange case of .pyc files**
As noted in https://github.com/pypa/pip/issues/5648, it is important that
the timestamps of .py files are preserved after the .pyc file is created.
This is because Python checks the exact timestamp of the .py file to
determine whether it is up to date.
This explains the exception for .py files in the proposal.
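The check can be seen directly in the .pyc header, which on CPython 3.7+ (the PEP 552 layout) stores the source's mtime for timestamp-validated files. A small demonstration, assuming that layout:

```python
import os
import py_compile
import struct
import tempfile

# Write a tiny module, byte-compile it, and read the recorded source
# mtime back out of the .pyc header (4-byte magic, 4-byte flags,
# 4-byte source mtime, 4-byte source size).
src = os.path.join(tempfile.mkdtemp(), "demo.py")
with open(src, "w") as f:
    f.write("x = 1\n")

pyc = py_compile.compile(
    src,
    doraise=True,
    invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP,
)
with open(pyc, "rb") as f:
    magic, flags, mtime, size = struct.unpack("<4sIII", f.read(16))

# flags == 0 marks a timestamp-validated .pyc; the stored mtime must
# equal the source's mtime exactly, or Python treats the .pyc as stale.
assert flags == 0
assert mtime == int(os.path.getmtime(src)) & 0xFFFFFFFF
```

So if an installer rewrote the .py file's timestamp after step (C), every stored mtime would mismatch and all the shipped .pyc files would be recompiled.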
In other words: won't this break packages which rely on timestamps being
preserved? When installing a wheel, timestamps from the wheel are already
not preserved in step (B). So any package which relied on preserved
timestamps would already be broken when installed as a wheel. So this
proposal can only break stuff which depends on timestamps from step (C),
which happens to be the case for .py files.
In my proposal, I'm only suggesting to make changes for wheel installs.
I do think that the change makes sense for installs from source, but
that will be irrelevant after PEP 517 is implemented anyway.