Pain installing pinned versions of setuptools requirements.
Today, I ran into trouble working with an old project that had six pinned to version 1.1.0. The install failed: buildout ended up trying to install it as 1.10.0, which failed because 1.10.0 was already installed. The problem arose because six's setup.py imports setuptools and then imports six to get __version__. When Buildout runs a setup script, it puts its own path ahead of the distribution's, so the setup script got whatever six version buildout itself was running with. IMO, this is a six bug, but wait, there's more.

I tried installing a pinned version with pip, using ``pip install -U six==1.9.0``. This worked. I then tried with version 1.1.0, and this failed, because setuptools wouldn't work with six 1.1.0. Pip puts the distribution ahead of its own path when running a setup script. setuptools requires six >= 1.6, so pip can't be used to install pinned versions (in requirements.txt) earlier than 1.6. Six is a wildly popular package and has been around for a long time. Earlier pins are likely.

I raise this here in the broader context of managing clashes between setuptools' requirements and the requirements of the libraries (and the applications using them) that it's installing. I think Buildout's approach of putting its own path first is better, although it was more painful in this instance.

I look forward to a time when we don't run scripts at install time (or are at least far less likely to). Buildout is growing wheel support. It should have provided a workaround, but:

- I happened to be trying to install a 1.1 pin, and the earliest six wheel is for 1..

- I tried installing six 1.8. Buildout's wheel extension depended on pip, which depends on setuptools and six. When buildout tries to load the extension, it tries to get the extension's dependencies, which include six, while honoring the version pin, which means it has to install six before it has wheel support. Obviously, this is Buildout's problem, but it illustrates the complexity that arises when packaging dependencies overlap the dependencies of the packages being managed.

IDK what the answer is. I'm just (re-)raising the issue and providing a data point.

I suspect that packaging tools should manage their own dependencies independently. That's what was happening until recently, IIUC, for the pypa tools through vendoring. I didn't like vendoring, but I'm starting to see the wisdom of it. :)

Jim

--
Jim Fulton
http://jimfulton.info
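To make the failure mode above concrete, here is a minimal sketch of the pattern six's setup.py used at the time (a paraphrase, not the exact upstream code): the setup script imports the module it is building in order to read __version__, so whichever six happens to be first on sys.path determines the version the build records.

    # Sketch only -- paraphrases the pattern, not six's actual setup.py.
    from setuptools import setup

    import six  # whichever six is first on sys.path wins

    setup(
        name="six",
        version=six.__version__,  # under buildout's path order this reads the
                                  # already-installed 1.10.0, not the 1.1.0 sdist
        py_modules=["six"],
    )

Run with the installer's own path first (as buildout does), the import resolves to the already-installed 1.10.0 and the built distribution is mislabeled; run with the unpacked sdist first (as pip does), it resolves to the 1.1.0 sources, but then setuptools itself, which requires six >= 1.6, breaks.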
On Apr 2, 2017, at 1:24 PM, Jim Fulton <jim@jimfulton.info> wrote: <Snip>
Can you post this on https://github.com/pypa/setuptools/issues/980? That’s where most of the discussion from the fallout of setuptools devendoring has concentrated.

— Donald Stufft
On Sun, Apr 2, 2017 at 1:29 PM, Donald Stufft <donald@stufft.io> wrote:
On Apr 2, 2017, at 1:24 PM, Jim Fulton <jim@jimfulton.info> wrote: <Snip>
Can you post this on https://github.com/pypa/setuptools/issues/980? That’s where most of the discussion from the fallout of setuptools devendoring has concentrated.
Yup.

Jim

--
Jim Fulton
http://jimfulton.info