Hi All,

pip is great for installing things, except when there are C extensions. If I run "pip install pillow" out of the box, I get a bunch of information on the console, with an error buried in there somewhere. The problem is that I don't have the necessary header files and libraries installed, but the error message doesn't really say that. Having run into the problem enough times, I have now learned to go to Pillow's website and read up on what I need to install.

Is there any way packages could list external dependencies? Maybe something like requires = ['Python.h', 'somethingelse.h']. I realize there's not much pip can do about it automatically, but maybe pip could try to figure out whether you have those files around and warn if it doesn't see them. Eventually, I could imagine a distro-specific tool coming along that could more reliably translate those dependencies into system packages. Something like:

yum whatprovides '/usr/include/*/Python.h'

But, for starters, I think somehow listing external dependencies would be helpful. Ideally it would be great if "pip install pillow" Just Worked, though I imagine that's pretty much impossible: we would likely need to start packaging the C dependencies on PyPI, and would probably end up approximating something like Gentoo Portage, which I assume is a road we don't want to go down.

Is there anything we can do to improve the situation?

Thanks,
Collin
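To make the idea concrete, here's a rough sketch of the kind of check I'm imagining (the requires_headers name and the search paths are made up; pip does nothing like this today):

```python
import os
import sysconfig

def find_missing_headers(required):
    """Return the subset of declared headers that can't be found in
    a few common include directories. Purely illustrative: real
    systems have many more possible locations."""
    search_dirs = [
        sysconfig.get_path("include"),      # e.g. /usr/include/python3.x
        sysconfig.get_path("platinclude"),
        "/usr/include",
        "/usr/local/include",
    ]
    return [
        header for header in required
        if not any(os.path.isfile(os.path.join(d, header))
                   for d in search_dirs if d)
    ]

# A package might declare: requires_headers = ["Python.h", "zlib.h"]
# and pip could then warn about whichever ones it can't find.
```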
On 9 September 2014 16:59, Collin Anderson wrote:
Is there any way packages could list external dependencies? Maybe something like requires = ['Python.h', 'somethingelse.h']. I realize there's not much that pip can do about it automatically, but maybe pip could try to figure out if you have those files around and warn if it doesn't see them.
I don't think pip could realistically check whether such files exist (there are too many variations in where they might be, etc.), but it might be plausible to allow a package to declare its external dependencies somehow, maybe just as free text, and then if the package build fails, pip could display that information. Maybe have a build_hint metadata item that pip could display if a build fails. That could be reasonably easy with Metadata 2.0, which allows for metadata extensions.

The trick is to get packages to include such a thing. It would be a shame to add the mechanism and never have it be used.

Paul
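For instance, such a hint might look like this in the package's JSON metadata (entirely invented; Metadata 2.0 defines no build_hint extension, this is just one shape it could take):

```json
{
    "metadata_version": "2.0",
    "name": "pillow",
    "extensions": {
        "build_hint": "Building from source needs libjpeg and zlib development headers (e.g. libjpeg-dev/zlib1g-dev on Debian, libjpeg-devel/zlib-devel on Fedora)."
    }
}
```

On a failed build, pip would simply print the extension's text verbatim, with no attempt to interpret it.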
On Sep 11, 2014, at 3:10 AM, Paul Moore wrote:
I think most (all?) of the Linux distros have some method of saying “Tell me what package provides this file”. Perhaps this code wouldn't live in pip itself, but pip could have an extension point so that people could provide a pip-apt or whatever. That of course doesn't help Windows or OS X, so it might not be worth it (though I imagine someone could build up such a database there too and just map dependencies to .exe's or project home pages or something). Perhaps the gains wouldn't be worth the complexity, though, and it'd just be easier to allow projects to have a build hint that gets printed if the build fails.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 11 September 2014 08:15, Donald Stufft wrote:
Perhaps the gains wouldn’t be worth the complexity though and it’d just be easier to allow projects to have a build hint thing that gets printed if the build fails.
Putting it in core pip sounds to me like a recipe for endless system-specific hacks, TBH. Having a plugin system that allowed external packages to add (and maintain!) system-specific checks might work, but that's pretty complex.

Paul
On Sep 11, 2014, at 4:37 AM, Paul Moore wrote:
Yes, to be specific, the only thing I would personally be OK with adding to the pip core is something that adds the appropriate hooks to let some other tool provide the platform-specific mechanisms. I'm still not sure it's worth the effort over the simpler idea of just providing a build_hint metadata field that authors can use to say "Hey, you need to install libxml2 for this thing" or whatever.
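Sketching what such a hook could look like (the pip.external_deps entry-point group and every name here is invented; pip has no such hook today):

```python
from importlib.metadata import entry_points

HOOK_GROUP = "pip.external_deps"  # hypothetical entry-point group

def find_dep_plugins():
    """Discover installed plugins (e.g. a hypothetical pip-apt or
    pip-dnf package) that can map dependency hints to system packages."""
    eps = entry_points()
    try:
        selected = eps.select(group=HOOK_GROUP)   # Python 3.10+
    except AttributeError:
        selected = eps.get(HOOK_GROUP, [])        # older dict-style API
    return [ep.load() for ep in selected]

def report_build_failure(build_hint):
    """On a failed build, let each plugin try to turn the author's
    free-text hint into concrete advice; fall back to the raw hint."""
    for plugin in find_dep_plugins():
        advice = plugin(build_hint)
        if advice:
            return advice
    return "Build failed. Hint from the package author: " + build_hint
```

With no plugin installed you get the free-text hint; a pip-dnf plugin could instead return "sudo dnf install libxml2-devel". Either way, pip core only knows about the hook, not about any distro.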
On 11 September 2014 18:48, Donald Stufft wrote:
It actually occurs to me that the GNU autoconf directory scheme may be useful here: if people define their dependencies in terms of that scheme, it should be possible for a plugin to figure out how to ask the OS installer for them, or otherwise check for them in a virtualenv or conda environment. And, if no such plugin is available, the fallback option would be to just tell the user what's missing.

(FWIW, the only major barrier I see to formalising the metadata 2.0 spec at this point is the lack of up-to-date jsonschema files. Getting that out the door may be something to explore post pip 1.6.)

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
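A sketch of how a dependency declared against autoconf directory names might be checked locally (the declaration syntax and the prefix list are invented for illustration; nothing here is specified anywhere):

```python
import os

# Map autoconf directory variables to their conventional subdirectory.
AUTOCONF_DIRS = {"includedir": "include", "libdir": "lib", "bindir": "bin"}

# Prefixes to probe; a conda or virtualenv prefix could be added here.
PREFIXES = ["/usr", "/usr/local", os.environ.get("VIRTUAL_ENV", "")]

def is_satisfied(declaration):
    """Check a declaration like 'includedir:libxml2/libxml/parser.h'
    against common prefixes. A distro plugin could instead hand the
    same relative path to e.g. 'dnf provides' to name the missing
    system package."""
    kind, _, relpath = declaration.partition(":")
    subdir = AUTOCONF_DIRS[kind]
    return any(os.path.exists(os.path.join(prefix, subdir, relpath))
               for prefix in PREFIXES if prefix)
```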
On Sep 11, 2014, at 8:18 AM, Nick Coghlan wrote:
I'd like to take a close look at Metadata 2.0 and see about doing some proof-of-concept implementations before we actually accept that PEP. Basically the same thing I did for PEP 440; I think the feedback from actually attempting to use it was invaluable.
On 11 September 2014 22:20, Donald Stufft wrote:
I’d like to take a close look at Metadata 2.0 and see about doing some proof of concept implementations before we actually accept that PEP. Basically the same thing I did for PEP 440. I think the feedback from actually attempting to use it was invaluable.
Absolutely! As it turns out, I was wrong anyway. Checking the issue list at https://bitbucket.org/pypa/pypi-metadata-formats/issues?status=new&status=open&component=Metadata%202.x shows there is still at least the recommendation to use SPDX tags that I'd like to add to PEP 459.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
----- Original Message -----
Yes, to be specific, the only thing I would personally be OK with adding to the pip core is something that adds the appropriate hooks to let some other tool provide the platform-specific mechanisms. I'm still not sure it's worth the effort over the simpler idea of just providing a build_hint metadata field that authors can use to say "Hey, you need to install libxml2 for this thing" or whatever.
While working on packaging Ruby and RubyGems for Fedora, we actually used a RubyGems hook to create a plugin that did precisely this; it's called gem-nice-install [1]. (We did use it for some time, but I'm not sure whether it's still being actively developed and used.) We actually went a step further and implemented the *actual installation* in that plugin (sending the list of packages to install to PackageKit via D-Bus), and it worked really nicely.

I think allowing plugins is much better than providing a build hint, because a build hint may not tell you *what* to install (different distros name some packages differently), but most importantly it can't tell you *how* the missing packages should be installed. And IMO upstreams shouldn't care about this (they shouldn't *need* to care); this is the work of distro packagers. So I vote for plugins.

Thanks,
Slavek
Donald Stufft wrote:
Yes, to be specific, the only thing I would personally be OK with adding to the pip core is something that adds the appropriate hooks to let some other tool provide the platform-specific mechanisms. I'm still not sure it's worth the effort over the simpler idea of just providing a build_hint metadata field that authors can use to say "Hey, you need to install libxml2 for this thing" or whatever.
I'd like to see us use PEP 459 extensions to hold distro-specific dependencies (as mentioned here: https://bitbucket.org/pypa/pypi-metadata-formats/issue/16/external-requireme... ). I'm imagining a cross-distro, community-maintained project that holds all the JSON dependency data for all of PyPI, and then tooling could build up around that?
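As a strawman, an entry in such a shared database might look like this (the shape and key names are invented, and the distro package names are only examples that vary by release):

```json
{
    "lxml": {
        "external_requires": ["libxml2", "libxslt"],
        "map": {
            "debian": ["libxml2-dev", "libxslt1-dev"],
            "fedora": ["libxml2-devel", "libxslt-devel"]
        }
    }
}
```

A pip-apt or pip-dnf plugin could then look up the failing project in this data and suggest the right system packages, without the upstream author having to know any distro's naming conventions.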
participants (6)

- Bohuslav Kabrda
- Collin Anderson
- Donald Stufft
- Marcus Smith
- Nick Coghlan
- Paul Moore