[Distutils] Beyond wheels 1.0: helping downstream, FHS and more

Daniel Holth dholth at gmail.com
Mon Apr 13 21:19:27 CEST 2015


On Mon, Apr 13, 2015 at 12:56 PM, Chris Barker <chris.barker at noaa.gov> wrote:
> NOTE: I don't work for any of the companies involved -- just a somewhat
> frustrated user... And someone that has been trying for years to make things
> easier for OS-X users.
>
>>> I’m not sure what (3) means exactly. What is a “normal” Python? Do you
>>> modify Python in a way that breaks the ABI but which isn’t reflected in
>>> the standard ABI tag?
>>
>>
>> It could be multiple things. The most obvious one is that, generally,
>> cross-platform python distributions will try to be "relocatable" (i.e. the
>> whole installation can be moved and still work). This means they require
>> python itself to be built in a special way. Strictly speaking, it is not an
>> ABI issue, but the result is the same: you can't use libraries from
>> anaconda or canopy on top of a normal python.
>
>
> But why not? -- at least for Anaconda, it's because those libraries likely
> have non-python dependencies, which are expected to be installed in a
> particular way. And really, this is not particular to Anaconda/Canopy at
> all. Python itself has no answer for this issue, and eggs and wheels don't
> help. Well, maybe kinda sorta they do, but in a clunky/ugly way: in order to
> build a binary wheel with non-python dependencies (libjpeg, for instance),
> you need to either:
>  - assume that libjpeg is installed in a "standard" place -- really no
> solution at all (at least outside of Linux)
>  - statically link it
>  - ship the dynamic lib with the package (sketched below)
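
For concreteness, the "ship the dynamic lib" option might look roughly like
this on OS-X with setuptools. Every name here is invented for illustration,
and in practice the vendored dylib's install name also needs rewriting so
the rpath lookup actually applies (which tools like delocate automate):

    # sketch_setup.py -- hypothetical wheel that bundles libjpeg next to the
    # extension and resolves it via a loader-relative rpath at import time.
    from setuptools import setup, Extension

    ext = Extension(
        "jpegwrap._jpegwrap",                  # invented extension module
        sources=["src/_jpegwrap.c"],
        libraries=["jpeg"],
        library_dirs=["jpegwrap/.dylibs"],     # vendored copy of libjpeg.dylib
        # look next to the loaded extension instead of in a system path
        extra_link_args=["-Wl,-rpath,@loader_path/.dylibs"],
    )

    setup(
        name="jpegwrap",
        version="0.1",
        packages=["jpegwrap"],
        package_data={"jpegwrap": [".dylibs/*.dylib"]},
        ext_modules=[ext],
    )
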
>
> For the most part, the accepted solution for OS-X has been to statically
> link, but:
>
>  - it's a pain to do. The GNU toolchain really likes to use dynamic linking,
> and building a static lib that will run on a
> maybe-older-than-the-build-system machine is pretty tricky.
>
>  - now we end up with multiple copies of the same lib in the python install.
> There are a handful of libs that are used a LOT. Maybe there is no real
> downside -- disk space and memory are cheap these days, but it sure feels
> ugly. And I have yet to feel comfortable with having multiple versions of
> the same lib linked into one python instance -- I can't say I've seen a
> problem, but it makes me nervous.
>
> On Windows, the choices are the same, except that it is so much harder to
> build many of the "standard" open source libs that package authors are more
> likely to do it for folks, and you do get the occasional "DLL hell" issue.
>
> I had a plan to make some binary wheels for OS-X that were not really python
> packages, but actually just bundled-up libs, so that other wheels could
> depend on them. OS-X does allow linking to relative paths, so this should
> have been doable, but I never got anyone else to agree this was a good idea,
> and I never found the round tuits anyway. And it doesn't really fit into the
> PyPI, pip, wheel, etc. philosophy to have dependencies that are
> platform-dependent and, even worse, build-dependent.
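
For what it's worth, a "just the libs" wheel of that sort can also be
consumed at runtime rather than at link time. Here is a hypothetical sketch
(jpeg_libs and the file name are invented), using ctypes instead of the
link-time @loader_path mechanism described above:

    # Consumer of an invented "libs only" wheel whose sole job is to ship
    # vendored dylibs inside its package directory.
    import ctypes
    import os

    import jpeg_libs  # hypothetical wheel containing only libjpeg.dylib

    _libdir = os.path.dirname(jpeg_libs.__file__)
    libjpeg = ctypes.CDLL(os.path.join(_libdir, "libjpeg.dylib"))

    # Any wheel that declares a dependency on jpeg_libs can share this one
    # copy instead of statically linking or bundling its own.
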
>
> Meanwhile, conda was chugging along and getting a lot of momentum in the
> scientific community. And the core thing here is that conda was designed
> from the ground up to support essentially anything. This means it supports
> python packages that depend on non-python packages, but also supports
> packages that have nothing to do with python (Perl, command line tools, what
> have you...)
>
> So I have been focusing on conda lately.
>
> Which brings me back to the question: should the python tools (i.e. wheel)
> be extended to support more use-cases, specifically non-python dependencies?
> Or do we just figure that that's a problem better solved by projects with a
> larger scope (e.g. rpm, deb, conda, canopy)?
>
> I'm on the fence here. I mostly care about Python, and I think we're pretty
> darn close to allowing wheel to support non-python dependencies, which
> would allow us all to "simply pip install" pretty much anything -- that
> would be cool. But maybe it's a bit of a slippery slope, and if we go there,
> we'll end up re-writing conda.
>
> BTW, while you can't generally install a conda package in/for another
> python, you can generally install a wheel in a conda python... There are a
> few issues with pip/setuptools trying to resolve dependencies while not
> knowing about conda packages, but it does mostly work.
>
> Not sure that helped the discussion -- but I've been wrestling with this for
> a while, so thought I'd get my thoughts out there.

I've always thought of wheel as solving only the Python-specific
problem: providing relocatable Python-specific packaging without
trying to solve the intractable problem of non-Python dependencies.
The strategy works best the more you are targeting "Python" as your
platform rather than a specific OS or distribution -- sometimes it
works well, other times not at all.

Obviously, if you need a specific build of PostgreSQL, wheel isn't
going to help you. With enough hacks you could make it work, but are
we ready to "pip install kde"? I don't think so. Personally, I'm happy
to let other tools solve the problem of a C-level virtualenv.

It's been suggested that you could have a whole section in your Python
package metadata that says "by the way, this is RedHat package x, or
Debian package y, or Gentoo package z", or use a separate package
equivalency mapping as a level of indirection. I don't think this
would be very good either.
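
Just to make concrete what is being dismissed: such a mapping might look
something like the invented snippet below. None of these keys are part of
any real metadata spec, and the distro package names are approximate.

    # Invented "external dependency" mapping; no packaging tool actually
    # consumes metadata like this today.
    EXTERNAL_REQUIRES = {
        "libjpeg": {
            "rpm":    "libjpeg-turbo",
            "deb":    "libjpeg62-turbo",
            "gentoo": "media-libs/libjpeg-turbo",
        },
    }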

Instead, if you are doing system-level stuff, you should just use a
system-level or user-level packaging tool that can easily re-package
Python packages, such as conda, rpm, deb, etc.

