[Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

Chris Barker chris.barker at noaa.gov
Mon May 18 00:50:30 CEST 2015

Trying to keep this brief, because the odds of my finding time to do much
with this are slim..

> I'm not proposing that we drop it -- just that we push pip and wheel a
> bit farther to broaden the supported user-base.

> I can't stop you working on something I consider a deep rabbithole,

No -- but I do appreciate your assessment of how deep that hole is -- you
certainly have a whole lot more background with all this than I do -- I
could well be being very naive here.

> but why not just recommend the use of conda, and only publish sdists on
> PyPI? conda needs more users and contributors seeking better integration
> with the PyPA tooling, and minimising the non-productive competition.

I essentially wear two hats here:

1) I produce software built on top of the scientific Python stack, and I
want my users to have an easy experience installing and running my
code. For that, I am going the conda route. I'm not there yet, but I'm close
to being able to say:

a) install Anaconda
b) add my binstar channel to your conda environment
c) conda install my_package
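
Concretely, the three steps above come down to something like the following
(the channel and package names here are placeholders, not my actual ones):

```shell
# (a) Install Anaconda (or Miniconda) first, then:

# (b) Add the binstar channel to conda's channel list
#     ("my_channel" is a placeholder for the real channel name)
conda config --add channels my_channel

# (c) Install the package from that channel
conda install my_package
```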

The complication here is that we also have a web front end for our
computational code, and it makes heavy use of all sorts of web-oriented
packages that are not supported by Anaconda or, for the most part, the
conda community (binstar). My solution is to make conda packages of
those myself and put them in my binstar channel. The other option is to pip
install those packages, but then you get pretty tangled up in dependencies
and conda environments vs. virtual environments, etc...

2) Hat two: I am an instructor for the University of Washington Continuing
Education Program's Python Certification. In that program, we do very
little with the SciPy stack, but have an entire course on web development.
And the instructor of that class, quite rightly, teaches the standard
practice for web developers: heavy use of virtualenv and pip.

Oh, and hat (3) is as a long-time Pythonista who, among other things, has
been working for years to make Python easier to use on the Mac for
folks that don't know or care what the Unix command line is....

I guess the key thing here for me is that I don't see pushing conda to
budding web developers -- but what if web developers need a
bit of the SciPy stack? Or the other way around?

We really don't have a good solution for those folks.

> The web development folks targeting Linux will generally be in a position
> to build from source (caching the resulting wheel file, or perhaps an
> entire container image).
Again, I'm not concerned about Linux -- it's an ABI nightmare, so we really
don't want to go there, and its users are generally more "sophisticated" --
a little building is not a big deal.

> It's also worth noting that one of my key intended use cases for metadata
> extensions is to publish platform specific external dependencies in the
> upstream project metadata, which would get us one step closer to fully
> automated repackaging into policy compliant redistributor packages.

Honestly, I don't follow this! -- but I'll keep an eye out for it -- sounds
interesting.

> The existence of tight ABI coupling between components both gives the
> scientific Python stack a lot of its power, *and* makes it almost as hard
> to distribute in binary form as native GUI applications.

I think harder, actually :-)

> * No one else seems to think it's worth trying to extend the PyPA
> ecosystem a bit more to better support dynamic libs. (except _maybe_
> I know Donald is keen to see this, and a lot of ideas become more feasible
> if(/when?) PyPI gets an integrated wheel build farm. At that point, we can
> use the "centre of gravity" approach by letting the build farm implicitly
> determine the "standard" version of particular external dependencies, even
> if we can't communicate those versions effectively in the metadata.
That's more what I'm thinking, yes.

> * I still think it can be done with minimal changes, and hacked in to do
> the proof of concept

> I'm still not clear on what "it" is. I've been pointing out how hard it is
> to do this right in the general case, but I get the impression you're
> actually more interested in the narrower case of defining a "SciPy ABI"
> that encompasses selected third party binary dependencies.

I wouldn't say a "SciPy ABI" -- that, in a way, is already being done --
folks are coordinating the "official" binaries of at least the core "scipy
stack". It's a pain -- no Windows wheels for numpy, for instance (though I
think they are close).

My interest is actually in taking it beyond that -- honestly, in my case
there are only a handful of libs that I'm aware of that get common use, for
instance libfreetype and libpng in wxPython, PIL, matplotlib, etc.

If I were only SciPy focused, conda would be the way to go. That's part
of the problem I see -- there are split communities, but they DO overlap,
and I think it's a disservice to punt these issues off to individual
sub-communities to address on their own.

> * But I'm not sure it's something that's going to get to the top of my
> ToDo list anyway -- I can get my needs met with conda anyway. My real
> production work is deep in the SciPy stack.

> * So I may or may not move my ideas forward -- if I do, I'll be back with
> questions and maybe a more concrete proposal some day....

> If I'm correct that your underlying notion is "It would be nice if there
> was an agreed SciPy ABI we could all build wheels against, and a way of
> publishing the external dependency manifest in the wheel metadata so it
> could be checked for consistency at installation time", that's a far more
> tractable problem than the "arbitrary binary dependencies" one.
What I'm talking about is in-between -- not just SciPy, but not "arbitrary
binary dependencies" either. But I think the trick is that the dependency
really is binary, i.e.:

wheelA depends on this particular wheel -- not this version of a lib, but
this BUILD of a lib.

And what I envision is that "this build of a lib" would be, for instance,
libnnn x.y "built to be compatible with the python.org 64-bit Windows
build of Python 3.4".

Which is what we need to do now anyway -- if you are going to deliver a
build of PIL (Pillow), for instance, you need to deliver the libs it needs,
built to match the Python it's built for, whether you statically link or
dump the DLL in with the extension.

All I'm suggesting is that we have a way of letting others use that same
lib build -- this would be more a social thing than anything else.
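
As a purely hypothetical sketch (no such field exists in any current
metadata PEP -- the field name and build tag here are made up for
illustration), a wheel might declare that kind of binary-level dependency
along these lines:

```
Metadata-Version: 2.0
Name: Pillow
Requires-Dist: some-pure-python-dep

# Hypothetical field: pins an exact BUILD of an external lib,
# not just a version number.
Binary-Lib-Requires: libpng (1.6.16; build cp34-win_amd64-1)
```

The point being that two wheels declaring the same build tag could safely
share one copy of the library, rather than each bundling its own.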

> while I expect attempting to solve the former is likely to be more of a
> political battle than a technical one :)

Yes -- I think this is a social / political issue as much as anything else.



Christopher Barker, Ph.D.

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov