[Distutils] Handling the binary dependency management problem
Chris Barker - NOAA Federal
chris.barker at noaa.gov
Wed Dec 4 17:05:26 CET 2013
Great to have you on this thread!
Note: supporting "variants" in one way or another is a great idea, but for
right now, maybe we can get pretty far without it.
There are options for "serious" scipy users that need optimum performance,
and newbies that want the full stack.
So our primary audience for "default" installs and pypi wheels are folks
that need the core packages (maybe a web dev that wants some MPL plots)
and need things to "just work" more than they need anything optimized.
So a lowest-common-denominator wheel would be very, very useful.
As for what that would be: the superpack is great, but it's been around a
while (a long while in computer years).
How many non-sse machines are there still out there? How many non-sse2? And
how big is the performance boost anyway?
What I'm getting at is that we may well be able to build a reasonable win32
binary wheel that we can put up on pypi right now, with currently available
tools. Then MPL and pandas and IPython...
Scipy is trickier -- what with the Fortran and all -- but I think we could
do it. And what's the hold-up with win64? Is that fortran and scipy? If so,
then why not do win64 for the rest of the stack?
(I, for one, have been a heavy numpy user since the Numeric days, and I
still hardly use scipy)
By the way, we can/should do OS-X too -- it seems easier, in fact (fewer
hardware options to support, and the Mac's universal binaries).
Note on OS-X: how long has it been since Apple shipped a 32-bit machine?
Can we dump default 32-bit support? I'm pretty sure we don't need to do PPC
anymore.
On Dec 3, 2013, at 11:40 PM, Ralf Gommers <ralf.gommers at gmail.com> wrote:
On Wed, Dec 4, 2013 at 1:54 AM, Donald Stufft <donald at stufft.io> wrote:
> On Dec 3, 2013, at 7:36 PM, Oscar Benjamin <oscar.j.benjamin at gmail.com>
> > On 3 December 2013 21:13, Donald Stufft <donald at stufft.io> wrote:
> >> I think Wheels are the way forward for Python dependencies. Perhaps not
> >> for things like fortran. I hope that the scientific community can start
> >> publishing wheels, at least in addition, too.
> > The Fortran issue is not that complicated. Very few packages are
> > affected by it. It can easily be fixed with some kind of compatibility
> > tag that can be used by the small number of affected packages.
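To make Oscar's compatibility-tag idea concrete, here is a rough sketch (the
tag format, ABI names, and function names are purely illustrative assumptions,
not an actual PEP 425 feature) of how a Fortran-ABI marker appended to the
platform tag could let an installer filter out incompatible wheels:

```python
# Illustrative sketch only: encode the Fortran ABI as an extra suffix on
# the wheel's platform tag, so an installer could skip wheels built
# against a different Fortran runtime. ABI names here are hypothetical.

FORTRAN_ABIS = {"gfortran", "g77"}  # hypothetical ABI identifiers


def make_platform_tag(base, fortran_abi=None):
    """Append a hypothetical Fortran-ABI marker to a platform tag."""
    if fortran_abi is None:
        return base
    if fortran_abi not in FORTRAN_ABIS:
        raise ValueError("unknown Fortran ABI: %r" % (fortran_abi,))
    return "%s_%s" % (base, fortran_abi)


def is_compatible(wheel_tag, system_abi):
    """A wheel with no ABI marker matches anything; tagged wheels must agree."""
    for abi in FORTRAN_ABIS:
        if wheel_tag.endswith("_" + abi):
            return abi == system_abi
    return True


print(make_platform_tag("win32", "gfortran"))   # -> win32_gfortran
print(is_compatible("win32_g77", "gfortran"))   # -> False
print(is_compatible("win32", "g77"))            # untagged wheel -> True
```

Only the small number of Fortran-dependent packages (scipy, mostly) would need
to emit the extra marker; everything else keeps plain tags and matches as today.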
> >> I don't believe that Conda will gain the mindshare that pip has outside
> >> the scientific community so I hope we don't end up with two systems that
> >> can't interoperate.
> > Maybe conda won't gain mindshare outside the scientific community but
> > wheel really needs to gain mindshare *within* the scientific
> > community. The root of all this is numpy. It is the biggest dependency
> > on PyPI, is hard to build well, and has the Fortran ABI issue. It is
> > used by very many people who wouldn't consider themselves part of the
> > "scientific community". For example matplotlib depends on it. The PyPy
> > devs have decided that it's so crucial to the success of PyPy that
> > numpy's basically being rewritten in their stdlib (along with the C
> > API).
> > A few times I've seen Paul Moore refer to numpy as the "litmus test"
> > for wheels. I actually think that it's more important than that. If
> > wheels are going to fly then there *needs* to be wheels for numpy. As
> > long as there isn't a wheel for numpy then there will be lots of
> > people looking for a non-pip/PyPI solution to their needs.
> > One way of getting the scientific community more on board here would
> > be to offer them some tangible advantages. So rather than saying "oh
> > well scientific use is a special case so they should just use conda or
> > something", the message should be "the wheel system provides solutions
> > to many long-standing problems and is even better than conda in (at
> > least) some ways because it cleanly solves the Fortran ABI issue for
> > example".
> > Oscar
> I’d love to get Wheels to the point where they are more suitable than they
> currently are for SciPy stuff,
That would indeed be a good step forward. I'm interested to try to help get
to that point for Numpy and Scipy.
> I’m not sure what the diff between the current state and what
> they need to be is, but if someone spells it out (I’ve only just skimmed
> your last email, so perhaps it’s contained in that!) I’ll do the arguing
> for it. I just need someone who actually knows what’s needed to advise me :)
To start with, the SSE stuff. Numpy and scipy are distributed as
"superpack" installers for Windows containing three full builds: no SSE,
SSE2 and SSE3. Plus a script that runs at install time to check which
version to use. These are built with ``paver bdist_superpack``, see
https://github.com/numpy/numpy/blob/master/pavement.py#L224. The NSIS and
CPU selector scripts are under tools/win32build/.
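For reference, the install-time choice the superpack makes boils down to
something like the sketch below. This is a simplified illustration, not the
real selector (which is an NSIS script plus a compiled CPU-detection helper
under tools/win32build/); the flag and build names are assumptions for the
example:

```python
# Simplified sketch of the superpack's install-time logic: pick the most
# optimized of the three bundled builds (no-SSE, SSE2, SSE3) that the
# current CPU supports. Flag names ("sse2", "sse3") are illustrative.

BUILD_PREFERENCE = ["sse3", "sse2", "nosse"]  # most optimized first


def pick_build(cpu_flags):
    """Return the name of the best bundled build for the given CPU flags.

    cpu_flags is a set of lowercase feature strings, as a CPU-detection
    helper might report them.
    """
    for build in BUILD_PREFERENCE:
        if build == "nosse" or build in cpu_flags:
            return build
    return "nosse"  # unreachable fallback, kept for clarity


print(pick_build({"sse", "sse2", "sse3"}))  # -> sse3
print(pick_build({"sse", "sse2"}))          # -> sse2
print(pick_build(set()))                    # -> nosse
```

The open question below is exactly this: pip has no hook for running such a
selector at install time, so the three builds can't simply be folded into one
wheel today.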
How do I package those three builds into wheels and get the right one
installed by ``pip install numpy``?
If this is too difficult at the moment, an easier goal (but a much less
important one) would be to get the result of ``paver bdist_wininst_simple``
as a wheel.
For now I think it's OK that the wheels would just target 32-bit Windows
and python.org compatible Pythons (given that that's all we currently
distribute). Once that works we can look at OS X and 64-bit Windows.
Distutils-SIG maillist - Distutils-SIG at python.org