On Thu, Aug 13, 2015 at 10:52 AM, David Cournapeau <cournape@gmail.com> wrote:
On Thu, Aug 13, 2015 at 2:05 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Aug 12, 2015 13:57, "Nate Coraor" <nate@bx.psu.edu> wrote:
Hello all,
I've implemented the wheel side of Nick's suggestion from very early in this thread to support a vendor-providable binary-compatibility.cfg.
https://bitbucket.org/pypa/wheel/pull-request/54/
If this is acceptable, I'll add support for it to the pip side. What else should be implemented at this stage to get the PR accepted?
From my reading of what the Enthought and Continuum folks were saying about how they successfully distribute binaries across different distributions, it sounds like the additional piece that would take this from an interesting experiment to basically-immediately-usable would be to teach pip that, if no binary-compatibility.cfg is provided, it should assume by default that the compatible systems whose wheels should be installed are: (1) the current system's exact tag, and (2) the special hard-coded tag "centos5". (That's what everyone actually uses in practice, right?)
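(For concreteness, a vendor-provided binary-compatibility.cfg under this scheme might map the current platform tag to the list of tags considered installable. The format below is purely illustrative -- the actual format is whatever the PR linked above defines:

```
{
    "linux_x86_64": {
        "install": ["linux_x86_64", "linux_x86_64_centos5"]
    }
}
```

with pip consulting this mapping before deciding which wheel tags are acceptable.)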
To make this *really* slick, it would be cool if, say, David C. could make a formal list of exactly which system libraries are important to depend on (xlib, etc.), and we could hard-code two compatibility profiles "centos5-minimal" (= just glibc and the C++ runtime) and "centos5" (= that plus the core too-hard-to-ship libraries), and possibly teach pip how to check whether that hard-coded core set is available.
So this is a basic list I got with a few minutes of scripting: installing our 200 most used packages on CentOS 5, ldd'ing all of the .so files, and filtering out a few things/bugs from some of our own packages:
/usr/lib64/libatk-1.0.so.0
/usr/lib64/libcairo.so.2
/usr/lib64/libdrm.so.2
/usr/lib64/libfontconfig.so.1
/usr/lib64/libGL.so.1
/usr/lib64/libGLU.so.1
/usr/lib64/libstdc++.so.6
/usr/lib64/libX11.so.6
/usr/lib64/libXau.so.6
/usr/lib64/libXcursor.so.1
/usr/lib64/libXdmcp.so.6
/usr/lib64/libXext.so.6
/usr/lib64/libXfixes.so.3
/usr/lib64/libXft.so.2
/usr/lib64/libXinerama.so.1
/usr/lib64/libXi.so.6
/usr/lib64/libXrandr.so.2
/usr/lib64/libXrender.so.1
/usr/lib64/libXt.so.6
/usr/lib64/libXv.so.1
/usr/lib64/libXxf86vm.so.1
/usr/lib64/libz.so.1
This list should only be taken as a first idea, I can work on a more precise list including the versions if that's deemed useful.
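(The survey described above can be sketched in a few lines of Python -- this is an illustrative reconstruction, not the script actually used; it assumes `ldd` is on the PATH and that you point it at a site-packages tree:

```python
"""Collect external shared-library dependencies of installed .so files:
walk a directory tree, run ldd on every extension module, and record
the system libraries they resolve to."""
import os
import subprocess
import sys


def parse_ldd(output):
    """Extract resolved library paths from ldd output.

    ldd lines look like: "libX11.so.6 => /usr/lib64/libX11.so.6 (0x...)".
    Unresolved entries ("=> not found") and the vDSO are skipped because
    their targets do not start with "/".
    """
    deps = set()
    for line in output.splitlines():
        if "=>" in line:
            target = line.split("=>", 1)[1].strip().split(" ")[0]
            if target.startswith("/"):
                deps.add(target)
    return deps


def system_deps(root):
    """Walk *root*, ldd every .so, and collect the union of external deps."""
    deps = set()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".so"):
                path = os.path.join(dirpath, name)
                try:
                    out = subprocess.check_output(
                        ["ldd", path], stderr=subprocess.DEVNULL
                    ).decode()
                except (subprocess.CalledProcessError, OSError):
                    continue  # e.g. not a dynamic executable
                deps |= parse_ldd(out)
    return sorted(deps)


if __name__ == "__main__":
    for dep in system_deps(sys.argv[1] if len(sys.argv) > 1 else "."):
        print(dep)
```

Filtering the result down to /usr/lib64 entries, and excluding your own packages' libraries, would give a list like the one above.)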
Cool. Here's a list of the external .so's assumed by the packages currently included in a default Anaconda install: https://gist.github.com/njsmith/6c3d3f2dbaaf526a8585

The lists look fairly similar overall -- glibc, libstdc++, Xlib. They additionally assume the availability of expat, glib, ncurses, pcre, and maybe some other stuff I missed, but they ship their own versions of libz and fontconfig, and they don't seem to either ship or use cairo or atk in their default install.

For defining a "standard platform", just taking the union seems reasonable -- if either project has gotten away this long with assuming some library is there, then it's probably there. Writing a little script that takes a wheel and checks whether it has any external dependencies outside of these lists, or takes a system and checks whether all these libraries are available, seems like it would be pretty trivial.
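(The "check whether all these libraries are available" half really is trivial -- a sketch, assuming a hard-coded profile like the hypothetical abbreviated one below, is just asking the dynamic loader to resolve each soname:

```python
"""Check which libraries from a compatibility profile the dynamic
loader can actually resolve on this system."""
import ctypes

# Illustrative, abbreviated stand-in for a "centos5" profile; the real
# list would be the union of the surveys discussed in this thread.
CENTOS5_PROFILE = [
    "libstdc++.so.6",
    "libX11.so.6",
    "libXext.so.6",
    "libz.so.1",
]


def missing_libraries(profile):
    """Return the profile entries the dynamic loader cannot resolve."""
    missing = []
    for soname in profile:
        try:
            ctypes.CDLL(soname)  # raises OSError if unresolvable
        except OSError:
            missing.append(soname)
    return missing


if __name__ == "__main__":
    gaps = missing_libraries(CENTOS5_PROFILE)
    print("missing:", gaps if gaps else "none")
```

The wheel-auditing half would combine this idea with the ldd walk above: flag any resolved dependency that isn't in the profile.)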
One significant issue is SSL: in theory, we (as a downstream distributor) really want to avoid distributing such a key piece of infrastructure, but in practice there are so many incompatible versions across distributions that not shipping it is not an option.
This is mostly an issue for distributing Python itself, right? ...I hope? -n -- Nathaniel J. Smith -- http://vorpus.org