[Pythonmac-SIG] Advice wanted on dependency building...

Ronald Oussoren ronaldoussoren at mac.com
Thu May 23 08:53:14 CEST 2013


On 23 May, 2013, at 0:46, Chris Barker - NOAA Federal <chris.barker at noaa.gov> wrote:

> Thanks Ronald,
> 
> On Wed, May 22, 2013 at 2:53 PM, Ronald Oussoren <ronaldoussoren at mac.com> wrote:
> 
>> To move back onto topic, not relying on unix-level libraries in OSX is a good thing, as it makes it easier to support multiple OSX versions with a single set of binaries.
> 
> hmm -- I figured if it was a system lib, it should work on whatever
> system it's running on. For example, I'm working right now on the
> netcdf4 lib -- it requires hdf5, which requires zlib. I'm using the
> system zlib -- is that a bad idea? Should I build it too, to make sure
> it matches the rest of it?
> 
> (I do want the binaries to run anywhere the binary Python I'm using runs)

It depends on the library. Zlib should be fine: that library doesn't change very often, and mostly in backward-compatible ways. An example of a problematic library is OpenSSL, which doesn't have a stable ABI; as a result there are now two copies of libcrypto.dylib on OSX 10.8, and the one you'll link to by default is not available on all older versions of OSX (I'm not sure how recently that changed, but that's not important).
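
For what it's worth, a quick way to check which dylibs an extension actually links against is 'otool -L'. A minimal sketch (the extension file name below is made up):

    import subprocess

    def linked_dylibs(path):
        # 'otool -L' prints the file name on the first line, then one
        # load command per line: the dylib path followed by version info.
        out = subprocess.check_output(['otool', '-L', path]).decode('utf-8')
        return [line.split()[0] for line in out.splitlines()[1:] if line.strip()]

    # Hypothetical extension module built against hdf5 and zlib:
    for lib in linked_dylibs('netCDF4.so'):
        print(lib)

Anything that resolves to /usr/lib ships with OSX; everything else would have to be bundled or built.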

> 
> 
>> Except for a number of more complicated libraries (such as PIL/Pillow) when using universal binaries (when using 'pip install', homebrew/macports/... have their own mechanisms for building).
> 
> right -- Universal libs are not well supported by those systems -- but
> that's the power users' problem!

I agree w.r.t. homebrew and macports, but it would be nice if 'pip install' worked with your system after minimal changes to the pip configuration (e.g. "just add ... to your piprc and then 'pip install foo' will install a binary from the repo instead of building the binaries itself").
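
To make that concrete, a hypothetical pip.conf along these lines might be all that's needed (the repo URL is invented, and option names may shift while binary support is still settling):

    # ~/.pip/pip.conf (hypothetical)
    [install]
    # Look for prebuilt binaries here before building from source:
    find-links = http://example.com/osx-binaries/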

> 
>>> 2) folks that want to use a Mac like a Mac, and people that develop
>>> for those folks --  these people need binary installers, and may want
>>> to be able to use and deploy either packages or applications (Py2app)
>>> that will run on systems older than the one developed on, or want
>>> universal builds, or ???
>>> - These are the folks I'd like to support, but I'm still unsure as
>>> to how best to do that.
>> 
>> It would be nice to have a set of binary "packages", based on a reproducible build system.
> 
> Exactly what I'd like to build!
> 
>>> Way back when Bob Ippolito maintained a repository of binary packages
>>> for the mac -- it was a great resource, but he's long since moved on
>>> to other things.
>> 
>> The binary packages that Bob maintained had IMHO two major problems:
>> 
>> 1) The largest problem is that the packages were AFAIK created ad-hoc (Bob or some other contributor did the magic incantations to build library dependencies)
> 
> Yeah, and he never gave anyone else permission to push to it...

I wouldn't have done that either, until that someone else had a proven track record (both in providing usable binaries and in being known in the community).

> 
>> 2) The packages were Installer.app packages. The current s
> 
>> The header is easily updated using macholib, but that makes installation
>> harder and isn't supported by the standard packaging tools (easy_install
>> and pip)
> 
> But if we put the shared libs in a more central location, then all your
> virtualenvs could use the same ones, yes?

Yes. It would make it harder to switch library versions, but not by much.
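
For reference, a minimal sketch of the macholib rewrite mentioned above (the extension name and both paths are invented):

    from macholib.MachO import MachO

    OLD = '/usr/local/lib/libpng15.15.dylib'   # path baked in at link time
    NEW = '/Library/Frameworks/Python.framework/Frameworks/libpng15.15.dylib'

    def change(path):
        # Return a replacement path, or None to leave the command alone.
        return NEW if path == OLD else None

    m = MachO('_imaging.so')
    if m.rewriteLoadCommands(change):
        # Write the updated headers back in place.
        with open('_imaging.so', 'rb+') as f:
            for header in m.headers:
                f.seek(0)
                header.write(f)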

> 
>> 2) The primary use case for dynamic linking is to share dylibs between extensions, and when those extensions are in different PyPI packages the packaging story gets more complicated. The easiest workaround is to ignore sharing dylibs and still bundle multiple copies of libpng if two different PyPI packages both link with libpng.
> 
> when you say bundle, do you mean static link? Or just package up the
> dylib with the bundle, which is what I was thinking -- each package
> installs the libs it needs, which may or may not already have been
> installed by another package -- but so what?

Uninstall can be a problem with that: you'd have to refcount installed files to ensure that libraries are only removed when the last user is uninstalled. I don't know if the installation format used by pip supports having two packages that install the same file.

This can be worked around with fake PyPI packages that only install the shared libraries, and having the real packages depend on them (that is, a "macbins-libpng" package containing libpng.dylib, with the Imaging package depending on that).
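
A setup.py for such a shim could be as small as this (all names and the version are invented for illustration):

    from setuptools import setup

    # A stub package whose only job is to ship the shared library;
    # real packages would declare install_requires=['macbins-libpng'].
    setup(
        name='macbins-libpng',
        version='1.5.14',
        packages=['macbins_libpng'],
        package_data={'macbins_libpng': ['libpng15.15.dylib']},
        zip_safe=False,
    )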

> And I expect the number of folks building packages will be fairly
> small, so one builder would only have to build one set of dylibs.
> 
>>> But if dynamic, where do you put them? We'll still want to ship them
>> A new framework isn't necessary. There are three locations that could easily be used:
>> 
>> 1) A directory in Python.framework, for example /Library/Frameworks/Python.framework/Frameworks
> 
> That makes sense to me.
> 
>> 2) A directory in /Library/Python, for example /Library/Python/Externals
> 
> that feels a bit like Apple's turf, but what do I know?

/Library can be used; we'd just have to pick a name that Apple is unlikely to use.
> 
>> 3) As 2), but in the users home directory (~/Library/Python/Externals)
>> The latter is the only one where you can install without admin privileges.
> 
> But we put the python binaries in /Library/Frameworks -- it seems we
> should do the same with libs...

I'm probably atypical, but my main account doesn't have admin privileges. It would suck if I had to use sudo to install.

The @loader_path option you mentioned in a followup e-mail could help there. That way the shared libraries can be installed in a fixed location relative to sys.prefix, while still supporting virtualenvs. You wouldn't be able to share shared libraries between python versions or virtualenvs, but that's not really a problem (disk space is cheap).
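
A sketch of how such a reference could be computed, assuming an invented layout with the dylibs in <sys.prefix>/lib:

    import os
    import sys

    # Hypothetical locations of an extension and a shared library:
    ext = os.path.join(sys.prefix, 'lib/python2.7/site-packages/PIL/_imaging.so')
    dylib = os.path.join(sys.prefix, 'lib/libpng15.15.dylib')

    # The load command is written relative to the extension itself, so
    # the whole prefix (including a virtualenv) can move without breaking:
    rel = os.path.relpath(dylib, os.path.dirname(ext))
    print('@loader_path/' + rel)   # -> @loader_path/../../../libpng15.15.dylib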

> 
> 
>> The folks over on distutils-sig are working towards support for wheels (PEP 427, <http://www.python.org/dev/peps/pep-0427/>) at least in pip and distribute/setuptools and possibly in the stdlib as well (for 3.4). It would be nice if the OSX package collection would be in wheel format, that would make it relatively easy to install the packages using the de facto standard tools.
> 
> Any idea what the time scale is on this?

Before Python 3.4 is out, which means sometime this summer. The code is mostly there (see <http://wheel.readthedocs.org/en/latest/>), and there's also distlib (<https://distlib.readthedocs.org/en/latest/>) with distil (<https://pythonhosted.org/distil/overview.html>) as an example pip-like tool. IIRC distlib was intended to go into the stdlib for 3.4, after it became clear that the packaging/distutils2 effort wouldn't be ready anytime soon. I don't know if distlib is still targeting the stdlib, but the pip folks are looking into using it (which was an explicit goal: distlib contains the shared functionality that any Python packaging tool needs, backed by a specification (PEP)).

> 
>> What I haven't looked into yet is how easy it would be to configure pip to look for packages on PyPI and then look for binaries (wheels) in some other location.
> 
> Have the pip folks made any commitment at all to supporting binary
> installs? That's a big missing feature.

Yes, through wheels. The development branch in pip's repo (<https://github.com/pypa/pip/tree/develop/pip>) contains support for wheels (both creating and installing), although AFAIK installing wheels requires a command-line argument at the moment because wheel support is still experimental.
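
In practice that would look something like the following (exact flag names may well change while this is experimental):

    # Build wheels into a local directory, then install from it:
    pip wheel --wheel-dir=/tmp/wheels netCDF4
    pip install --use-wheel --no-index --find-links=/tmp/wheels netCDF4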

> 
>>> Note that I've used the term "we" here ;-)  I'm hoping that others
>>> will join me in following a convention and getting stuff out there,
>>> but even if not, I'd love feedback on how best to do it.
>> 
>> Good luck :-).  This is fairly boring low-level packaging work, and that tends to put people off. At least it is a lot easier than trying to fix or replace distutils, which has burned out at least 3 generations of developers ;-/
> 
> Well, I'm an optimist -- and recently at least you and Ned and Russel
> Owen have been known to contribute.

I'll provide mental support at the least, and hope to do more than that, but I don't know if I'll have the time.

> 
>>> By the way, the other goal is to build scripts that do the builds the
>>> way we need for various libs, packages, etc, so that it's easy to do
>>> it all when new builds are required...
>>> (maybe use gattai? -- http://sourceforge.net/projects/gattai/)
>> 
>> I haven't used gattai before, but at least it is Python code. Building tools for this from scratch would also not be very hard; I've already done so a number of times (such as the build-installer.py script in the CPython repository).
> 
> yup -- me too, though I find myself wanting to add various make-like
> features, and it dawns on me that I am re-inventing the wheel! So I
> want to give Gattai a shot. Also Kevin is likely to be helpful.
> 
>> The hard part should be setting up the build infrastructure and build scripts; once a couple of relatively hard packages like PIL or wx have been processed, adding more packages and new versions of packages should be easy.
> 
> Exactly. But I don't know about wx -- that is a bear, and Robin's been
> doing a fine job!

If wx is hard to package, it would be a good stress test of the tools, even if you'd end up not distributing the binaries :-)

Ronald

