
On Feb 23, 2017, at 4:12 AM, Phil Mayers <p.mayers@imperial.ac.uk> wrote:
> On 23/02/17 09:11, Glyph Lefkowitz wrote:
>> Yes, and building these binary artifacts is often harder than some people (cough, alpine, cough) seem to think. But there are better ways to square this circle than restricting yourself to the versions of /python/ libraries that happen to be available in your distro.
> I don't disagree. I'm not defending the practice. But it is *clearly* a thing people do, and understanding the reasons is, IMO, important.
Oh, absolutely. What I'm driving at here (as I said perhaps more clearly in the other message I just sent) is that there are really good reasons to use distro infrastructure (C build toolchain complexity in large, critical dependencies) and not-so-good reasons (the distro just happens to ship prebuilt Python packages that you could easily pip install yourself, with zero build configuration), and the guidance we give those folks should be informed by that distinction.
>> Windows, the reason they're not shipping manylinux1 wheels right now has to do with the political implications of auto-shipping a second copy of openssl to Linux distros that expect to manage security upgrades centrally).
> Understandable. I'm sure anyone who was around at the time remembers the zlib fiasco.
I remember enough zlib fiascos that I am not even sure which one you're referring to :).
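(For anyone reading along who wants the distro-managed OpenSSL rather than a copy bundled inside a wheel, the usual escape hatch is to opt out of the binary wheel and build the extension from source against the system headers. A minimal sketch, assuming a Debian-ish box; package names vary by distro:

    # build prerequisites for compiling the extension locally
    apt-get install build-essential python3-dev libffi-dev libssl-dev
    # refuse the prebuilt wheel so the extension is compiled here and
    # dynamically links against the distro's libssl
    pip install --no-binary cryptography cryptography

Done that way, a central security update to the distro's libssl is picked up by the Python-level package without reinstalling it.)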
>> It might seem weird to use Python-specific tooling and per-application vendoring for Python dependencies, and yet use distro-global dynamic linking for C dependencies. But this is actually a perfectly cromulent strategy, and I think it bears a more in-depth explanation.
> It's not at all weird. It's exactly what we do, and I agree with your rationale.
Cool. Glad to hear that this is being invented in parallel in various places :).
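(In case it's useful to anyone else following the thread, here's a minimal sketch of that split; the paths and Debian-ish package names are purely illustrative:

    # C-level dependencies: distro packages, dynamically linked,
    # security-patched centrally by the OS
    apt-get install libssl-dev libxml2-dev libffi-dev python3-venv

    # Python-level dependencies: vendored per application in its own
    # virtualenv, pinned via requirements.txt
    python3 -m venv /srv/myapp/venv
    /srv/myapp/venv/bin/pip install -r /srv/myapp/requirements.txt
    # add --no-binary for any package whose C extension should link
    # against those system libraries rather than ship its own copy

Nothing about /srv/myapp is special; it's just a stand-in for wherever the application happens to live.)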