
On 23/02/17 09:11, Glyph Lefkowitz wrote:
> Yes, and building these binary artifacts is often harder than some people (cough, alpine, cough) seem to think. But there are better ways to square this circle than restricting yourself to the versions of /python/ libraries that happen to be available in your distro.
I don't disagree. I'm not defending the practice. But it is *clearly* a thing people do, and understanding the reasons is, IMO, important.
> [...] Windows, the reason they're not shipping manylinux1 wheels right now has to do with the political implications of auto-shipping a second copy of openssl to Linux distros that expect to manage security upgrades centrally.
Understandable. I'm sure anyone who was around at the time remembers the zlib fiasco.
> It might seem weird to use Python-specific tooling and per-application vendoring for Python dependencies, and yet use distro-global dynamic linking for C dependencies. But this is actually a perfectly cromulent strategy, and I think it bears a more in-depth explanation.
It's not at all weird. It's exactly what we do, and I agree with your rationale.
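For readers following along, a minimal sketch of that hybrid approach might look like the following. The paths and package names are illustrative, not from the thread; the key idea is that the distro's package manager owns the shared C library while each application gets its own isolated set of Python dependencies:

```shell
# The C dependency (e.g. OpenSSL) stays distro-managed, so security
# updates arrive centrally:  sudo apt-get install libssl-dev
#
# Python dependencies are vendored per application in a virtualenv
# (directory name is illustrative):
python3 -m venv ./myapp-venv
./myapp-venv/bin/pip --version
#
# Building from sdist rather than a manylinux wheel makes C extensions
# link against the system's shared libraries instead of a bundled copy:
#   ./myapp-venv/bin/pip install --no-binary :all: cryptography
```

The `--no-binary :all:` flag is what avoids the "second copy of openssl" problem the quoted message describes: the extension module is compiled locally against the distro's headers and dynamically links the distro's shared object.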