The point of this request is that Python's packaging infrastructure is
looking at what compression we use for wheels: the current DEFLATE
compression is suboptimal for huge binaries like tensorflow. Packaging
is in a unique situation because it *cannot* use external libraries;
the tooling has to work before any third-party package is installed.
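
For a concrete sense of the gap: wheels are plain zip archives, and the
stdlib zipfile module already supports ZIP_LZMA alongside the default
ZIP_DEFLATED. Here is a rough stdlib-only sketch of the size comparison;
the file names are made up, and whether installers would actually accept
a non-DEFLATE wheel is a separate question:

    # Stdlib-only sketch: repack a wheel's entries with ZIP_LZMA instead of
    # the default ZIP_DEFLATED, just to compare sizes. The file names are
    # hypothetical; real tools may only accept DEFLATE inside a wheel.
    import os
    import zipfile

    SRC = "tensorflow-example.whl"        # hypothetical input wheel
    DST = "tensorflow-example.lzma.zip"

    with zipfile.ZipFile(SRC) as src, \
         zipfile.ZipFile(DST, "w", compression=zipfile.ZIP_LZMA) as dst:
        for name in src.namelist():
            dst.writestr(name, src.read(name))

    print(os.path.getsize(SRC), "->", os.path.getsize(DST))
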
It's hard to see where packaging would gain anything from brotli or zstd over lzma. XZ is more widely used, and package size seems to matter more than compression or decompression speed. There are definitely intermediate compression levels where both brotli and zstd are significantly faster, but not at the higher levels, where lzma compresses as well or better.
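
To put rough numbers behind "faster at intermediate levels, but not at the
higher ones", here is a measurement sketch. Only lzma is in the stdlib; the
brotli and zstandard calls are assumptions about those third-party bindings,
which is exactly the dependency packaging can't take:

    # Rough size/speed comparison sketch on one large input file (path is
    # hypothetical). lzma is stdlib; brotli and zstandard are third-party
    # and only measured if they happen to be installed.
    import lzma
    import time

    def measure(label, compress, data):
        start = time.perf_counter()
        size = len(compress(data))
        print(f"{label:>12}: {size:>12,} bytes, {time.perf_counter() - start:6.2f}s")

    with open("big_binary.so", "rb") as f:
        data = f.read()

    for preset in (1, 6, 9):
        measure(f"lzma -{preset}", lambda d, p=preset: lzma.compress(d, preset=p), data)

    try:
        import brotli                       # third-party binding (assumed API)
        measure("brotli q11", lambda d: brotli.compress(d, quality=11), data)
    except ImportError:
        pass

    try:
        import zstandard                    # third-party binding (assumed API)
        measure("zstd -19", zstandard.ZstdCompressor(level=19).compress, data)
    except ImportError:
        pass
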
Is there a concrete need here, or just an abstract point that the compression used for packages shouldn't depend on anything outside the stdlib?
Honestly, if you really want compression ratio over everything else, PPM is going to beat the LZ-based approaches. But it's ungodly slow and uses tons of memory.
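
There's no PPM in the stdlib at all, so the closest way to actually try it
is an external tool. A sketch assuming the 7z CLI is installed, with a
made-up input path:

    # Sketch: compare 7-Zip's PPMd codec against stdlib lzma on one file.
    # Assumes a `7z` binary on PATH; PPMd has no stdlib equivalent.
    import lzma
    import os
    import subprocess

    SRC = "big_binary.so"                   # hypothetical input

    with open(SRC, "rb") as f:
        data = f.read()
    lzma_size = len(lzma.compress(data, preset=9 | lzma.PRESET_EXTREME))

    subprocess.run(["7z", "a", "-m0=PPMd", "-mx=9", "ppmd_test.7z", SRC], check=True)
    ppmd_size = os.path.getsize("ppmd_test.7z")

    print(f"lzma -9e : {lzma_size:,} bytes")
    print(f"PPMd (7z): {ppmd_size:,} bytes")
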
-- The dead increasingly dominate and strangle both the living and the
not-yet born. Vampiric capital and undead corporate persons abuse
the lives and control the thoughts of homo faber. Ideas, once born,
become abortifacients against new conceptions.