[Distutils] moving things forward (was: wheel including files it shouldn't)
ncoghlan at gmail.com
Tue May 10 08:26:35 EDT 2016
On 10 May 2016 at 07:08, Chris Barker <chris.barker at noaa.gov> wrote:
> But I started this whole line of conversation because it seemed that there
> was desire for:
> Ability to isolate the build environment.
> Ability to better handle/manage non-python dependencies
I don't care about the first one - between disposable VMs and Linux
containers, we're already spoiled for choice when it comes to
supporting isolated build environments, and every project still
gaining net new contributors gets a natural test of this whenever
someone tries to set up their own local build environment.
I do care about the second one - Tennessee Leeuwenburg's draft PEP for
that is something he put together at the PyCon Australia sprints, and
it's the cornerstone of eventually being able to publish to PyPI and
have RPMs, Debian packages, conda packages, homebrew packages, etc,
"just happen" without any need for human intervention, even if your
package has external binary dependencies.
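For concreteness, metadata along these lines might look something like the sketch below. The field names (`external_requires` and its `build`/`run` keys) and the package identifiers are invented purely for illustration - the draft PEP had not settled on a format at the time of writing:

```json
{
    "name": "example-pkg",
    "external_requires": {
        "build": ["openssl-dev", "c-compiler"],
        "run": ["openssl"]
    }
}
```

A downstream tool could then map each abstract identifier to the platform-specific package name (e.g. openssl-devel on Fedora, libssl-dev on Debian) without human intervention.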
The fact that people would potentially be able to do "pip wheel" more
easily (since they'd be presented with a clean "you need <X>, but
don't have it" error message rather than a cryptic build failure) is a
nice bonus, but it's not the reason I personally care about the
feature - I care about making more efficient use of distro packagers'
time, by only asking them to do things a computer couldn't do for
them. The more complete we're able to make upstream dependency
metadata, the less often people will need to manually tweak the output
of pyp2rpm (and similar tools for other platforms).
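To illustrate the "clean error message" point, here's a minimal sketch of how a build front end could check declared external dependencies before attempting a build. The metadata field name (`external_requires`) and the tool names are hypothetical, assumed for the example:

```python
"""Sketch: turning missing external build dependencies into a clean,
actionable error instead of a cryptic compiler failure partway through."""
import shutil

# Hypothetical upstream metadata: external tools the build needs on PATH.
external_requires = ["gcc", "definitely-not-installed-tool"]


def check_external_requires(requires):
    """Return the subset of required external tools not found on PATH."""
    return [tool for tool in requires if shutil.which(tool) is None]


missing = check_external_requires(external_requires)
if missing:
    # A clear "you need <X>, but don't have it" message up front.
    print("You need {}, but don't have it".format(", ".join(missing)))
```

The point is that the check runs *before* any build step, so the failure mode is a named missing dependency rather than a half-finished build.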
>> (twice, actually, anchored on
>> /usr/bin/python and /usr/bin/python3), it should eventually be
>> feasible to have the upstream->conda pipeline fully automated as well.
> yeah -- there's been talk for ages of automatically building conda packages
> (on the fly, maybe) from PyPI packages. But currently on conda-forge we've
> decided to NOT try to do that -- it's turned out in practice that enough
> PyPI packages end up needing some hand-tweaking to build. So the planned
> workflow is now:
> Auto-build a conda build script for a PyPI package
> Test it
> Tweak it as required
> Add it to conda-forge.
> Then -- *maybe* write a tool that auto-updates the PyPI-based packages in a
> cron job or whatever.
> So not quite an automated conda-PyPI bridge, but not a bad start.
Yep, this is around the same level the Linux distros are generally at
- a distro-level config gets generated *once* (perhaps with a tool
like pyp2rpm), but the result of that process then needs to be
hand-edited when new upstream releases come out, even if nothing
significant has changed except the application code itself (no new
build dependencies, no new build steps, etc.).
Utilities like Fedora's rebase-helper can help automate those updates,
but I still consider the ideal arrangement to be for the upstream
metadata to be of a sufficient standard that post-generation manual
tweaking ceases to be necessary in most cases.
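As a concrete example of the kind of generated-once config being discussed, a minimal conda-build recipe (meta.yaml) for a pure-Python PyPI package might look like the following; the package name, version, and checksum are placeholders, and the layout follows conda-build's conventions at the time:

```yaml
package:
  name: example-pkg        # placeholder name
  version: "1.0.0"

source:
  url: https://pypi.io/packages/source/e/example-pkg/example-pkg-1.0.0.tar.gz
  sha256: 0000000000000000000000000000000000000000000000000000000000000000  # placeholder

build:
  script: python setup.py install

requirements:
  build:
    - python
    - setuptools
  run:
    - python
```

Almost all of this is mechanically derivable from PyPI metadata; it's the cases that need extra build or run requirements where the hand-tweaking currently comes in.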
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia