[Distutils] build system abstraction PEP

Robert Collins robertc at robertcollins.net
Wed Oct 28 02:34:11 EDT 2015

On 28 October 2015 at 17:50, Marcus Smith <qwcode at gmail.com> wrote:
>> Current draft text in rendered form at:
>> https://gist.github.com/rbtcollins/666c12aec869237f7cf7
> Thanks for working on this.
> Overall I like the idea, but have some comments/questions
> 1)  *Please*, *please*, *please* let's start doing PEP conversations as PRs
> to pypa/interoperability-peps :  )  There's a place in there for unnumbered
> PEPs.  Review will be better and faster IMO as PRs.  If the PR gets too
> cluttered with old conversations, just start clean with a new PR.   We can
> announce the PRs here, but don't discuss them on the mailing list.   The
> same goes for trimming down PEP426 as you mention.   Let's do that as a PR
> to pypa/interoperability-peps to the existing document.


> 2) Ditto on Ralf's concerns about readability.  I only really understood it
> after seeing the examples you gave to Ralf (they need to be in the PEP, not
> just in response to Ralf).   There's a few places I'd want to alter the
> wording, but I likely won't bother, unless it's done as a PR.

I'll work on an iteration of that; the PR is just what I had in the gist.

> 3) You refer to "source distribution".  What does that mean exactly in this
> PEP?  just the current setuptools notion?

Yes. This PEP doesn't change the definition of source distribution at all.

> 4) Although using a process interface is not necessarily a problem, I don't
> agree with your point on why a python interface would be unworkable.  You're
> assuming that pip would try to import all the build tools (for every
> dependency it's processing) in the master process.  An alternative could be
> that pip would have its own tool (that runs as a subprocess in an isolated
> env) that knows how to load and work with python build interfaces.   You
> could argue that a python api is  an advantage, because build tools aren't

That would mean pip would have to embed the exact same version of that tool
in the environment the build tools run in, which it doesn't have today.
I worry that would be hard to manage, and hard for folks to debug.

> forced to grow a certain cli for pip, they just have to add a shim module
> that conforms to the interface.   But my point is not to argue for the
> Python API, but for you to remove an argument that from what I can tell,
> isn't really an argument for one or the other.

I'll review the text I have there, but the argument does seem valid to me.
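To make the distinction concrete, here's a minimal sketch of what the
process interface looks like from pip's side (the command names passed in
are purely illustrative, not from the PEP text): pip's only coupling to the
build tool is the command line they agree on, so no pip code needs to be
importable, at any particular version, inside the build environment.

```python
import subprocess

# Sketch: with a process interface, pip shells out to the build tool
# inside the build environment. The build tool ships whatever version
# it likes; pip never imports it.
def dump_build_requires(build_command, source_dir):
    # build_command might be e.g. ["flit"] or ["python", "setup.py"];
    # these names are illustrative only.
    return subprocess.check_output(
        build_command + ["--dump-build-requires"], cwd=source_dir)
```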

> 5) at one pt you said "--dump-build-requires would just output a constant
> string: {"schema_version": "2.0", "build_requires": []}"   you meant
> "metadata_version", right?

Yes, sorry.
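For illustration, a sketch of the corrected constant output for a build
tool with no build-time dependencies (the exact schema is still a draft):

```python
import json

# Hypothetical constant a build tool would emit for
# --dump-build-requires. Note the field is "metadata_version" (per the
# correction above), not "schema_version".
BUILD_REQUIRES_OUTPUT = json.dumps(
    {"metadata_version": "2.0", "build_requires": []})

print(BUILD_REQUIRES_OUTPUT)
```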

> 6)  Can you explain better why we need to manage a pypa.yaml schema version
> *and* a build description schema version.   Why not just a version for
> pypa.yaml, and if anything changes (in the yaml schema or the build
> "description" api),  then just bump the version.

I see no problem with evolving them in lockstep, but because the documents
are separate, we either have to infer the version of one, or be explicit
in each. I'd rather avoid the chance of a bug where something tries to
parse a v2 schema build description with a v1 schema parser; making the
documents self-identifying avoids that.
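A minimal sketch of why self-identifying documents help (the field name,
version strings, and registry mechanism here are assumptions for
illustration, not the PEP's actual design): a loader that dispatches on
the document's own declared version fails loudly on an unknown schema
instead of silently misparsing it.

```python
import json

# Hypothetical registry of per-version parsers for build descriptions.
PARSERS = {}

def parser(version):
    def register(fn):
        PARSERS[version] = fn
        return fn
    return register

@parser("1.0")
def parse_v1(doc):
    # Only understands the (assumed) v1 fields.
    return {"build_requires": doc.get("build_requires", [])}

def load_build_description(text):
    doc = json.loads(text)
    version = doc["schema_version"]
    if version not in PARSERS:
        # A v2 document reaching v1-only code errors here, loudly.
        raise ValueError("unsupported build description schema: %s" % version)
    return PARSERS[version](doc)
```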

> 7) it's unclear when pip gets to run "dist-info" and when the result might
> be different.   For example, we've discussed that run time dependencies may
> get clarified *after* the build process.... so this command might produce
> different results at different times?

Pip would run dist-info when determining the install-requires and
extras for the package; that's required to generate the actual
final dependencies. I expect it would only be run once. I'll clarify
that expected behaviour in the PEP.
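A rough sketch of the "run once" expectation (the subprocess plumbing,
command tuple, and cache are purely illustrative, not pip's actual code):
pip would invoke the build tool's dist-info command in the build
environment once per source tree and reuse the parsed metadata for all
later resolution steps.

```python
import json
import subprocess

# Illustrative cache keyed by source directory; not pip's real design.
_metadata_cache = {}

def get_dist_info(source_dir, command=("pypa-build-tool", "dist-info")):
    # "pypa-build-tool" is a made-up executable name for illustration.
    if source_dir not in _metadata_cache:
        out = subprocess.check_output(list(command), cwd=source_dir)
        _metadata_cache[source_dir] = json.loads(out)
    return _metadata_cache[source_dir]
```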


Robert Collins <rbtcollins at hp.com>
Distinguished Technologist
HP Converged Cloud
