[Distutils] PEP 439 and pip bootstrap updated

Vinay Sajip vinay_sajip at yahoo.co.uk
Sat Jul 13 14:12:01 CEST 2013

> From: Donald Stufft <donald at stufft.io>


>As I said in my email, because it's more or less standalone and it has the
>greatest utility outside of installers/builders/archivers/indexes.


Even if that were true, it doesn't mean that it's the *only* thing that's worth considering.

>I've looked at many other languages where they had widely successful
>packaging tools that weren't added to the standard lib until they were
>ubiquitous and stable. Something the new tools for Python are not. So I
>don't think adding it to the standard library is required.


As I said earlier, I'm not arguing for *premature* inclusion of distlib or anything else in the stdlib. I'm only saying that there's less likelihood that any one approach outside the stdlib will get universally adopted, leading to balkanisation.

>to reuse some of its functionality. So pointing towards setuptools just exposes
>the fact that improving it in the standard library was hard enough that it was
>done externally.


It seems that this approach wasn't taken for technical reasons, just as Distribute wasn't forked from setuptools for technical reasons.


>Well I am of the mind that the standard library is where software goes to die, and


No kidding? :-)

>want to use my software at all. A huge thing I've been trying to push for is decoupling
>packaging from a specific implementation so that we have a "protocol" (à la HTTP)
>and not a "tool" (à la distutils). However the allure of working to the implementation
>and not the standard is fairly high when there is a singular blessed implementation.


I'm not aware of this - have you published any protocols around the work you're doing on Warehouse, which Nick said was going to be the next-generation PyPI?

>It's funny you picked an example where improvements *couldn't* take place and
>the entire system had to be thrown out and a new one written. getopt had to become a
>new module named optparse, which had to become a new module named argparse.


I picked that example specifically to show that even if things go wrong, it's not the end of the world.

>You can gain interoperability in a few ways. One way is to just pick an implementation


If that were done, it wouldn't make any difference whether the thing picked were in the stdlib or not. But people have a tendency to roll their own stuff, whether there's a good technical reason or not.

>and make that the standard. Another is to define *actual* standards. The second
>one is harder, requires more thought and work. But it means that completely
>different software can work together. It means that something written in Ruby
>can easily work with a python package without shelling out to Python or without


That's exactly why there are all these packaging PEPs around, isn't it?

>And that's fine for a certain class of problems. It's not that useful for something
>where you want interoperability outside of that tool. How terrible would it be if
>HTTP was "well whatever Apache does, that's what HTTP is".


That wouldn't have been so terrible if you replaced "Apache" with "W3C", since you would have a reference implementation by the creators of the standard.



>A singular blessed tool in the standard library incentivizes the standard becoming
>an implementation detail. I *want* there to be multiple implementations written by
>different people working on different "slices" of the problem. That incentivizes doing
>the extra work on PEPs and other documents so that we maintain a highly documented
>standard. It's true that adding something to the standard library doesn't rule that out
>but it provides an incentive against properly doing standards because it's easier and
>simpler to just change it in the implementation.


Are you planning to produce any standards relating to PyPI-like functionality? This is important for the dependency resolution "slice", amongst others.

The flip side of this coin is that talking in the abstract without any working code is sub-optimal. It's reasonable for standards and implementations of them to grow together, because each informs the other, at least in the early stages. Most standards PEPs are accepted with a reference implementation in place.


>It's not blessed and a particular packaging project should use it if it fits their
>needs and they want to use it. Or they shouldn't use it if they don't want.
>Standards exist for a reason. So you can have multiple implementations that
>all work together.

That's true independent of whether one particular implementation of the standard is blessed in some way.


>I didn't make any claims as to its stability or the amount of testing that went into
>it. My ability to be convinced of that stems primarily from the fact that it's sort of
>a side piece of the whole packaging infrastructure and toolchain and it's also
>a piece that is most likely to be useful on its own.


But the arguments about agility and stability apply to any software - version handling doesn't get a special pass. Proper version handling is central to dependency resolution and is hardly a side issue, though it's not especially complicated.
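
To illustrate (just a toy sketch in plain Python, not distlib's or any other library's actual API), this is the kind of comparison a dependency resolver has to get right - naive string comparison orders versions incorrectly:

    # Hypothetical sketch: why version strings need real parsing before a
    # resolver can compare them. Plain string comparison gets it wrong.
    def parse_version(s):
        # Split a simple dotted version such as "1.10.2" into a tuple of ints.
        # (A real scheme also has to handle pre-releases, build tags, etc.)
        return tuple(int(part) for part in s.split("."))

    assert "1.10" < "1.9"                                 # lexicographic: wrong
    assert parse_version("1.10") > parse_version("1.9")   # numeric: right

    def satisfies(candidate, minimum):
        # A toy ">=" constraint check of the kind a resolver applies to the
        # candidates an index offers for a requirement.
        return parse_version(candidate) >= parse_version(minimum)

    available = ["1.4", "1.9", "1.10"]
    print([v for v in available if satisfies(v, "1.5")])  # ['1.9', '1.10']

Getting that ordering right, including the awkward cases, is exactly the part that needs to be stable and agreed on, whichever implementation ends up providing it.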


I'll just finish by reiterating that I think there should be some stdlib underpinning for packaging in general, that there should be some focus on exactly what that underpinning should be, and that I'm by no means saying that distlib is it. I consider distlib to be still in its early days, but showing some promise (and deserving of more peer review than it has received to date).

Regards,

Vinay Sajip

