People want CPAN :-)

On Nov 7, 2009, at 3:20 AM, Ben Finney wrote:
Guido van Rossum <guido@python.org> writes:
On Fri, Nov 6, 2009 at 2:52 PM, David Lyon <david.lyon@preisshare.net> wrote:
I think buildbot-style test runs for PyPI packages would raise average package quality on PyPI.
Please excuse the cross-post, but I wanted to make sure that all these "CPAN for Python" discussions got this message; I've lost track of which part of which discussion happened on which list.
We are currently extending our distutils/Distribute test system to include installation of a broad range of packages, both as part of the pre-release process for a future release of Distribute and as part of our "smoke" test for distutils/Distribute. Eventually, the goal is to integrate this with our buildbot system, but that's a ways off.
Our goal is to install a range of packages and, where practicable, actually run the packages' individual test suites and record any errors.
Right now, our "smoke" test only does Twisted and numpy. We've discussed how to collect test results from Twisted trial and we'll be working on similar things for other test runners (nose et al.). For Twisted, we're going to install and test both the current release version and an svn checkout from trunk.
It would be an extension of that concept to install and test *all* packages from PyPI, but that would, obviously, take considerable horsepower (and time) to run such an exhaustive test (especially if we're talking about 2.4?, 2.5, 2.6, 2.7, and 3.1+).
Right now I'm extending the configuration file for our smoke test to allow for various test runners (e.g. nose, twisted trial, etc.) so we can "smoke out" more installation problems and/or failed tests after installation.
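To make that concrete, here's a rough sketch of the kind of per-package runner table I have in mind; the file name, package entries, and commands below are purely illustrative, not the actual smoke-test configuration:

    # smoke_runners.py -- illustrative sketch only; the real configuration
    # format may end up looking nothing like this.
    import subprocess

    # Map each package to the command line that runs its test suite.
    RUNNERS = {
        "Twisted": ["trial", "twisted"],                # Twisted's trial runner
        "numpy": ["python", "-c", "import numpy; numpy.test()"],
        "somepkg": ["nosetests", "somepkg"],            # hypothetical nose user
    }

    def run_suite(package):
        """Run the configured command and return its exit status."""
        return subprocess.call(RUNNERS[package])

    if __name__ == "__main__":
        for pkg in sorted(RUNNERS):
            status = run_suite(pkg)
            print("%s: %s" % (pkg, "passed" if status == 0 else "FAILED"))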
For the first pass, I'm just focusing on Twisted and trial, then numpy, then finding packages that support nose so that I can collect the data on what ran, what passed, and what didn't. I'm planning on collecting this all in a database and making some simple API so that it can be mined by very simple apps later.
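As a sketch of the "database plus simple API" part (the schema and function names here are assumptions for illustration, not a settled design):

    # results_db.py -- minimal sketch; schema and names are illustrative only.
    import sqlite3
    import time

    def open_db(path="smoke_results.db"):
        db = sqlite3.connect(path)
        db.execute("""CREATE TABLE IF NOT EXISTS results (
                          package TEXT,
                          version TEXT,
                          python  TEXT,     -- e.g. "2.6"
                          runner  TEXT,     -- e.g. "trial", "nose"
                          passed  INTEGER,  -- 1 = suite passed, 0 = failed
                          ran_at  REAL)""")
        return db

    def record(db, package, version, python, runner, passed):
        db.execute("INSERT INTO results VALUES (?, ?, ?, ?, ?, ?)",
                   (package, version, python, runner, int(passed), time.time()))
        db.commit()

    def failures(db, package):
        """The sort of query a very simple app could run later."""
        return db.execute("SELECT version, python, runner, ran_at FROM results"
                          " WHERE package = ? AND passed = 0",
                          (package,)).fetchall()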
At the point where that infrastructure is in place, we could pretty easily mine the data to do all kinds of crazy things people have mentioned like:
* A ranking system of test coverage
* Complexity analysis
* Test coverage
* Run pylint, pyflakes, 2to3, whatever automated measurement tools over the code
* Send test failure messages to maintainers (maybe with opt-in in the new meta-data)
* Whatever!
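As one toy example of that kind of mining, ranking packages by how often their recorded suites pass would be a single query against the hypothetical results table sketched above:

    # Illustrative only -- assumes the hypothetical "results" table above.
    def pass_rate_ranking(db, limit=20):
        """Packages ordered by the fraction of recorded runs that passed."""
        return db.execute(
            "SELECT package, AVG(passed) AS rate, COUNT(*) AS runs"
            " FROM results GROUP BY package ORDER BY rate DESC LIMIT ?",
            (limit,)).fetchall()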
We're actively working on this right now; anyone who wants to lend a hand is welcome to contact me off-list and we can talk about what kinds of things we need and where we could use a hand.
All in all, I think this could be a big leap forward for the Python distribution ecosystem whether or not we eventually write the PyPan I wished for as a new Perl refugee.
Thanks,
S

On Sat, Nov 7, 2009 at 9:30 AM, ssteinerX@gmail.com <ssteinerx@gmail.com> wrote:
[ snip ]
We are currently extending our distutils/Distribute test system to include installation of a broad range of packages as part of the pre-release process for a future release of Distribute and as part of our "smoke" test for distutils/Distribute. Eventually, the goal is to integrate this with our buildbot system but that's a ways off.
Who is "we"?

On Nov 7, 2009, at 10:08 AM, Jesse Noller wrote:
On Sat, Nov 7, 2009 at 9:30 AM, ssteinerX@gmail.com <ssteinerx@gmail.com> wrote:
[ snip ]
Who is "we"?
"We" is the people working on Distribute/distutils.
S

[ I'm posting this comment in reply to seeing this thread - http://thread.gmane.org/gmane.comp.python.distutils.devel/11359 - which has been reposted around, and I've read it. I lurk on this list in case anything comes up that I might be able to say something useful about. I don't know if this will be that, but it's my reason for posting. If this is the wrong place, my apologies; I don't sub to distutils-sig :-/ ]
On Sat, Nov 7, 2009 at 2:30 PM, ssteinerX@gmail.com <ssteinerx@gmail.com> wrote:
[ lots of snippage ] ...
All in all, I think this could be a big leap forward for the Python distribution ecosystem whether or not we eventually write the PyPan I wished for as a new Perl refugee.
Speaking as someone who left the Perl world for the Python world many years ago now, primarily due to working on one project, the thing I really miss about Perl is CPAN. It's not the fact that you know you just do perl Makefile.PL && make && make test && make install. Nor the fact that it's trivial to set up a skeleton package layout that makes that work for you. It's not the fact that there's an installer that can download & track dependencies.
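(As an aside, the rough Python counterpart of that skeleton is a minimal distutils setup.py; the sketch below is generic and the package name is made up.)

    # setup.py -- a bare-bones distutils skeleton, roughly the Python
    # counterpart of "perl Makefile.PL && make && make install".
    from distutils.core import setup

    setup(name="flibble",
          version="0.1",
          description="An example package",
          packages=["flibble"])

    # Typical use:
    #   python setup.py build
    #   python setup.py install
    # Note: plain distutils has no standard "test" command (setuptools later
    # added one), which is part of why test running varies so much per project.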
The thing that makes the difference, IMHO, comes down to two points:
* In a language whose core ethos is "There's more than one way to do it", packaging is the one place where there is one, and only one, obvious way to do it. (Oddly, with Python this is flipped for packaging - do I, as a random project, use distutils? pip? setuptools? distribute? virtualenv?)
* It has a managed namespace - or, perhaps better, a co-ordinated namespace.
CPAN may have lots of ills and bad aspects (I've never really trusted the auto-installer, having seen one too many people have their whole Perl installation upgraded because of a bug that was squashed 6-8 years ago), but these two points are pretty much killer.
All the other aspects - auto download, upload, dependency tracking, auto doc extraction for the website, etc. - really follow from the managed namespace. I realise that various efforts like easy_install & distribute & friends take that step implicitly, since there can only be one http://pypi.python.org/pypi/flibble . But it's not quite the same, due to externally hosted packages.
For more detail about this aspect, see: http://www.cpan.org/modules/04pause.html#namespace
I mention this because I didn't see it listed, and I think it's very easy to underestimate this aspect of CPAN.
IMHO, it is what matters most about CPAN. The fact that they've nabbed the CTAN idea of an archive network for storing, mirroring and grabbing stuff from is, by comparison, /almost/ irrelevant. The managed namespace is the sort of thing that makes DBI::DBD-type stuff simple to use, because of the encouragement to talk and share a namespace.
The biggest issue is retrofitting this to an existing world.
Personal opinion; I hope it's useful. Going back into lurk mode now (I hope :). If this annoys you, please just ignore it.
Michael.
participants (3)
- Jesse Noller
- Michael Sparks
- ssteinerX@gmail.com