[Catalog-sig] simple index and URLs extracted from metadata text fields

"Martin v. Löwis" martin at v.loewis.de
Sun Sep 13 12:05:13 CEST 2009


> $ easy_install hachoir-core
> Searching for hachoir-core
> Reading http://pypi.python.org/simple/hachoir-core/
> Reading http://hachoir.org/wiki/hachoir-core   <- this page doesn't
> exist anymore; that's an old home URL
> 
>      page, you're blocked for a while!
> 
> If we keep this behavior, the client side should be smarter.

I disagree. It's the package maintainer's task to make sure the
published URLs actually work.

If the maintainer fails to do so, I think users should be smarter and
stop using an unmaintained package. Failing that, they should specify
an explicit version. Failing that, they deserve to wait for the timeout.
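
(For reference, an explicit version requirement for easy_install is
spelled as below; the package name comes from the quoted log, and the
version number is purely illustrative:

    $ easy_install "hachoir-core==1.0"

The "==" pins the request to that one release rather than whatever
easy_install would otherwise pick as the best match.)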

> We are adding timeout handling in Distribute, and we will probably add
> a special option so it doesn't follow
> external links if some distributions were found at PyPI.
> 
> But we should find a way to remove dead links from PyPI imho.

There is: ask the maintainer of the package to fix the page.
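
(As for the timeout handling quoted above: it amounts to bounding the
fetch of each external page. A minimal sketch, assuming plain urllib2
rather than whatever Distribute actually implements, and with a purely
illustrative function name:

    import socket
    import urllib2

    def fetch_external_page(url, timeout=10):
        # Bound the fetch of an external home/download page so that a
        # dead host cannot block the install for the full TCP timeout.
        try:
            return urllib2.urlopen(url, timeout=timeout).read()
        except (urllib2.URLError, socket.timeout):
            # Treat an unreachable page as simply having no useful links.
            return None

The "special option" mentioned in the quoted message would then skip
such fetches entirely whenever PyPI itself already has a matching
distribution.)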

> Maybe by providing a proxy for all links? So PyPI can fall back to an
> empty page if the link is dead?

I really fail to see why this is a problem.

Regards,
Martin

