Something I often want to do with easy_install is download an egg from PyPI without installing it. I've studied the easy_install documentation but never found a way to do it. Even giving it the "-d" option results in easy-install.pth being created, along with other unwanted stuff.
Looking at the setuptools pydoc I worked out a way to do it:
>>> import setuptools
>>> d = setuptools.Distribution()
>>> d.fetch_build_egg('SomePackage')  # 'SomePackage' is a placeholder name
Voila! The egg is downloaded into the cwd. It even seems to build an egg from a tarball.
My question is, can I rely on this feature and is it the best way of doing what I want? I'd like to use it in my code and hope it stays. It would be ideal if I could do this through easy_install.
This is as-documented. Per the docs:
"""If a package is built from a source distribution or checkout, it
will be extracted to a subdirectory of the specified directory."""
That is, --build-directory applies only to SVN checkouts (done by
easy_install itself) and source distributions (i.e. sdist zipfiles
or tarballs).
New submission from chris <cdcasey(a)gmail.com>:
Given a set of packages and their dependencies, process all dependencies, find
the intersection, and install dependencies based on the results. The current
behavior is to find the best match for each individual package. This can result
in breakages when different projects depend on different versions of a common
package.
title: [PATCH] Multiple dependency resolution
Added file: http://bugs.python.org/setuptools/file24/merge.patch
Setuptools tracker <setuptools(a)bugs.python.org>
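The intersection idea in the report above can be sketched with a toy model. This is not the attached patch; the specifier parsing here is invented and far simpler than what setuptools actually does:

```python
# Toy sketch of intersection-based resolution: instead of picking the
# best match for each requirement independently, collect *all* version
# constraints on a project and pick a version satisfying every one.
# Illustrative only -- real version/specifier handling is much richer.
import operator

OPS = {'>=': operator.ge, '<=': operator.le,
       '>': operator.gt, '<': operator.lt, '==': operator.eq}

def parse(spec):
    """Parse a constraint like '>=1.2' into (comparison, version tuple)."""
    for sym in ('>=', '<=', '==', '>', '<'):
        if spec.startswith(sym):
            ver = tuple(int(p) for p in spec[len(sym):].split('.'))
            return OPS[sym], ver
    raise ValueError('bad constraint: %r' % spec)

def satisfies(version, constraints):
    """True if `version` meets every constraint in the set."""
    v = tuple(int(p) for p in version.split('.'))
    return all(op(v, ver) for op, ver in map(parse, constraints))

def resolve(available, constraints):
    """Highest available version meeting the *intersection* of constraints."""
    ok = [v for v in available if satisfies(v, constraints)]
    if not ok:
        return None
    return max(ok, key=lambda v: tuple(int(p) for p in v.split('.')))

# Project A wants common >=1.0,<2.0; project B wants common >=1.5.
# Best-match-per-package might hand A version 1.9 and B version 2.1;
# the intersection gives one version that satisfies both.
print(resolve(['0.9', '1.4', '1.6', '2.1'], ['>=1.0', '<2.0', '>=1.5']))  # -> 1.6
```

If no version satisfies the intersection, the sketch returns None, which is where a real resolver would report a conflict instead of silently installing two incompatible versions.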
I've been trying to catch up on all the packaging discussions but
couldn't find the right place to reply, so thought I'd just do so here.
Probably the biggest thing that strikes me now is that
distutils/setuptools/distribute/pacman/whatever should aim to do much
less.
In fact, I get the feeling what we really need is a way for package
maintainers to provide the following metadata:
- where the docs are
- where the tests are and how they're run
- how anything not-python should be built
- what the dependencies are
(maybe even what the non-python dependencies are!)
- what version of the package this is
This should be in a build-tool independent fashion such that any build
tools, but especially those of operating system maintainers, can run
over the same metadata and build their packages.
The only other critical thing for me is that *all* of the above metadata
should be available post-install.
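A hypothetical, build-tool-independent metadata file along those lines might look like the sketch below. Every section and key name here is invented purely for illustration; no tool actually reads this format:

```ini
; Hypothetical package-metadata file -- all names invented for
; illustration; this is not a format any existing tool consumes.
[package]
name = example
version = 1.2.3

[docs]
location = docs/

[tests]
location = tests/
runner = python -m unittest discover tests

[build]
extensions = src/_speedups.c

[depends]
python = zope.interface, simplejson
system = libxml2
```

The point is only that the fields above are declarative data, so any build tool (including an OS packager) could read them without running arbitrary Python.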
With the above in place, we free up the evolution of build tools and let
the OS-specific packaging tools play nicely.
I think a good aim would also be to have some "one-way-to-do-it" python
tools too for:
- installing a package
- uploading a package to PyPI
- getting a package from PyPI
...without any silly big plugin system in the way, as distutils currently works.
What do other people feel?
Simplistix - Content Management, Zope & Python Consulting
> What do other people feel?
Open Standards. Standardizing data format rather than tools. Well
defined public PyPI API... of course I agree with you!
logilab.fr - services en informatique scientifique et gestion de connaissances
In order to continue the effort started here, we are organizing a
distutils "PEP sprint" with people who work with different distribution
tools. Some Python developers from the Debian world and a SCons
specialist will join. I'll bring the zc.buildout/setuptools point of view.
I would like to come up with enough material to write a meta-PEP and a
series of PEPs. These PEPs will then be submitted here for further
discussion.
Please join!
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
At 03:24 PM 9/25/2008 -0700, Guido van Rossum wrote:
> > Right now there's a momentum in the community, including framework gurus,
> > that are willing to work on a new distutils package. They are not core
> > developers, but they are really good in distribution matters. Even Phillip
> > Eby said that starting a new distutils could be a good pick in this
> > thread earlier.
> I wasn't there. I'd like to refer to a post by Joel Spolsky about the
> problem with total rewrites:
The economic factors are a bit different, here. Joel himself has
previously pointed out that where Netscape failed, Mozilla won -
i.e., the economics of open source can mean that it's sometimes
easier to get volunteers for a new project than for fixing an old
one, or at least for a project where dropping backward compatibility
is allowed (e.g. Py3K).
In the case of the distutils, the people who are capable of extending
it are tired of doing so, and the people who have the energy and time
are very unlikely to be able to work on it much without breaking
something significant. Distutils is also far too flexible in some
areas to be able to improve much while maintaining 100% backward
compatibility -- it doesn't enforce One Obvious Way To Do It.
What's more, there are very few people who've even said they like the
distutils API or think it's a good fit for the application
area. And, frankly, the domain knowledge embedded in the distutils
is of fairly limited scope and kind:
* Extension building, compile/link options and defines
* Wildly-differing installation path schemes across platforms
* Platform distribution formats like bdist_rpm, bdist_wininst, and bdist_msi
Most other things the distutils do are either well-specified,
obsolete (e.g. its internal logging and option parsing libs), or
probably not worth keeping (e.g. bdist_dumb).
> > Maybe a "distutils 2" project could start outside Python, using distutils
> > and setuptools code as legacy infrastructure, and taking back pieces of it.
> > Then it could be re-integrated in Python as a replacement for distutils
> > when it is mature?
> Only if much effort and study went into the planning of this re-integration.
That's why at least some of the discussion has been around
requirements gathering and PEP-writing as a first step.
My own inclination is that a scalable future for distutils means an
improved sdist format, the end of setup.py as a command-line
interface, and community-maintained platform-specific installation
tools that process source or binary distributions. Most complaints
about distutils (and setuptools, for that matter) are focused on
installation policy and preference issues. Making it possible and
practical for a variety of tools to flourish around a standardized
format (à la WSGI) seems like the way to go.
Notice that the existence of eggs has already allowed buildout,
virtualenv, and pyinstall to appear. But eggs don't handle
installing tests or documentation, and they have to be prebuilt by
platform. An improved sdist format, on the other hand, with
standardized layout and metadata would address all of those issues.
The tools for building this format and APIs for inspecting it would
be candidates for the stdlib, in much the same way that wsgiref was -
the spec is/should be stable, and those parts that are
compiler/install-layout specific will need to be maintained in the
stdlib anyway, for Python's own build infrastructure.
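As a rough picture only (this layout is entirely hypothetical; no such format was specified in the thread), an improved sdist with standardized layout and metadata might look like:

```
ExamplePackage-1.0/
    METADATA    <- name, version, dependencies, in one standard format
    docs/       <- where the docs are
    tests/      <- where the tests are, and how to run them
    src/        <- the code itself, including any non-Python pieces
```

With a layout like this fixed by specification, installing tests or documentation becomes a matter of copying known directories rather than something each build tool reinvents.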
In that sense, "distutils 2" would not be so much a rewrite of the
distutils as a separation of them into tools for distributing and
tools for installing, where some of the tools for installing may be
platform-specific.
That's the general idea, anyway.
(I'm going to be away from email for a few days, so I'll probably be
out of this thread 'till Tuesday.)
When thinking about compatibility, keep in mind the distinction
between two use cases:
1. You are using a tool (e.g. distutils or setuptools) to package
your work. You might consider switching to a new tool, either one
included in the Python Standard Library or a separately-shipped one,
to build, package, and distribute your software.
2. You are re-using other people's work that they have packaged and
distributed. You might consider switching to a new tool (either
Python Standard or separate) to re-use their work.
Backwards compatibility in the first case is not overwhelmingly
important. Some people will be willing to entirely rewrite their
setup.py scripts to use a new tool, other people will be willing to
use a new tool only if they can keep using their old setup.py
scripts, and still other people will continue to package and
distribute their software with Python 2.5 and the distutils that came
with it for the foreseeable future (let's say, for the next 5 years).
We can't prevent them from doing that, but also we don't need to
persuade them to change tools in order to benefit from their work.
Compatibility in the second case is overwhelmingly important. If you
offer me a new tool for re-using other people's source code, and this
new tool does *not* give me access to the thousands and thousands of
Python packages that are already out there and the dozens or hundreds
of new ones that are appearing every month, then this new tool is
completely uninteresting to me.
So we should focus on documenting and standardizing the metadata that
allows code re-use -- the interface between the author of one package
and the author of another package -- not the interface between the
author of a package and the tool that he uses to build and distribute
his own package.
We've already got a pretty good start on this -- the distutils in
Python 2.5 emits PKG-INFO and .egg-info in a format that is
understood by all of the current crop of packaging tools
(easy_install, pyinstall, distribute, yolk, bbfreeze, etc. etc. etc.).
Also of course setuptools produces .egg-info metadata in a way that
those tools can understand.
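PKG-INFO is an RFC 822-style block of headers, so the stdlib email parser is enough to read it back. A minimal sketch (the sample metadata below is made up for illustration):

```python
# PKG-INFO files are RFC 822-style header blocks, so the stdlib
# email parser can read the metadata fields back out.
from email.parser import Parser

sample = """\
Metadata-Version: 1.0
Name: example
Version: 1.2.3
Summary: A made-up package used only for this illustration
"""

meta = Parser().parsestr(sample)
print(meta['Name'], meta['Version'])  # -> example 1.2.3
```

This is the property that lets easy_install, yolk, bbfreeze and the rest all consume the same files without sharing any code.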
We also got past the problem that we had for awhile that Linux
distributions like Fedora and Debian were deleting those .egg-info
files. They don't do that anymore.
The next piece that is missing, from my experience in packaging Tahoe
and its numerous Python dependencies, is for more tools to start
emitting "requires" metadata in a compatible way. We already have a
de jure standard for how to spell "I require zope.interface" -- PEPs 314
and 345. However, I have never seen metadata of this format in the
wild, and for all I know there are no tools that actually produce or
consume this metadata and no packages that are actually labelled with it.
We also have a de facto standard that is widely used by a large and
growing library of packages and is supported by a large and growing
set of tools -- the way that setuptools spells "I require
zope.interface" in its .egg-info files.
So, I have a simple and urgent request:
Extend distutils so that when the author of a package passes
install_requires=['zope.interface'] to distutils.core.setup(), it
emits an entry named "requires" with body "zope.interface" in the
resulting .egg-info. That's all. It won't hurt, and it will
probably help quite a lot to facilitate interoperation of future
Python packaging tools.
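The two existing spellings of "I require zope.interface" can be sketched side by side. The helper functions below are illustrative only, not actual distutils or setuptools code:

```python
# Sketch of the two ways to record "I require zope.interface":
# the PEP 314 "Requires:" header that goes into PKG-INFO, and the
# requires.txt file that setuptools writes inside .egg-info.
# These helpers are illustrative, not real distutils code.

def pep314_headers(install_requires):
    """One 'Requires:' header line per requirement (PEP 314 style)."""
    return ''.join('Requires: %s\n' % req for req in install_requires)

def egg_info_requires_txt(install_requires):
    """One requirement per line (setuptools .egg-info/requires.txt style)."""
    return '\n'.join(install_requires) + '\n'

reqs = ['zope.interface']
print(pep314_headers(reqs))        # -> Requires: zope.interface
print(egg_info_requires_txt(reqs)) # -> zope.interface
```

The request above is simply that distutils emit the first form whenever a setup.py supplies the data for the second.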
http://allmydata.org -- Tahoe, the Least-Authority Filesystem
http://allmydata.com -- back up all your files for $5/month