[Distutils] setuptools-0.4a2: Eggs, scripts, and __file__
rtomayko at gmail.com
Tue Jun 14 13:04:35 CEST 2005
On Jun 13, 2005, at 7:51 PM, Phillip J. Eby wrote:
> At 06:00 PM 6/13/2005 -0400, Ryan Tomayko wrote:
>> So a single package using require() can cause a snowball effect where
>> many other packages would need to be upgraded to egg format as well.
>> In time, this may be a good thing because it could accelerate
>> adoption of eggs but for the time being it makes it really hard to
>> use require().
> I'm not seeing how this is any different than if you just started
> requiring a newer version of a package. I mean, if 'yum' needed a
> newer version of elementtree, it would force an upgrade. So why
> can't you just rely on a later "port number"? ISTM that most
> packaging systems have something like '-1' or 'p1' or
> 'nb1' (NetBSD) tagged on a revision to identify changes in the
> packaging or platform-specific patches applied. Couldn't you use
> that to make your 'yum' RPM depend on egg-packaged versions of its
> dependencies?
Sure. It's just that down-level cascading upgrades like this are
generally discouraged if they can be avoided. The legacy.py approach
or a mechanism like it would be preferable until such time as a
general policy for packaging with eggs can be devised by the
distribution. It's not a show stopper, just a mild concern.
> I understand you're saying it's a big problem, but the truth is
> that relatively few existing Python packages have a lot of
> dependencies; the dependency tree of the 237 darwinports is
> probably extremely flat. The problems today of depending on
> anything are such that few people do; this makes it relatively
> simple for the maintainer of a single port to just go ahead and
> upgrade the dependencies, too (organizational issues notwithstanding).
Perhaps you're right. The nice thing is that in the case of libraries
(elementtree for example) no upstream changes are needed to the code
and the package can be eggified by the packager. This makes the whole
thing a bit less of an issue.
> But I am obviously no expert in these matters, so I defer to you
> here. I'm just saying that distribution packages that depend on
> more than one or two other packages are rare in Python today, and
> the things that do get depended on, tend to be frequently used, so
> when you do port a dependency, it significantly reduces the number
> of dependencies that *need* to be ported. Thus, I think that
> although the problem appears huge in potential, I think that the
> actual interconnectedness of the packages is probably quite small.
I guess we'll find out in the coming months as maintainers become
more aware of the advantages of egg-based development and deployment.
Like I was saying earlier, this may be to our advantage, as packages
moving to eggs will nudge others in that direction as well.
>>> Or maybe this could be done by metadata -- you put a legacy.py file
>>> in your egg-info, and when processing your egg's dependencies, if
>>> pkg_resources can't find a package you need, it would call a
>>> function in legacy.py that would check for the dependency using
>>> setuptools' inspection facilities, and return a path and a guess at
>>> a version number.
>>> How does that sound?
>> That would solve my problem perfectly.
> I'll give this some thought for the 0.5/0.6 releases, then.
> Interestingly enough, this technique could possibly give someone
> the opportunity to do things like look for dynamic link libraries
> or headers, check operating system versions, etc.
Yeah. I was thinking the same thing. Having a "package preload" area
could be useful in a variety of ways.
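To sketch the idea, here is roughly what such a legacy.py hook dropped
into an egg's metadata directory might look like. The function name and
return convention are my assumptions; no actual interface has been
specified yet:

```python
# Hypothetical legacy.py shipped in an egg's EGG-INFO directory.
# The hook name and return convention are assumptions; setuptools
# has not defined this interface.
import os
import sys


def find_legacy_package(name):
    """Look for a non-egg install of `name` on sys.path.

    Returns a (path, version_guess) tuple if the package is found,
    or None if it is absent.
    """
    for entry in sys.path:
        candidate = os.path.join(entry, name)
        if os.path.isdir(candidate):
            # Conservative version guess; a real hook might parse
            # the package's __version__ attribute or its PKG-INFO.
            version = getattr(__import__(name), "__version__", "0.0")
            return candidate, version
    return None
```

pkg_resources could call this only after its own egg lookup fails, so
eggified installs would always win over legacy ones.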
>>> I'm not sure exactly what you're trying to do here. If you just
>>> want to know if your script is running from a development location
>>> (and therefore needs to call require() to set up dependencies),
>>> couldn't you just check for 'MyPackage.egg-info' in the sys.path
>>> (script) directory?
>>> import sys, os
>>> if os.path.isdir(os.path.join(sys.path[0], "MyPackage.egg-info")):
>>>     from pkg_resources import require
>>>     require("MyPackage")  # ensures dependencies get processed
>>> If this is what you want, perhaps we can create a standard recipe
>>> in pkg_resources, like maybe 'script_package("MyPackage")', that
>>> only does the require if you're a development egg and not being run
>>> from run_main().
> You didn't answer this, by the way.
I'm hoping this won't be necessary with legacy.py or some variation
thereof. I wasn't trying to detect a development environment so much
as to detect whether my package was egg-managed or managed by some
external package manager. The idea was to limit require() calls to
cases where my package was an egg or had egg metadata; otherwise,
assume some external package manager (or manual setup) is responsible
for setting up sys.path.
>> I'd be happy to advocate to / work with packagers once we get a basic
>> set of best practices together. It seems like there are a lot of
>> options here - we just need to iron out the details.
> Yeah; I think that basically the best approach for packaging
> systems will be to run EasyInstall during install and uninstall to
> modify the easyinstall.pth file. I also think that if in Python
> 2.5 we can change the bdist_* commands to create packages this way,
> then that should help, too.
I think so too. Wide changes to packages are not unexpected when the
Python version changes, so if this made it into 2.5, I might even be
able to convince fedora-devel packagers to move Python packages to
eggs, or at least provide .egg-info directories and calls to
EasyInstall, for Fedora Core 5. I'm not very close to the BSD ports or
other packaging communities, but I wouldn't have a problem advocating
the same to them if things went well with Fedora.