Re: [Python-Dev] PEP 365 (Adding the pkg_resources module)
At 10:48 AM 3/19/2008 -0700, Guido van Rossum wrote:
I don't understand PyPI all that well; it seems poor design that the browsing via keywords is emphasized but there is no easy way to *search* for a keyword (the list of all packages is not emphasized enough on the main page -- it occurs in the side bar but not in the main text). I assume there's a programmatic API (XML-RPC?) but I haven't found it yet.
http://wiki.python.org/moin/CheeseShopXmlRpc There's also a REST API that setuptools uses: http://peak.telecommunity.com/DevCenter/EasyInstall#package-index-api The API was originally designed for screen-scraping an older version of PyPI, but that has been replaced with a "lite" version served from: http://pypi.python.org/simple/ The "lite" version is intended for tools such as easy_install to process, as it consists strictly of links and can be statically cached. Zope Corp., for example, maintains a static mirror of this API, to guard themselves against PyPI outages and slowdowns, since their buildouts can involve huge numbers of eggs, both their own and external dependencies.
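For reference, the "simple" index pages really are nothing but anchor tags, so extracting the candidate download links takes only the stdlib. A minimal sketch (the inlined sample page is illustrative, not fetched live from PyPI):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href on a PyPI 'simple'-style index page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value:
                    self.links.append(value)

# A page like http://pypi.python.org/simple/<project>/ consists strictly
# of links like these, which is why it can be statically cached/mirrored:
sample_page = """
<html><body>
<a href="setuptools-0.6c8-py2.5.egg">setuptools-0.6c8-py2.5.egg</a><br/>
<a href="setuptools-0.6c8.tar.gz">setuptools-0.6c8.tar.gz</a><br/>
</body></html>
"""
parser = LinkExtractor()
parser.feed(sample_page)
print(parser.links)
```

In real use you would fetch the page with urllib and feed the response body to the same parser; nothing about the format requires more than this.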
I'd love it if you could write or point me to code that takes a package name and optional version and returns the URL for the source archive, and the type (in case it can't be guessed from the filename or the Content-type header).
You can probably do that with the XML-RPC API. There's a function to get the versions of a package, given a (case-sensitive) name, and there's a function to get information for uploaded archives, given a name and a version. I originally intended to use it for the PEP 365 approach, but you can get the necessary information in just one static roundtrip using the REST (/simple) HTML API, if you're willing to parse the URLs for version information. (The catch of course being that distutils source distributions don't have unambiguously parseable filenames.)
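To make the "parse the URLs for version information" step concrete: for eggs, the filename convention makes it mechanical. The sketch below assumes a hyphen-free project name, which is precisely the ambiguity that bites with arbitrary distutils sdist filenames:

```python
def parse_egg_link(href):
    """Split an egg filename into (project, version) -- a rough sketch.

    Assumes the usual name-version[-pyX.Y[-platform]].egg convention and
    that the project name itself contains no hyphen.  Arbitrary sdist
    filenames offer no such guarantee, which is why they can't be parsed
    unambiguously.
    """
    base = href[:-len('.egg')] if href.endswith('.egg') else href
    pieces = base.split('-')
    version = pieces[1] if len(pieces) > 1 else None
    return pieces[0], version

print(parse_egg_link('setuptools-0.6c8-py2.5.egg'))  # ('setuptools', '0.6c8')
```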
Hm. Why not just use the existing convention for running setup.py after unpacking? This works great in my experience, and has the advantage of having an easy fallback if you end up having to do this manually for whatever reason.
Because I want bootstrap-ees to be able to use the bootstrap mechanism. For example, I expect at some point that setuptools will use other, non-self-contained packages, and other package managers such as zc.buildout et al also want to depend on setuptools without bundling it.
* calling the bootstrap module 'bootstrap', as in 'python -m bootstrap projectname optionalversion'. The module would expose an API to allow it to be used programmatically as well as the command line, so that bootstrapped packages can use the bootstrap process to locate dependencies if they so desire. (Today's package management tools, at least, are all based on setuptools, so if it's not present they'll need to download that before beginning their own bootstrapping process.)
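A rough sketch of the command-line surface being proposed here (the module name and this tiny API are only the proposal under discussion, not anything that exists):

```python
def parse_args(argv):
    """Parse 'python -m bootstrap projectname [version]' style arguments.

    Hypothetical sketch of the PEP 365-style bootstrap entry point; the
    same function doubles as the programmatic API so that bootstrapped
    packages could reuse it to locate their own dependencies.
    """
    if not argv or len(argv) > 2:
        raise SystemExit('usage: python -m bootstrap projectname [version]')
    project = argv[0]
    version = argv[1] if len(argv) > 1 else None
    return project, version

# command-line use would pass sys.argv[1:]; programmatic use is direct:
print(parse_args(['SQLObject', '0.10']))  # ('SQLObject', '0.10')
```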
This sounds like going beyond bootstrapping. My vision is that you use the bootstrap module (with the command line you suggest above) once to install setuptools or the alternate package manager of your choice, and then you can use easy_install (or whatever alternative) to install the rest.
Well, I noticed that the other package managers were writing bootstrap scripts that then download setuptools' bootstrap script and run it as part of *their* bootstrap process... and then I got to thinking that it sure would be nice for setuptools to not have to be a giant monolithic download if I wanted to start using other packages in it... and that it sure would be nice to get rid of all these bootstrap scripts downloading other bootstrap scripts... and then I wrote PEP 365. :) One other thing that PEP 365 does for these use cases that your approach doesn't, is that pkg_resources could detect whether a desired package of a usable version was *already* installed, and skip it if so. So, we've already scaled back the intended use cases quite a bit, as people will have to write their own "is it already there?" and "is it the right version?" checks.
Without one or the other, the bootstrap tool would have to grow a version parsing scheme of some type, and play guessing games with file extensions. (Which is one reason I limited PEP 365's scope to downloading eggs actually *uploaded* to PyPI, rather than arbitrary packages *linked* from PyPI.)
There are two version parsers in distutils, referenced by PEP 345, the PyPI 1.2 metadata standard.
Yes, and StrictVersion doesn't parse release candidates. And neither LooseVersion nor StrictVersion supports handling multiple pre/post-release tags correctly. (E.g. "1.1a1dev-r2753")
So, if I had to propose something right now, I would be inclined to propose:
* using setuptools' version parsing semantics for interpretation of alpha/beta/dev/etc. releases
Can you point me to the code for this? What is its advantage over distutils.version?
It implements version comparison semantics that are closer to programmer expectations. It has also been far more widely used and exposed to more feedback. distutils.version, as far as I know, is really only used by the PEP 345 metadata standard -- which isn't used by *any* automated tools as far as I know, and I'm not sure how many packages bother declaring it.

In addition to alpha/beta/candidate/dev versions, it also supports post-release (patchlevel) tags such as svn revision or date-based tags.

Here is the code; the docstring is actually longer than the bits that do anything:

    def parse_version(s):
        """Convert a version string to a chronologically-sortable key

        This is a rough cross between distutils' StrictVersion and
        LooseVersion; if you give it versions that would work with
        StrictVersion, then it behaves the same; otherwise it acts like a
        slightly-smarter LooseVersion. It is *possible* to create
        pathological version coding schemes that will fool this parser,
        but they should be very rare in practice.

        The returned value will be a tuple of strings.  Numeric portions
        of the version are padded to 8 digits so they will compare
        numerically, but without relying on how numbers compare relative
        to strings.  Dots are dropped, but dashes are retained.  Trailing
        zeros between alpha segments or dashes are suppressed, so that
        e.g. "2.4.0" is considered the same as "2.4". Alphanumeric parts
        are lower-cased.

        The algorithm assumes that strings like "-" and any alpha string
        that alphabetically follows "final" represents a "patch level".
        So, "2.4-1" is assumed to be a branch or patch of "2.4", and
        therefore "2.4.1" is considered newer than "2.4-1", which in turn
        is newer than "2.4".

        Strings like "a", "b", "c", "alpha", "beta", "candidate" and so on
        (that come before "final" alphabetically) are assumed to be
        pre-release versions, so that the version "2.4" is considered
        newer than "2.4a1".

        Finally, to handle miscellaneous cases, the strings "pre",
        "preview", and "rc" are treated as if they were "c", i.e. as
        though they were release candidates, and therefore are not as new
        as a version string that does not contain them, and "dev" is
        replaced with an '@' so that it sorts lower than any other
        pre-release tag.
        """
        parts = []
        for part in _parse_version_parts(s.lower()):
            if part.startswith('*'):
                if part < '*final':   # remove '-' before a prerelease tag
                    while parts and parts[-1] == '*final-':
                        parts.pop()
                # remove trailing zeros from each series of numeric parts
                while parts and parts[-1] == '00000000':
                    parts.pop()
            parts.append(part)
        return tuple(parts)

    component_re = re.compile(r'(\d+ | [a-z]+ | \. | -)', re.VERBOSE)
    replace = {'pre':'c', 'preview':'c', '-':'final-', 'rc':'c',
               'dev':'@'}.get

    def _parse_version_parts(s):
        for part in component_re.split(s):
            part = replace(part, part)
            if not part or part == '.':
                continue
            if part[:1] in '0123456789':
                yield part.zfill(8)    # pad for numeric comparison
            else:
                yield '*' + part
        yield '*final'  # ensure that alpha/beta/candidate are before final

To check a parse_version() value for stability, you can just loop over it looking for any part < "*foo", where "foo" is the desired minimum stability. That is, if you find a '*a' and you don't want alphas, then this version's no good. This lets you also distinguish a beta that you might accept from an in-development snapshot of that beta, which you wouldn't.
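That stability check can be sketched as follows. parse_version and its helpers are repeated here so the example runs standalone; the at_least() helper is illustrative, not part of pkg_resources:

```python
import re

component_re = re.compile(r'(\d+ | [a-z]+ | \. | -)', re.VERBOSE)
replace = {'pre': 'c', 'preview': 'c', '-': 'final-', 'rc': 'c',
           'dev': '@'}.get

def _parse_version_parts(s):
    for part in component_re.split(s):
        part = replace(part, part)
        if not part or part == '.':
            continue
        if part[:1] in '0123456789':
            yield part.zfill(8)    # pad for numeric comparison
        else:
            yield '*' + part
    yield '*final'  # ensure that alpha/beta/candidate are before final

def parse_version(s):
    parts = []
    for part in _parse_version_parts(s.lower()):
        if part.startswith('*'):
            if part < '*final':   # remove '-' before a prerelease tag
                while parts and parts[-1] == '*final-':
                    parts.pop()
            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == '00000000':
                parts.pop()
        parts.append(part)
    return tuple(parts)

def at_least(version, stability):
    """True unless some part of `version` is less stable than `stability`.

    E.g. at_least(v, '*c') accepts release candidates and final releases
    but rejects alphas, betas, and dev snapshots.  (Illustrative helper,
    not a pkg_resources API.)
    """
    return not any(part < stability for part in parse_version(version))

print(at_least('2.4rc1', '*c'))         # candidate: acceptable
print(at_least('2.4b2dev-r123', '*b'))  # dev snapshot of a beta: rejected
```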
What's wrong with just running "setup.py install"? I'd rather continue existing standards / conventions. Of course, it won't work when setup.py requires setuptools;
Actually, it will, if the setup script uses the current ez_setup bootstrapping method for setuptools. However, I'd like to get *rid* of that bootstrapping method, and replace it with this one. That's why I'd prefer that the bootstrap approach use a different entry point for launching, and why I want the module to expose an API, and why I don't really want the bootstrapper to actually "install" anything.

For one thing, it means dealing with installation *options*. Your prototype doesn't pass through any command-line options to the script, so people would have to use a ~/.pydistutils.cfg file in order to control the installation options, for example. (Which then can break if the packager included a setup.cfg that was supposed to be overridden on the command line...)

Probably this seems a lot more messy to me, because I've had my face directly planted in the mess for a number of years now, and I know that, for example, people bitched and moaned excessively about not being able to use --prefix with easy_install, the way they could with 'setup.py install'. And maybe my experiences aren't all relevant here; I'm just not very good at turning them off.

My skepticism for the setup.py-based approach is at close to "new scheme for removing the GIL" level, because I've gone through a lot of pain to get easy_install from the stage where it looked a lot like your bootstrap prototype, to something that actually works, most of the time, for arbitrary distutils packages. :) And unfortunately, some of the hurdles will require a few release cycles to show up.

And hey, if you're okay with that, cool. I just think that as soon as it gets out in the field, people will use it far outside anything we expect it to be used for, and if there's not a bright line for the *packager* to cross, I think we'll have people unhappy with the tool. If you have to do a special step to make something bootstrappable, then when the tool doesn't work, the user will ask the packager to take the special step.
However, if the tool allows the user to *point* it at any package, and it randomly (from the user's POV) fails, then the tool (and Python) will be blamed for the failure. Because even though the bootstrap tool is "not a package manager", if it's close enough to look like "a simpler easy_install", people will try to use it as one, and blog about how bootstrap is broken and should support installation options, etc.

(I suppose at this point easy_install is something of a counter-example to this worry; people can and do now give packagers patches to make their setup scripts more compatible with easy_install, in cases where the package does extensive distutils modification. OTOH, easy_install is a de facto standard, where bootstrap will be de jure. What does that mean in practice? Heck if I know. :) I guess people will hate on you instead of me, then, so maybe I should view that as an improvement. :) (It also makes it easier to understand your reluctance to be in any way associated with eggs, but there's a big difference between eggs and easy_install, and IMO your approach leans more towards the relative vices of easy_install than the relative virtues of eggs. But oh well.))
On Wed, Mar 19, 2008 at 12:54 PM, Phillip J. Eby
On Mar 19, 2008, at 3:23 PM, Guido van Rossum wrote:
If other people want to chime in please do so; if this is just a dialog between Phillip and me I might incorrectly assume that nobody besides Phillip really cares.
I really care. I've used setuptools, easy_install, eggs, and pkg_resources extensively for the past year or so (and contributed a few small patches). There have been plenty of problems, but I find them to be overall useful tools.

It is a great boon to a programming community to lower the costs of re-using other people's code. The Python community will benefit greatly once a way to do that becomes widely enough accepted to reach a tipping point and become ubiquitous. Setuptools is already the de facto standard, but it hasn't become ubiquitous, possibly in part because of "egg hatred", about which more below.

I've interviewed several successful Python hackers who "hate eggs" in order to understand what they hate about them, and I've taken notes from some of these interviews. (The list includes MvL, whose name was invoked earlier in this thread.) After filtering out yer basic complaining about bugs (which complaints are of course legitimate, but which don't indict setuptools as worse than other software of comparable scope and maturity), their objections seem to fall into two categories:

1. "The very notion of package dependency resolution and programmable or command-line installation of packages at the language level is a bad notion."

This can't really be the case. If the existence of such functionality at the programming language level were an inherently bad notion, then we would be hearing some complaints from the Ruby folks, where the Gems system is standard and ubiquitous. We hear no complaints -- only murmurs of satisfaction. One person recently reported to me that while there are more packages in Python, he finds himself re-using other people's code more often when he works in Ruby, because almost all Ruby software is Gemified, but only a fraction of Python software is Eggified.

Often this complaint comes with the idea that eggs conflict with their system-level package management tools. (These are usually Debian/Ubuntu users.)
Note that Ruby software is not too hard to include in operating system packaging schemes -- my Ubuntu Hardy apt-cache shows plenty of Ruby software. A sufficiently mature and widely supported setuptools could actually make it easier to integrate Python software into Debian -- see stdeb [1].

2. "Setuptools/eggs give me grief."

What can really be the case is that setuptools causes a host of small, unnecessary problems for people who prefer to do things differently than PJE does. Personally, I prefer to use GNU stow, and setuptools causes unnecessary, but avoidable, problems for me. Many people object (rightly enough) to a "./setup.py install" automatically fetching new software over the Internet by default. The fact that easy_install creates a site.py that changes the semantics of PYTHONPATH is probably the most widely and deservedly hated example of this kind of thing [2]. I could go on with a few other common technical complaints of this kind.

These type-2 problems can be fixed by changing setuptools, or they can be grudgingly accepted by users, while retaining compatibility with the large and growing ecosystem of eggy software. Certainly fixing setuptools to play better with others is a more likely path to success than setting out to invent a non-egg-compatible alternative. Such a project might never be implemented well enough to serve; if it were, it would probably never overtake eggs' lead in the Python ecosystem; and if it did, it would probably not turn out to be a better tool.

So, since you asked for my chime, I advise you to publicly bless eggs, setuptools, and easy_install as plausible future standards and solicit patches which address the complaints. For that matter, soliciting specific complaints would be a good start. I've done so in private many times with only partial success as to the "specific" part. One promising approach is to request objections in the form of automated tests that setuptools fails, e.g. [3].
Regards,

Zooko O'Whielacronx

[1] http://stdeb.python-hosting.com/
[2] http://www.rittau.org/blog/20070726-02 And no, PJE's suggested "trivial fix" does not satisfy the objectors, as it can't support the use case of "cd somepkg ; python ./setup.py install ; cd .. ; python -c 'import somepkg'".
[3] http://twistedmatrix.com/trac/ticket/2308#comment:5
On 20/03/2008, zooko
On Mar 19, 2008, at 3:23 PM, Guido van Rossum wrote:
If other people want to chime in please do so; if this is just a dialog between Phillip and me I might incorrectly assume that nobody besides Phillip really cares.
I really care. I've used setuptools, easy_install, eggs, and pkg_resources extensively for the past year or so (and contributed a few small patches). There have been plenty of problems, but I find them to be overall useful tools.
I'll chime in here, too. I really want to like setuptools/easy_install, but I don't. I'll try to be specific in my reasons, in the hope that they can be addressed. I know some of these are "known about", but one of my meta-dislikes of setuptools is that known issues never seem to get addressed (I know, patches accepted, but I haven't got the time either...)

1. No integration with the system packager (Windows, in my case). If I do easy_install nose, then nose does not show up in add/remove programs. That significantly affects the way I manage my PC.

2. No uninstaller. After easy_install nose, how do I get rid of it later? Searching for files to delete (even if there are only a few obviously named ones) is not good enough.

3. The pkg_resources documentation (in particular, that's the one I've tried to follow) is extremely hard to read. Partly this is just style, but it's partly because it is couched in very unfamiliar terms (distributions, working sets, interfaces, providers, etc). It's also *huge*. A tutorial style overview, supported by API detail, would be far better.

4. Hard to use with limited connectivity. At work, I *only* have access to the internet via Internet Explorer (MS based proxy). There are workarounds, but ultimately "download an installer, then run it" is a far simpler approach for me.

5. Auto-discovery doesn't always work. I'm sorry, I really can't recall the example at the moment, but sometimes easy_install says it can't find a package I *know* is available.

6. Splitting the community. Windows users rely heavily on binary installers (at least, I do). We're starting to get a situation where some projects provide .egg files, and some provide traditional (bdist_wininst/bdist_msi) installers. This is bad.
One way to do it, and all that :-) But if these problems are solved, then I have no problem with seeing the features of setuptools added to the standard library - resource APIs, plugin/entry point APIs, ways to create executable scripts, and such things *should* be standardised. Dependency resolution and automatic installation isn't something I like (probably because as a Windows user I've never used such a system, so I mistrust it) but if it works *with* the system and not against it, I don't mind. I hope this helps, Paul.
On 09:33 am, p.f.moore@gmail.com wrote:
I'll chime in here, too. I really want to like setuptools/easy_install, but I don't. I'll try to be specific in my reasons, in the hope that they can be addressed. I know some of these are "known about", but one of my meta-dislikes of setuptools is that known issues never seem to get addressed (I know, patches accepted, but I haven't got the time either...)
I agree with almost everything that Paul says, and he put it quite well, so I'll spare the "me too", but I do have some additional gripes to add.

setuptools (or, for that matter, distutils) has no business installing stuff in the system directory on a Linux box with a package manager. The *major* benefit I can see to a tool like easy_install is providing a feature that system packagers do not: allowing developers to quickly pull down all their dependencies into a *user directory* without worrying about system administration. However, not only does setuptools not do this on its own, it actively fights me when I try to do it.

Admittedly, my user directory is a little messed up. Combinator, the Divmod path management / developer deployment tool, does some inadvisable things to attempt to trick distutils into doing local installation. However, setuptools does have some pretty clear bugs in this area; for example, it hard-codes a copy of a list that's present in site.py to try to figure out where .pth files will be honored, rather than looking at what's actually on sys.path.

Every time I've tried to install a package for development using setuptools - and I am speaking across a wide range of versions here, so this isn't a specific bug report - it's either emitted a completely inscrutable traceback or printed an error message telling me that it couldn't or wouldn't install to the provided location.
But if these problems are solved, then I have no problem with seeing the features of setuptools added to the standard library - resource APIs, plugin/entry point APIs, ways to create executable scripts, and such things *should* be standardised. Dependency resolution and automatic installation isn't something I like (probably because as a Windows user I've never used such a system, so I mistrust it) but if it works *with* the system and not against it, I don't mind.
This is more of a vague impression than a specific technical criticism, but it really seems like the implementation of these features faces a lot of unnecessary coupling in setuptools. Twisted (Hey, did you guys know I work on Twisted? It seems I hardly ever mention it!), for example, implements resource APIs (twisted.python.modules), plugins (twisted.plugin, which is a bit like some of the uses of entrypoints), and the zip-file agnosticism of both (twisted.python.zipstream) without needing any packaging metadata or ini files. It just introspects the Python path and adds a little frosting to importers.

I could be wrong about setuptools' actual design; this could be a documentation or UI issue, because I haven't read the code. But when interacting with it as a user and perusing its API, it definitely seems as though things are woven too tightly together, and the whole thing is very centered around the concept of a "build", i.e. running some kind of compilation or deployment step with a setup.py. One of my favorite things about Python is that stuff just runs without needing that normally. I know that "setup.py develop" is supposed to avoid that - but even that itself is a deployment step which generates some metadata. With the setuptools-free libraries I use, it's just check out, then run; no other intermediary step. I'm spoiled, of course, having apt to do the package-management work for me on the majority of my dependencies, and Combinator mostly handling the rest.

easy_install also definitely has problems with security. It automatically downloads links from plain-HTTP connections, some of them, I believe, from publicly editable wiki pages, and installs them with no verification of signatures (as root! because how else are you going to get them to the only supported installation directory!). I believe that this is possibly the easiest issue to fix, though, and I hope that my information here is already out of date.
I realize that people are already doing this (insecure installation) with their web browsers, but there are tons of UI cues in a web browser looking at a link on a wiki page which you don't get from an automated command-line tool.

As others have said, I wanted to like setuptools. I wanted to believe we could be saved from the unfortunate issues with distutils. But the extremely high degree of frustration I've encountered every time I've tried to use it, its general inscrutability, its awful UI ("python -c "import setuptools; execfile('setup.py')" develop", seriously? you couldn't write a command-line tool to make that look like 'setuptool develop' or something?) and now the availability of my own libraries which do the things in setuptools that interest me, have served to strongly discourage me from trying to closely inspect or fix it. I just kind of vaguely hope that it will be overhauled if it's ever really considered for inclusion in the standard library, and I try not to think too hard about it. I'm not actively opposing it, for those who want to use it - c.f. http://twistedmatrix.com/trac/ticket/1286 - but it definitely doesn't work for me.

Just for the record: I wrote my own zip-file-friendly resource-loading library after trying to use setuptools. I had to get some code working on an embedded device, and it needed to load Twisted plugins (which predate setuptools by a long while, I believe - or at least I didn't know about setuptools at the time). setuptools somehow simultaneously broke the path requirements of the Twisted plugin system and blew my memory budget. I attempted to investigate but didn't get far - it was quite a lot easier to just write some libraries that performed one task at a time rather than trying to manage the whole deployment.
Paul Moore wrote:
I'll chime in here, too. I really want to like setuptools/easy_install, but I don't. I'll try to be specific in my reasons, in the hope that they can be addressed. I know some of these are "known about", but one of my meta-dislikes of setuptools is that known issues never seem to get addressed (I know, patches accepted, but I haven't got the time either...)
Thanks for feedback from the Windows world, from whence I have been blissfully exiled now for years.
1. No integration with the system packager (Windows, in my case). If I do easy_install nose, then nose does not show up in add/remove programs. That significantly affects the way I manage my PC.
Point taken. Of course, it isn't really a "program" at that point: it is an installed "add-on" to Python. However, if Windows users expect such add-ons to show up in the "system" list, that is good to know. I'll note that I use easy_install *only* to work in *non-system* locations: if I want to install Python packages to /usr/lib/python2.x/, I use the standard system installer, e.g. 'apt-get install python-frobnatz'. But I routinely create non-system Python environments for development, using either alternate Pythons or virtualenv: in those environments, it works very well for me.
2. No uninstaller. After easy_install nose, how do I get rid of it later? Searching for files to delete (even if there are only a few obviously named ones) is not good enough.
People ask for this on Unix platforms as well, often adding a request that packages installed only as dependencies of the package-being-removed go away as well. If you install everything in a way that works with the system package manager, of course, you don't need this. ;)

Deleting the 'lib/python2.x/site-packages/foo-X.Y.Z.egg' directory is all that is actually required to uninstall an egg that was previously added via easy_install. Cleaning out the equivalent entry in 'easy_install.pth' in that directory is not strictly required.

I wonder if a GUI for managing the add-ons would fit the bill, as an alternative to packaging them as though they were standalone programs?
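What that manual uninstall amounts to can be sketched in a few lines. The egg name and directory layout below are made up for the demonstration, and this ignores complications a real uninstaller would face (zipped eggs are single files, and there may be generated scripts and dependencies to deal with):

```python
import os
import shutil
import tempfile

def remove_egg(site_packages, egg_name):
    """Delete an .egg installed by easy_install and clean its .pth entry.

    A sketch only: removes the .egg directory (or file), then drops any
    matching line from easy_install.pth in the same directory.
    """
    egg_path = os.path.join(site_packages, egg_name)
    if os.path.isdir(egg_path):
        shutil.rmtree(egg_path)
    elif os.path.exists(egg_path):
        os.remove(egg_path)
    pth = os.path.join(site_packages, 'easy_install.pth')
    if os.path.exists(pth):
        with open(pth) as f:
            lines = [line for line in f if egg_name not in line]
        with open(pth, 'w') as f:
            f.writelines(lines)

# demonstration against a throwaway directory:
sp = tempfile.mkdtemp()
os.mkdir(os.path.join(sp, 'nose-0.10.1-py2.5.egg'))
with open(os.path.join(sp, 'easy_install.pth'), 'w') as f:
    f.write('./nose-0.10.1-py2.5.egg\n./other-1.0-py2.5.egg\n')
remove_egg(sp, 'nose-0.10.1-py2.5.egg')
remaining = open(os.path.join(sp, 'easy_install.pth')).read()
print(remaining)  # only the 'other' entry survives
```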
3. The pkg_resources documentation (in particular, that's the one I've tried to follow) is extremely hard to read. Partly this is just style, but it's partly because it is couched in very unfamiliar terms (distributions, working sets, interfaces, providers, etc). It's also *huge*. A tutorial style overview, supported by API detail, would be far better.
Many of those terms are distutils jargon, actually. I think Jeff Rush's recent work looks like a good start here.
4. Hard to use with limited connectivity. At work, I *only* have access to the internet via Internet Explorer (MS based proxy). There are workarounds, but ultimately "download an installer, then run it" is a far simpler approach for me.
I don't know how to make this requirement compatible with using shared dependencies, except to make it easier for folks to download *all* the requirements, and later install from the local "distribution cache" (a directory full of .zip / .egg / .tgz files). It does turn out to be quite easy to build a PyPI-style "simple" index for such a cache. Your use case would then require:

1. Run some command to fetch the desired package and the transitive closure of its dependencies into a working directory (the cache).

2. Run another command to build an index for that directory.

3. Run 'easy_install', pointing to the local index.
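Step 2 is simple enough to sketch. The directory layout and link format below just mimic the shape of the /simple/ pages, and the naive grouping by the part of the filename before the first hyphen is an assumption (it breaks for hyphenated project names):

```python
import os
import tempfile

def build_simple_index(cache_dir):
    """Build a PyPI-style /simple/ index over a directory of archives.

    A sketch: groups files by project (the part before the first '-')
    and writes simple/<project>/index.html pages of plain links, the
    same shape easy_install expects from the real index.
    """
    projects = {}
    for fname in os.listdir(cache_dir):
        if fname.endswith(('.egg', '.zip', '.tar.gz', '.tgz')):
            project = fname.split('-')[0]
            projects.setdefault(project, []).append(fname)
    simple = os.path.join(cache_dir, 'simple')
    for project, files in projects.items():
        page_dir = os.path.join(simple, project)
        os.makedirs(page_dir, exist_ok=True)
        links = ''.join('<a href="../../%s">%s</a><br/>\n' % (f, f)
                        for f in sorted(files))
        with open(os.path.join(page_dir, 'index.html'), 'w') as out:
            out.write('<html><body>\n%s</body></html>\n' % links)
    return projects

# demonstration with a throwaway cache directory:
cache = tempfile.mkdtemp()
for fname in ('foo-1.0.tar.gz', 'foo-1.1-py2.5.egg', 'bar-2.0.zip'):
    open(os.path.join(cache, fname), 'w').close()
index = build_simple_index(cache)
print(sorted(index))  # ['bar', 'foo']
```

Step 3 would then be something along the lines of "easy_install -i file:///path/to/cache/simple/ foo", using easy_install's index-URL option.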
5. Auto-discovery doesn't always work. I'm sorry, I really can't recall the example at the moment, but sometimes easy_install says it can't find a package I *know* is available.
Usually this indicates that there are incompatible dependencies between packages already installed and those on the index. E.g., if I already have package foo installed, but its version is not compatible with the requirements for package bar, then I can't install bar, even though the distribution is "available." Because PyPI is not a centrally-managed index of packages, such conflicts are pretty much inevitable over time for those who don't subset it in some form (what we've been calling the "known good set" strategy in Zope-land).
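A toy model of that failure mode (this helper is hypothetical; a real resolver compares parsed version requirements rather than literal sets of acceptable versions):

```python
def find_conflicts(installed, requirements):
    """Report requirements that an already-installed version violates.

    `installed` maps project -> installed version; `requirements` maps
    project -> set of acceptable versions.  Illustrates why a package can
    be "available" on the index yet still be uninstallable.
    """
    conflicts = []
    for project, acceptable in requirements.items():
        have = installed.get(project)
        if have is not None and have not in acceptable:
            conflicts.append((project, have, acceptable))
    return conflicts

# bar needs foo 1.1 or 1.2, but foo 1.0 is already installed,
# so bar can't be installed even though it's on the index:
print(find_conflicts({'foo': '1.0'}, {'foo': {'1.1', '1.2'}}))
```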
6. Splitting the community. Windows users rely heavily on binary installers (at least, I do). We're starting to get a situation where some projects provide .egg files, and some provide traditional (bdist_wininst/bdist_msi) installers. This is bad. One way to do it, and all that :-)
If it weren't for the "Add / Remove Programs" requirement you mentioned above, we would be better off if authors of pure Python packages uploaded only 'sdist' distributions, which can be cleanly converted to platform-local eggs at install time, even on Windows. Packages which contain C extensions typically must upload the 'bdist_wininst' version for the benefit of the vast majority of Windows users who can't build the extensions locally. Uploading any other binary distribution is pretty much a lose, because the underlying platform dependencies (UCS2 vs UCS4, i386 vs x64, framework vs. universal vs. MacPorts vs. Fink, etc.) lead to combinatorial explosions and/or segfaults. Better to let the installer fetch the source and build it locally.
But if these problems are solved, then I have no problem with seeing the features of setuptools added to the standard library - resource APIs, plugin/entry point APIs, ways to create executable scripts, and such things *should* be standardised. Dependency resolution and automatic installation isn't something I like (probably because as a Windows user I've never used such a system, so I mistrust it) but if it works *with* the system and not against it, I don't mind.
I hope this helps,
Very much, thanks.

Tres.
--
Tres Seaver          +1 540-429-0999          tseaver@palladion.com
Palladion Software   "Excellence by Design"   http://palladion.com
Tres Seaver wrote:
Point taken. Of course, it isn't really a "program" at that point: it is an installed "add-on" to Python. However, if Windows users expect such add-ons to show up in the "system" list, that is good to know.
Are things really that different in the non-Windows worlds? If I want python-nose, I run "sudo apt-get install python-nose" (and that means I can always remove it with "sudo apt-get remove ..."). Seems more similar than different (ignoring the silliness of Microsoft's insistence on "the GUI is the OOWTDI" even for such administrative tasks as installing system-wide software). -- Bob Kline http://www.rksystems.com mailto:bkline@rksystems.com
Bob Kline wrote:
Are things really that different in the non-Windows worlds? If I want python-nose, I run "sudo apt-get install python-nose" (and that means I can always remove it with "sudo apt-get remove ..."). Seems more similar than different (ignoring the silliness of Microsoft's insistence on "the GUI is the OOWTDI" even for such administrative tasks as installing system-wide software).
I was going to -- pointedly -- drop in here the help output
for msiexec, which is the commandline version of the MSI
installation graphical stuff. Only... when I did msiexec /?,
the result was that a Window popped up with the information
in it. (Sort of agrees with your point a bit!)
Still, here's the info (cut-and-pasted from that window):
-----
Windows ® Installer. V 3.01.4000.1823
msiexec /Option <Required Parameter> [Optional Parameter]
Install Options
At 09:44 AM 3/20/2008 -0400, Tres Seaver wrote:
I don't know how to make this requirement compatible with using shared dependencies, except to make it easier for folks to download *all* the requirements, and later install from the local "distribution cache" (a directory full of .zip / .egg / .tgz files). It does turn out to be quite easy to build a PyPI-style "simple" index for such a cache. Your use case would then require:
1. Run some command to fetch the desired package and the transitive closure of its dependencies into a working directory (the cache).
2. Run another command to build an index for that directory.
3. Run 'easy_install', pointing to the local index.
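For concreteness, step 2 above can be sketched in a few lines of Python. This is an illustrative sketch only, not the actual tool Tres describes: the function name is made up, and a real PyPI "simple" index uses one subdirectory per project rather than a single flat page of links.

```python
import os

def build_simple_index(cache_dir):
    """Write a PyPI 'simple'-style index.html into cache_dir, linking
    every archive so easy_install -i file://... can discover them."""
    archives = sorted(
        f for f in os.listdir(cache_dir)
        if f.endswith((".zip", ".egg", ".tar.gz", ".tgz"))
    )
    links = "\n".join('<a href="%s">%s</a><br/>' % (f, f) for f in archives)
    html = "<html><body>\n%s\n</body></html>\n" % links
    with open(os.path.join(cache_dir, "index.html"), "w") as fh:
        fh.write(html)
    return archives
```

One would then point easy_install at the cache with something like `easy_install -i file:///path/to/cache/ SomePackage` (step 3), with no network access required.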
Actually, if someone were to develop a patch for PyPI to do this, we could perhaps have a "display download dependencies" link for eggs shown on PyPI. That way, someone who wants to do a manual download could get a page with links for all the required eggs, and manually download them. (Of course, the other alternative would be for someone to provide an IE-controlling extension to urllib2 so that easy_install wouldn't be proxy-bound on such machines.)
Actually, if someone were to develop a patch for PyPI to do this, we could perhaps have a "display download dependencies" link for eggs shown on PyPI. That way, someone who wants to do a manual download could get a page with links for all the required eggs, and manually download them.
Just to make this position a bit more official (as one of the PyPI maintainers): it would be fully within the scope of PyPI to integrate dependency tracking into its database, and present it in any form that is desired. Any such feature would have to be contributed. Regards, Martin
I'll note that I use easy_install *only* to work in *non-system* locations: if I want to install Python packages to /usr/lib/python2.x/, I use the standard system installer, e.g. 'apt-get install python-frobnatz'.
This is probably not the Windows way of doing things (at least not how I use Windows). Windows doesn't really have the notion of "system location" (or multiple levels of them, where \Windows and \Windows\system32 is "more system" than \Program Files, say).

Windows users typically view the entire system as "theirs", and have no concerns at all about putting things into Program Files, system32, or, for that matter, \python25. In fact, they don't care or even know where stuff ends up - they expect that the system will find it, and they expect that they can remove anything they installed even without knowing where it is - because there is a standard place to look for uninstalling things.

Of course, setuptools is not the only piece of software that doesn't play well, so Windows users collect all kinds of cruft over time. Eventually, C: will run out of disk space, and they either get a new machine, or reinstall from scratch.
I wonder if a GUI for managing the add-ons would fit the bill, as an alternative to packaging them as though they were standalone programs?
On Windows, it is fairly easy to have an uninstaller registered. There are wrappers for managing that (such as MSI), but it's really only a set of registry keys that needs to get written at installation time, one of them being the command to run on uninstallation.

Assuming that you uninstall the package before uninstalling Python, that uninstall program could be a Python script (although using a cmd.exe batch file would probably be more resilient).

The concern with "you just need to delete the folder" is "how am I supposed to know that? and can I be really sure?". If you run the official uninstall procedure, and it messes things up, you can complain to setuptools, or to the package author, that uninstallation "doesn't work". If you delete stuff manually, and you forgot to remove something in a remote location you didn't even know existed, you still think it's your own fault. So people are hesitant to actually execute the procedure.

Of course, once you *do* provide an entry to "Add/Remove Programs", uninstalling won't be mere deletion, as mere deletion would still leave these registry keys behind (although Windows has grown more resilient over time and provides cleanup in that case: I believe it offers to remove the ARP entry if the uninstall program has been removed).

Regards, Martin
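To make Martin's point concrete, the registry entries in question live under the standard Add/Remove Programs ("Uninstall") hive. A hypothetical entry for a bdist_wininst-style installation might look like the following (the value names DisplayName and UninstallString are the standard ones; the product name and paths here are invented for illustration):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\nose-py2.5]
"DisplayName"="Python 2.5 nose-0.10"
"UninstallString"="\"C:\\Python25\\Removenose.exe\" -u \"C:\\Python25\\nose-wininst.log\""
```

Writing these two values at install time is all it takes for the package to appear in Add/Remove Programs; the command named by UninstallString is what Windows runs on uninstallation.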
On Mar 20, 2008, at 11:31 AM, Martin v. Löwis wrote:
I'll note that I use easy_install *only* to work in *non-system* locations: if I want to install Python packages to /usr/lib/python2.x/, I use the standard system installer, e.g. 'apt-get install python-frobnatz'.
This is probably not the Windows way of doing things (at least not how I use Windows). Windows doesn't really have the notion of "system location" (or multiple levels of them, where \Windows and \Windows\system32 is "more system" than \Program Files, say).
Windows users typically view the entire system as "theirs", and have no concerns at all about putting things into Program Files, system32, or, for that matter, \python25. In fact, they don't care or even know where stuff ends up - they expect that the system will find it, and they expect that they can remove anything they installed even without knowing where it is - because there is a standard place to look for uninstalling things.
While these observations are accurate for most home users, it is worth noting that many IT departments deploy locked-down versions of Windows that either have fine-grained group policies to forbid modifications to the system disk (and require the user to write things to a mounted network home directory), or that give write access to the system disk but then re-image it upon reboot. IT departments that deploy this sort of setup usually have the "hostile user" mentality, and that is strongly correlated, in turn, with users who are reluctant to engage IT to allow them to install an app. We have run into this a few times, and it would be good to keep this scenario in mind. -Peter
Martin v. Löwis wrote:
I'll note that I use easy_install *only* to work in *non-system* locations: if I want to install Python packages to /usr/lib/python2.x/, I use the standard system installer, e.g. 'apt-get install python-frobnatz'.
This is probably not the Windows way of doing things (at least not how I use Windows). Windows doesn't really have the notion of "system location" (or multiple levels of them, where \Windows and \Windows\system32 is "more system" than \Program Files, say).
Windows users typically view the entire system as "theirs", and have no concerns at all about putting things into Program Files, system32, or, for that matter, \python25. In fact, they don't care or even know where stuff ends up - they expect that the system will find it, and they expect that they can remove anything they installed even without knowing where it is - because there is a standard place to look for uninstalling things.
Of course, setuptools is not the only piece of software that doesn't play well, so Windows users collect all kinds of cruft over time. Eventually, C: will run out of disk space, and they either get a new machine, or reinstall from scratch.
I wonder if a GUI for managing the add-ons would fit the bill, as an alternative to packaging them as though they were standalone programs?
On Windows, it is fairly easy to have an uninstaller registered. There are wrappers for managing that (such as MSI), but it's really only a set of registry keys that needs to get written at installation time, one of them being the command to run on uninstallation.

Assuming that you uninstall the package before uninstalling Python, that uninstall program could be a Python script (although using a cmd.exe batch file would probably be more resilient).

The concern with "you just need to delete the folder" is "how am I supposed to know that? and can I be really sure?". If you run the official uninstall procedure, and it messes things up, you can complain to setuptools, or to the package author, that uninstallation "doesn't work".

If you delete stuff manually, and you forgot to remove something in a remote location you didn't even know existed, you still think it's your own fault. So people are hesitant to actually execute the procedure.

Of course, once you *do* provide an entry to "Add/Remove Programs", uninstalling won't be mere deletion, as mere deletion would still leave these registry keys behind (although Windows has grown more resilient over time and provides cleanup in that case: I believe it offers to remove the ARP entry if the uninstall program has been removed).
Regards, Martin
I'll note that I use easy_install *only* to work in *non-system* locations: if I want to install Python packages to /usr/lib/python2.x/, I use the standard system installer, e.g. 'apt-get install python-frobnatz'.
This is probably not the Windows way of doing things (at least not how I use Windows). Windows doesn't really have the notion of "system location" (or multiple levels of them, where \Windows and \Windows\system32 is "more system" than \Program Files, say).
Windows users typically view the entire system as "theirs", and have no concerns at all about putting things into Program Files, system32, or, for that matter, \python25. In fact, they don't care or even know where stuff ends up - they expect that the system will find it, and they expect that they can remove anything they installed even without knowing where it is - because there is a standard place to look for uninstalling things.
In point of fact, for an *end user* it makes increasing sense to use application installers that automatically install a correct-version interpreter and all dependencies in a stand-alone manner (i.e. explicitly *not* sharing anything with any other installed application). This makes uninstall much easier, as the lack of external dependencies eases version lock-step problems. It would pain me, as a computer scientist, to do this, but I honestly believe it may be the way forward -- just think, it wouldn't even matter whether an application (and all its extension modules) had been built with VS2003, VS2008 or Mingw.

People misunderstood when Mike Driscoll started to provide pure-Python modules as Windows installers, but increasingly your naive Windows programmer is going to be happier doing that. I'm not sure whether that provides easy_f**king_uninstall (Zed Shaw will live on in my memory for that particular PyCon moment), but it ought to be relatively easy to do so. Extension modules for programmers still offer more of a challenge, but a build-farm for extension module writers could help there.
Of course, setuptools is not the only piece of software that doesn't play well, so Windows users collect all kinds of cruft over time. Eventually, C: will run out of disk space, and they either get a new machine, or reinstall from scratch.
As someone who just gave away a Windows laptop because I'd become sick of the sight of it I sing amen to that.
I wonder if a GUI for managing the add-ons would fit the bill, as an alternative to packaging them as though they were standalone programs?
On Windows, it is fairly easy to have an uninstaller registered. There are wrappers for managing that (such as MSI), but it's really only a set of registry keys that needs to get written at installation time, one of them being the command to run on uninstallation.

Assuming that you uninstall the package before uninstalling Python, that uninstall program could be a Python script (although using a cmd.exe batch file would probably be more resilient).

The concern with "you just need to delete the folder" is "how am I supposed to know that? and can I be really sure?". If you run the official uninstall procedure, and it messes things up, you can complain to setuptools, or to the package author, that uninstallation "doesn't work".
I quite agree. I believe it might help if we clearly distinguished between "developer install", to add required functionality to a specific Python installation (and here distutils is almost good enough, setuptools begins to seem like overkill), and "application install" to add a bundle of executable functionality to a computer for an end-user.
If you delete stuff manually, and you forgot to remove something in a remote location you didn't even know existed, you still think it's your own fault. So people are hesitant to actually execute the procedure.

Of course, once you *do* provide an entry to "Add/Remove Programs", uninstalling won't be mere deletion, as mere deletion would still leave these registry keys behind (although Windows has grown more resilient over time and provides cleanup in that case: I believe it offers to remove the ARP entry if the uninstall program has been removed).
We need to stop protesting that our installation tools are easy enough and try to get behind the various platforms, be it with Windows installers, rpms, or other support. We probably aren't doing this because it's work nobody particularly relishes, and has relatively low visibility in the developer world. Non-developer Python programmers and end-users would thank us, though. regards Steve
-On [20080320 19:24], Steve Holden (steve@holdenweb.com) wrote:
We need to stop protesting that our installation tools are easy enough and try to get behind the various platforms, be it with Windows installers, rpms, or other support. We probably aren't doing this because it's work nobody particularly relishes, and has relatively low visibility in the developer world. Non-developer Python programmers and end-users would thank us, though.
Installing Perl through FreeBSD's ports system also provides a Perl
module called 'bsdpan', which registers every module as a package under
FreeBSD's package system.
Normally ports installs modules as p5-ModuleName, but now it becomes:
/var/db/pkg/bsdpan-B-Lint-1.09
And from that point on I can use the pkg* tools.
Quite elegant in my opinion.
--
Jeroen Ruigrok van der Werven
"Tres Seaver"
On 21/03/2008, Terry Reedy
However, this Windows user, and I expect most, do NOT expect add-ons (things under the /Pythonx.y tree) to show up in the add/remove list.
That's an interesting counterpoint to my comments. I presume from this that you dislike (and/or never use) bdist_msi and bdist_wininst precisely because they do add such items to the add/remove programs list? My argument is essentially that bdist_wininst set a de facto standard for this. Then, bdist_msi followed it. Now setuptools is trying to change that standard by ignoring it, rather than by discussion of the pros and cons. Personally, I like the current approach, but that's less relevant.
The standard (and to me, preferable) way of dealing with such things is to have an 'installation manager' that can reinstall as well as delete and that has a check box for various things to delete. This is what Python needs.
I'd dispute strongly that this is a "standard". It may be preferable, but I'm not sure where you see evidence of it being a standard. Could I also point out that *if* such a standard is set up for Python, bdist_wininst and bdist_msi should be modified to follow it. Otherwise, it's not a standard, more of competing approach. As you can see, my main concern is for consistency :-) Paul.
On Mar 20, 2008, at 7:44 AM, Tres Seaver wrote:
Paul Moore wrote:
4. Hard to use with limited connectivity. At work, I *only* have access to the internet via Internet Explorer (MS based proxy). There are workarounds, but ultimately "download an installer, then run it" is a far simpler approach for me.
I don't know how to make this requirement compatible with using shared dependencies,
We've done something like this. The http://allmydata.org project bundles its easy_installable dependencies. If you get the current trunk from our darcs repository [1], or get a release tarball or a snapshot tarball from [2], then it comes with a directory named "misc/dependencies" which has the source tarballs of our easy_installable dependencies. You can browse this directory on the web: [3].

Therefore, if you manually satisfy the non-easy_installable dependencies, you can download an allmydata.org tarball, disconnect from the Internet (which we call "moving to a Desert Island"), and install it. This is, as you say, "compatible with using shared dependencies" because setuptools will detect if you already have sufficiently new versions of some of these dependencies installed (for example, if they are installed in Debian packages), and then skip the step of installing that dependency from its source tarball.

The remaining dependencies that cannot be satisfied automatically by our setup.py are listed in install.html [4]. They are:

1. g++ >= v3.3 -- the Cygwin version of gcc/g++ works for Cygwin and for Windows
2. GNU make
3. Python >= v2.4.2, including development headers, i.e. "Python.h"
4. Twisted >= v2.4.0 -- from the Twisted "sumo" source tarball
5. OpenSSL >= v0.9.7, including development headers
6. PyOpenSSL == v0.6
7. Crypto++ >= v5.2.1, including development headers

I am hoping that in the future Twisted (see twisted #1286 [5]) and pyOpenSSL will be easy_installable, and that our use of setuptools plugins will eventually replace our GNUmakefile and thus remove our dependency on GNU make. That will leave only g++, Python, OpenSSL, and Crypto++ as dependencies that a user has to manually deal with in order to build allmydata.org from source.
Regards, Zooko [1] http://allmydata.org/source/tahoe/trunk/ [2] http://allmydata.org/source/tahoe/tarballs/ [3] http://allmydata.org/trac/tahoe/browser/misc/dependencies [4] http://allmydata.org/source/tahoe/trunk/docs/install.html [5] http://twistedmatrix.com/trac/ticket/1286
Paul Moore wrote:
On 20/03/2008, zooko
wrote:
I'll chime in here, too. I really want to like setuptools/easy_install, but I don't. I'll try to be specific in my reasons, in the hope that they can be addressed. I know some of these are "known about", but one of my meta-dislikes of setuptools is that known issues never seem to get addressed (I know, patches accepted, but I haven't got the time either...)
Clearly explained problems with the existing arrangement are valuable, as well as patches. Thanks for taking the time to help us see your viewpoint.
1. No integration with the system packager (Windows, in my case). If I do easy_install nose, then nose does not show up in add/remove programs. That significantly affects the way I manage my PC.
Part of this stems from stretching of the original mission of setuptools, to install modules for Python, into a general-purpose application installation tool. The buildout tool is more suited for application installation, although not ideal yet. In your scenario, what happens when one egg pulls in another and another, until you have a hundred entries in your add/remove menu? And you don't understand the inter-relationship of those, so you cannot do a clean uninstall? Similarly, what do you want to appear in that add/remove menu when you are using independent sandboxes with various applications in them, some of which are accessible only to specific users who are not admins on that box?
3. The pkg_resources documentation (in particular, that's the one I've tried to follow) is extremely hard to read. Partly this is just style, but it's partly because it is couched in very unfamiliar terms (distributions, working sets, interfaces, providers, etc). It's also *huge*. A tutorial style overview, supported by API detail, would be far better.
We'll get better docs. Of course, that module contains roughly five different sets of functionality, some of which can be used independently of the others, so it's not just one API.
4. Hard to use with limited connectivity. At work, I *only* have access to the internet via Internet Explorer (MS based proxy). There are workarounds, but ultimately "download an installer, then run it" is a far simpler approach for me.
This is hard to address using independent eggs with setuptools, but it fits buildout, which provides for deployment of a set of related eggs as a single entity. I'll add it as a use case and see what we can do though.
5. Auto-discovery doesn't always work. I'm sorry, I really can't recall the example at the moment, but sometimes easy_install says it can't find a package I *know* is available.
I've hit a few of these myself, although it wasn't an issue with the auto-discovery mechanism but rather quality problems with PyPI itself. Some of the eggs only had binary distributions provided, and they were not for my platform so couldn't be used. Better error messages in this area would help, with encouragement to nag the original author to provide better data on PyPI.
6. Splitting the community. Windows users rely heavily on binary installers (at least, I do). We're starting to get a situation where some projects provide .egg files, and some provide traditional (bdist_wininst/bdist_msi) installers. This is bad. One way to do it, and all that :-)
Reporting and author nagging facilities built into PyPI could help encourage more consistent behavior. -Jeff
On 20/03/2008, Jeff Rush
Paul Moore wrote:
On 20/03/2008, zooko
wrote: 1. No integration with the system packager (Windows, in my case). If I do easy_install nose, then nose does not show up in add/remove programs. That significantly affects the way I manage my PC. Part of this stems from stretching of the original mission of setuptools, to install modules for Python, into a general-purpose application installation tool. The buildout tool is more suited for application installation, although not ideal yet.
In your scenario, what happens when one egg pulls in another and another, until you have a hundred entries in your add/remove menu? And you don't understand the inter-relationship of those so you cannot do a clean uninstall?
I don't let it. As I've said elsewhere, I prefer to manage dependencies myself, manually. Anything with that many dependencies shouldn't be using the system Python, in my view. It should be packaged as a standalone application (py2exe style) and as such have a single add/remove entry (and no effect on the system Python).
Similarly, what do you want to appear in that add/remove menu when you are using independent sandboxes with various applications in them, some of which are accessible only to specific users who are not admins on that box?
Independent sandboxes isn't a concept I can relate to under Windows. That doesn't mean it's not possible (I don't know if it is); I just don't have any useful comment to make, beyond saying that I personally don't care what happens in that situation. Paul.
"Jeff Rush"
At 10:18 PM 3/19/2008 -0600, zooko wrote:
The fact that easy_install creates a site.py that changes the semantics of PYTHONPATH is probably the most widely and deservedly hated example of this kind of thing [2].
Yep, this was an unfortunate side effect of eggs growing outside their original ecological niche. Without the 'site' hack, it was impossible to install eggs to user directories and avoid installation conflicts. Specifically, if someone installed a package to PYTHONPATH with the distutils, and then installed a later version using setuptools, the setuptools-installed version would always end up on sys.path *after* the distutils-installed version. Detecting this condition and handling it properly was a major problem for users of easy_install, who wanted it to "just work". Standardization of a PEP 262-style installation database is still needed to address these problems, not to mention uninstallation. Maybe now with some package manager folks paying some attention here, we can do something about that.
[2] http://www.rittau.org/blog/20070726-02 And no, PJE's suggested "trivial fix" does not satisfy the objectors, as it can't support the use case of "cd somepkg ; python ./setup.py install ; cd .. ; python -c 'import somepkg'".
Well, it replaces the hack being complained about, with the problem that the hack was introduced to fix. :) Again, to properly fix this, we need a metadata standard for who owns what packages -- and it should probably include information about the *tool* that did the installation, so that system packagers can either tell Python-level tools to keep their hands off, or tell Python how to run the tool in question.
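As an illustration of the kind of metadata standard Phillip describes, an installation database could be as small as one record per project, naming the installing tool and the files it owns; uninstallation then falls out of the database. The file format and function names below are my own sketch, not PEP 262's actual specification:

```python
import json
import os

def record_install(db_dir, project, installer, files):
    """Write an installation record: which tool installed which files.

    A system packager could inspect the 'installer' field and tell
    Python-level tools to keep their hands off, or vice versa.
    """
    rec = {"project": project, "installer": installer, "files": files}
    with open(os.path.join(db_dir, project + ".json"), "w") as fh:
        json.dump(rec, fh)

def uninstall(db_dir, project):
    """Remove every file the record owns, then the record itself."""
    path = os.path.join(db_dir, project + ".json")
    with open(path) as fh:
        rec = json.load(fh)
    for f in rec["files"]:
        if os.path.exists(f):
            os.remove(f)
    os.remove(path)
    return rec["installer"]
```

With such records in place, a distutils install and a later setuptools install of the same project would each be visible to the other, which is exactly the conflict-detection problem the 'site' hack was papering over.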
On Wed, 2008-03-19 at 22:18 -0600, zooko wrote:
1. "The very notion of package dependency resolution and programmable or command-line installation of packages at the language level is a bad notion."
This can't really be the case. If the existence of such functionality at the programming language level were an inherently bad notion, then we would be hearing some complaints from the Ruby folks, where the Gems system is standard and ubiquitous. We hear no complaints -- only murmurs of satisfaction.
Okay then, just to fill out your sample -- as the maintainer of a Python library which is ported to Ruby, I complain equally about eggs and gems. This isn't really the place for it, but as near as I can tell, the use of gems requires you to know whether the user has installed your dependency in the system install or through a gem *at the time you write your code*, so you know whether to write "require 'dep'" or "require 'rubygems'; gem 'dep'". This is, IMHO, even worse than the "setuptools breaks PYTHONPATH" complaint you cited.
Note that Ruby software is not too hard to include in operating system packaging schemes -- my Ubuntu Hardy apt-cache shows plenty of Ruby software.
Yes, but that software is not installed using the gem management system, as I confirmed in a recent conversation with my package manager while we were talking about http://bugs.debian.org/470282 , a quirk which was hopefully a one-time API breakage, but certainly has not endeared me to rubygems any further. I'm sure we could find other people to complain if we look around a little more. I know I have commiserators out there.

But, stepping back a bit: you're right in believing that it is nigh impossible to distribute Ruby software without providing gems. So much of your userbase expects it, especially when you're distributing a library which their applications will in turn depend on, because *their* users will expect gems, and they need to be able to use gems to install the dependency.

setuptools seems to perform slightly better here, as, by merely making sure my PyPI entry has a reachable download_url, my package seems to be available for installation by setuptools users. Nonetheless, I get a recurring stream of requests for egg distribution from people who believe eggs have manifest destiny, and as we heard recently, that "the controversy is over." Meanwhile, I beg their continued forgiveness for being hesitant to require my users to use something not in the standard library for something as fundamental as "setup.py install." These folks are the same who gave me bug reports when I put a .tar.bz2 link in my PyPI entry, because apparently -- even though bz2 extraction has been a feature of GNU tar for years -- setuptools (which uses the standard library tarfile module) on some platforms cannot uncompress bz2 packages.

The conclusion I am trying to reach here is this: as a Python package maintainer, I have no idea what the hell to do to satisfy my users, from those who are using Python 2.3 and have no desire for any new packaging or import semantics, to those who don't mind having a new ez_setup downloaded on install.
The people who have found advantages to using the egg-based distribution system are not going away. Providing something in the standard library will provide clear guidance for me, and relieve me of the fear that I am pushing surprising (<cough>.pth</cough>) or non-standard installation behavior on my users. so, I hope you work something out. Love, - Kevin
Guido van Rossum wrote:
On Wed, Mar 19, 2008 at 12:54 PM, Phillip J. Eby
wrote: [a long message] I'm back at Google and *really* busy for another week or so, so I'll have to postpone the rest of this discussion for a while. If other people want to chime in please do so; if this is just a dialog between Phillip and me I might incorrectly assume that nobody besides Phillip really cares.
I care, a lot, enough to have volunteered to help with maintenance / development of setuptools back in September 2007. I think that, warts and all, setuptools is a *huge* improvement over bare distutils for nearly every use case I know about.

A lot of setuptools warts are driven by related design problems in the distutils, such as the choice to use imperative / procedural code for everything: a declarative approach, with hooks for cases which actually need them (likely 5% of existing packages), would have made writing tools on top of the framework much simpler. It is ironic that Python is *too powerful* a tool for the tasks normally done by distutils / setuptools: a more restricted, and therefore introspectable, configuration-driven approach seems much cleaner.

Tres Seaver
At 12:58 AM 3/20/2008 -0400, Tres Seaver wrote:
A lot of setuptools warts are driven by related design problems in the distutils, such as the choice to use imperative / procedural code for everything: a declarative approach, with hooks for cases which actually need them (likely 5% of existing packages), would have made writing tools on top of the framework much simpler. It is ironic that Python is *too powerful* a tool for the tasks normally done by distutils / setuptools: a more restricted, and therefore introspectable, configuration-driven approach seems much cleaner.
+1
At 12:58 AM 3/20/2008 -0400, Tres Seaver wrote:
A lot of setuptools warts are driven by related design problems in the distutils, such as the choice to use imperative / procedural code for everything
If a distutils replacement is ever written, I'd like to see it structured as a dependency graph, like a makefile, with each node in the graph knowing how to transform its inputs into its outputs. That would make it a lot easier to extend to accommodate new things like Pyrex. You'd just have to write a new node class that knows how to turn .pyx files into .c files, and the existing machinery would take it from there. -- Greg
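Greg's makefile-style design can be sketched briefly. The class and function names here are hypothetical, and the "transforms" are stubs standing in for real Pyrex and compiler invocations; the point is that adding a new file type means adding a node class, not patching the build machinery:

```python
class Node(object):
    """One build step: knows its inputs and how to produce its output."""
    def __init__(self, output, inputs=()):
        self.output, self.inputs = output, list(inputs)

    def run(self, artifacts):
        raise NotImplementedError

def build(target, nodes):
    """Resolve the dependency graph depth-first, running each node once."""
    by_output = dict((n.output, n) for n in nodes)
    artifacts = {}
    def visit(name):
        if name in artifacts:
            return
        node = by_output.get(name)
        if node is None:          # no rule produces it: it's a source file
            artifacts[name] = name
            return
        for dep in node.inputs:   # build prerequisites first
            visit(dep)
        artifacts[name] = node.run(artifacts)
    visit(target)
    return artifacts

# Hypothetical nodes for the .pyx -> .c -> extension chain Greg mentions.
class PyrexToC(Node):
    def run(self, artifacts):
        return "compiled-c-from:" + artifacts[self.inputs[0]]

class CToExt(Node):
    def run(self, artifacts):
        return "ext-from:" + artifacts[self.inputs[0]]
```

Supporting Pyrex then amounts to registering a `PyrexToC` node; the existing graph walker handles ordering, exactly as in a makefile.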
-On [20080320 05:58], Tres Seaver (tseaver@palladion.com) wrote:
I think that, warts and all, setuptools is a *huge* improvement over bare distutils for nearly every use case I know about.
Agreed.
I see setuptools (along with PyPI - hopefully much better in near future
though) as the Python equivalent to CPAN and RubyGems.
--
Jeroen Ruigrok van der Werven
Guido van Rossum wrote:
On Wed, Mar 19, 2008 at 12:54 PM, Phillip J. Eby
wrote: [a long message] I'm back at Google and *really* busy for another week or so, so I'll have to postpone the rest of this discussion for a while. If other people want to chime in please do so; if this is just a dialog between Phillip and me I might incorrectly assume that nobody besides Phillip really cares.
Since there seems to be a fair number of negative responses to setuptools, I just wanted to add a bit of positive counterbalance. I'm just a random python user that happens to track python-dev a bit, so take all this with the realization that I probably shouldn't have much input into anything. ;)

I've been using python for somewhere around 10 years to write various random small scripts, gui applications and recently web applications. For me setuptools is the best thing to happen to python since I've been using it. I develop and deploy on a seemingly constantly changing mix of various flavors of windows and linux. Unlike for others, I love that once I get setuptools installed I can just use the same commands to get the things I need. I guess the contrast for me is that python is the common base that I tend to work from, not the underlying OS.

So I don't know if I'm part of a large number of quiet users or just happen to be an odd case that works really well with setuptools. I was disappointed when setuptools didn't make it into 2.5, and I really hope it or something very much like it can make it into a release in the near future. Because while setuptools certainly isn't perfect, for me at least, it is much, much better than nothing at all.

Brian Haskin
Janzert wrote:
Since there seems to be a fair number of negative responses to setuptools, I just wanted to add a bit of positive counterbalance. I'm just a random python user that happens to track python-dev a bit, so take all this with the realization that I probably shouldn't have much input into anything. ;)
I've been using python for somewhere around 10 years to write various random small scripts, gui applications and recently web applications. For me setuptools is the best thing to happen to python since I've been using it. I develop and deploy on a seemingly constantly changing mix of various flavors of windows and linux. Unlike for others, I love that once I get setuptools installed I can just use the same commands to get the things I need. I guess the contrast for me is that python is the common base that I tend to work from, not the underlying OS.
So I don't know if I'm part of a large number of quiet users or just happen to be an odd case that works really well with setuptools. I was disappointed when setuptools didn't make it into 2.5 and I really hope it or something very much like it can make it into a release in the near future. Because while setuptools certainly isn't perfect, for me at least, it is much, much better than nothing at all.
My interpretation of this is that setuptools suffers from the same malaise all flexible apps do (but especially CLI apps, it seems): frequent users love the power and high volume of options, infrequent users despise it. If you're installing apps all day, you probably use it a lot more often than library devs like me who use it once every other month (if we're forced to).

Robert Brewer
fumanchu@aminus.org
participants (18)
-
"Martin v. Löwis"
-
Bob Kline
-
glyph@divmod.com
-
Greg Ewing
-
Guido van Rossum
-
Janzert
-
Jeff Rush
-
Jeroen Ruigrok van der Werven
-
Kevin Turner
-
Paul Moore
-
Peter Wang
-
Phillip J. Eby
-
Robert Brewer
-
Steve Holden
-
Terry Reedy
-
Tim Golden
-
Tres Seaver
-
zooko