I know it is bad practice for a recipe to return paths that
contain important data from the install() method,
because zc.buildout might remove them.
Nevertheless, it happens from time to time that a developer loses some
content because of a misconfiguration,
or an overzealous recipe. That is his responsibility, and backups are done for that.
But I think we can improve zc.buildout a bit here:
what about introducing a safe mode where the developer gets prompted
every time zc.buildout.rmtree is about to be called?
The option could be set in [buildout] like this:
safe-mode = true
and challenge the user when a tree is about to be deleted. (It might be
overkill for single files,
which are most of the time unimportant configuration files.)
This is a small change, and it would avoid running a backup every time
the .cfg files are changed, because
that happens all day long when you are developing.
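A minimal sketch of what such a guard might look like, assuming a
hypothetical safe_rmtree name and a plain console prompt (this is not
actual zc.buildout code, just an illustration of the idea):

```python
import shutil


def safe_rmtree(path, safe_mode=True):
    """Hypothetical guard around shutil.rmtree: in safe mode,
    challenge the user before a whole tree is deleted."""
    if safe_mode:
        answer = input("buildout: about to remove %s -- proceed? [y/N] " % path)
        if answer.strip().lower() != "y":
            print("buildout: kept %s" % path)
            return False
    shutil.rmtree(path)
    return True
```

zc.buildout itself would read the safe-mode flag from the [buildout]
section and pass it down; the prompt text and return value here are
purely illustrative.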
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/
At 02:10 PM 10/1/2008 +0200, Tarek Ziadé wrote:
>this is also a synthesis of what I heard, and some elements I have
>added to respect the needs that were expressed.
>0/ a lot of work can be done to clean distutils, no matter what is
>decided (another PEP is planned for that): cleaning up, removing old-style code
>1/ let's change the Python metadata, in order to introduce a better
>dependency system, by
> - officially introducing "install requires" and "test requires"
> metadata in there
> - marking "requires" as deprecated
>2/ Let's move part of the setuptools code into distutils, to respect those changes.
>3/ let's create a simple convention: the metadata should be expressed
>in a python module called 'pkginfo.py',
> where each metadata field is a variable
> that can be used by setup.py, and therefore by any tool that works
>with it, even if it does not run
> a setup.py command.
> This is simpler and cleaner: you don't have to run some setup
>magic to read them,
> or at least the magic introduced by commands.
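[To illustrate the convention being proposed above -- this is a
hypothetical file, not an existing format; every name and value is
made up:]

```python
# pkginfo.py -- metadata as plain module-level variables,
# importable without running any setup.py command
# (all names and values here are hypothetical)
name = "example"
version = "0.1"
install_requires = ["somelib>=1.0"]
test_requires = ["nose"]
```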
I'm -1 on all of the above. I think we need a standard for tool
interop (à la WSGI), not implementation tweaks for the existing
tools. I also think that a concrete metadata format proposal is
premature at this time; we've barely begun to gather -- let alone
specify -- our requirements for that metadata. (Essentially, only
version dependencies have been discussed, AFAICT.)
There have been many people agreeing that the distutils are
thoroughly broken and a new approach is needed; these proposals
sound like minor tweaks to the existing infrastructure, rather than a
way to get rid of it. So to me, the above doesn't seem like a
synthesis of the threads that I've been reading.
>4/ let's change PyPI to make it work with the new metadata and to
>enforce a few things
> - a binary distribution cannot be uploaded if a source distribution
>has not been previously provided for the version
Note that this doesn't allow closed-source packages to be uploaded;
thus it would need to be a warning, rather than a requirement.
> - we should be able to download the metadata of a package without
>downloading the package
> - PyPI should display the install and test dependencies in the UI
It could only do this for specific binaries, since dependencies can be dynamic.
(I'll be CC'ing the distutils sig in to these replies as this discussion
probably belongs there...)
Nicolas Chauvat wrote:
>> The slides for my two talks can be found here:
> "Python Package Management Sucks"
> Install debian and get back to productive tasks.
This is an almost troll-like answer.
See page 35 of the presentation.
If you're too lazy, here's a re-hash:
- there are lots of different operating systems
- even with Linux, there are many different package management systems
- all the package management systems behave differently and expect
packages to be set up differently for them
- expecting package developers to shoulder this burden is unfair and
results in various bitrotting repackaged versions of python packages
rather than just one up-to-date version maintained by the entity
originating the python package
- Adobe Photoshop plugins, Firefox add-ons, etc. do not delegate their
management to an OS package manager. Packages are Python's "plugins" and
so should get the same type of consistent, cross-platform package
management targeted at the application in question, which is Python in
this case.
Simplistix - Content Management, Zope & Python Consulting
Distutils has made my life easier as a Python application author and packager
for some time, but when it comes to generating and distributing message
catalogues for translations, things get unexpectedly tricky.
It would be nice to be able to do something like,

  from distutils.core import setup

  setup(
      description="A handy, translated application",
      po_files=[('share/locales', ['po/*.po'])],
      )

and have distutils do the right thing with the .po files at build time (generate
.mo files from them) and at install time (install them into
PREFIX/share/locales/LC_MESSAGES/, or wherever the distribution is configured to
put them).
My big question is, are there plans to add msgfmt functionality into distutils?
Googling tells me there was some discussion about this back in 2003 but I can't
find any sign of progress since then.
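[In the meantime, one workaround is to compile the catalogues yourself
at build time. A rough sketch, assuming the external msgfmt tool is on
PATH and a <lang>.po naming scheme; the function names and the "myapp"
domain are invented:]

```python
import glob
import os
import subprocess


def mo_path(po_file, build_dir, domain="myapp"):
    """Map po/<lang>.po to <build_dir>/<lang>/LC_MESSAGES/<domain>.mo,
    the layout gettext expects at runtime (domain name is hypothetical)."""
    lang = os.path.splitext(os.path.basename(po_file))[0]
    return os.path.join(build_dir, lang, "LC_MESSAGES", domain + ".mo")


def compile_catalogs(po_dir, build_dir, domain="myapp"):
    """Run msgfmt over every .po file and return the .mo files built."""
    built = []
    for po in sorted(glob.glob(os.path.join(po_dir, "*.po"))):
        mo = mo_path(po, build_dir, domain)
        if not os.path.isdir(os.path.dirname(mo)):
            os.makedirs(os.path.dirname(mo))
        subprocess.check_call(["msgfmt", "-o", mo, po])
        built.append(mo)
    return built
```

Something like compile_catalogs() could then be hooked into a custom
build command; it is only a sketch of the path mapping and the msgfmt
invocation, not a tested distutils extension.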
At 02:15 PM 10/1/2008 -0500, chris wrote:
>Is it just me, or does --root work the same way
>--single-version-externally-managed does, just with the addition of
--root automatically triggers --single-version-externally-managed, in
addition to distutils' normal handling of --root.
I am having trouble working with setuptools on Solaris. The Solaris
operating system normally installs modules as packages which contain
binaries. This is unlike Linux distributions where, for
example, an RPM would download the source and build it on the user's
machine when they install the RPM.
So, to create packages on Solaris we normally install a module to
a temporary directory such as /var/tmp/pkgbuild-foo/usr, package up
the files that are built, and when the user installs the package
these files are then installed to their system.
However, we are having problems with figuring out how to properly
create the /usr/lib/python2.4/site-packages/easy-install.pth file
using our build system. What happens is that each package ends up
with its own easy-install.pth file which only contains the
information for that one module. Users can't install two such
packages at the same time because it creates a file conflict: two
packages can't install the same file.
Solaris packages do have pre-install, post-install, pre-uninstall
and post-uninstall scripts, so we could do something like avoid
installing the file as part of the package and instead generate it
on the fly via scripting. However, it doesn't seem that setuptools
provides any mechanism for doing this easily. I'm not very
familiar with setuptools or easy_install, though, so I hope that I'm
missing something.
Aside from writing our own code to manually manage adding and removing
entries to/from this file, is there any way that setuptools allows you
to manage this file when installing binaries to a system, or does
setuptools assume the file only needs to be managed when you build a
package?
I'm hoping to avoid some hacky solution where we try to hack the
easy-install.pth file by hand when users install or uninstall packages.
Since users can install or uninstall any random combination of
packages which may need an easy-install.pth file, I can imagine that
it would get complicated to properly manage it via our own scripts.
Or is there some other solution that might be useful in our situation,
one that avoids the need for multiple packages to share this file? Or
might it be possible for different modules to use differently named
files instead of a commonly named easy-install.pth file?
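[For what it's worth, the add/remove bookkeeping itself is small. A
sketch of what post-install/pre-uninstall scripts could call -- the
function names are invented, and this deliberately ignores the special
import lines setuptools writes at the top and bottom of
easy-install.pth:]

```python
import os


def add_pth_entry(pth_path, entry):
    """Idempotently append one path line to a shared .pth file
    (intended for a hypothetical post-install script)."""
    lines = []
    if os.path.exists(pth_path):
        with open(pth_path) as f:
            lines = f.read().splitlines()
    if entry not in lines:
        lines.append(entry)
        with open(pth_path, "w") as f:
            f.write("\n".join(lines) + "\n")


def remove_pth_entry(pth_path, entry):
    """Counterpart for a hypothetical pre-uninstall script."""
    if not os.path.exists(pth_path):
        return
    with open(pth_path) as f:
        lines = [l for l in f.read().splitlines() if l != entry]
    with open(pth_path, "w") as f:
        f.write("\n".join(lines) + "\n" if lines else "")
```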
I've been using distutils for a while now, but today I'm running
into what seems to be the "minimal strange issue". I want to
install a single Python module without anything else, no package
around it, which resides in a source subdirectory of the main
project directory. My layout looks like the following:
Macintosh:Python dinu$ tree2.py -f mymodule
mymodule/
| setup.py
| src/
| | mymodule.py
Now common sense says all I need to do is define the module in
setup.py like this:
Macintosh:mymodule dinu$ more setup.py
from distutils.core import setup

setup(
    py_modules = ["src/mymodule"],
    )
Then, distutils does the following when running setup.py:
Macintosh:mymodule dinu$ py252 setup.py build
copying src/mymodule.py -> build/lib/src
And of course, I get the following structure built:
Macintosh:mymodule dinu$ tree2.py -f build
build/
| lib/
| | src/
| | | mymodule.py
And sure enough, I don't want the "src" level. I can get rid
of it, but only after moving mymodule.py into the main
project directory and removing the "src/" prefix in py_modules.
Instead, what I would *really* like is some smart combination
of setup parameters that would just do it. But so far, none of my
attempts with combinations of py_modules, packages,
package_dir (dummy/fake package) and even package_data have
done the trick. Now I'm desperately looking for a distutils
"Houdini" on this list.