OK, I get this error when running the buildout for manuel:
Error: Picked: distribute = 0.6.10
Why is that an error, exactly? What does it mean?
Lennart Regebro: Python, Zope, Plone, Grok
+33 661 58 14 64
2010/1/25 Łukasz Rekucki <lrekucki(a)gmail.com>:
> No, cause what the snippet does is: *not B.__lt__(A)* which is
> mathematicly correct and doesn't require A to implement anything.
Yes, sorry, the problem arises when you do a > b:
1. a > b first tries a.__gt__(b)
2. a.__gt__(b) in turn does b < a
3. Python's first attempt at b < a is calling b.__lt__(a)
4. but b.__lt__(a) returns NotImplemented
5. Python then tries the reflected operation, which is a.__gt__(b)
6. Goto 2
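The loop above can be reproduced with a minimal pair of classes (the names A and B are illustrative, matching the quoted snippet):

```python
class A(object):
    def __gt__(self, other):
        # ">" implemented in terms of "<", as in step 2 above
        return other < self

class B(object):
    def __lt__(self, other):
        # Declines the comparison, so Python falls back to the
        # reflected a.__gt__(b) -- and the loop starts over
        return NotImplemented

a, b = A(), B()
try:
    a > b
except RecursionError:  # RuntimeError on Python 2
    print("infinite comparison loop")
```

The recursion happens entirely in Python-level frames, so it ends in a recursion-depth error rather than hanging forever.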
At 01:00 PM 1/23/2010 +0100, Tarek Ziadé wrote:
>3 - dir_util, archive_util and file_util are going to be removed in
>favor of calls to shutil.
By removed, do you simply mean that distutils will stop using them,
but the modules will still be there? (i.e., deprecated and phased
out, rather than simply dropped from existence altogether.)
I've started to refactor a few things in Distutils for 2.7, as I
planned to do before the first 2.7 beta.
Here are the major ones FYI:
1 - the "sysconfig" module is going to the stdlib, and changes a bit.
Almost all APIs are still present, and the ones that were dropped from
the new module are left in distutils.sysconfig and marked as
deprecated (because they can be changed by calls to other APIs).
IOW, all you have to do at this point for APIs that were moved is
to do "from sysconfig import xx"
instead of "from distutils.sysconfig import xx"
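As a hedged sketch of that import change, with a fallback for interpreters that predate the stdlib module:

```python
# Use the stdlib sysconfig module when it exists (new in 2.7),
# otherwise fall back to the old distutils location.
try:
    from sysconfig import get_python_version
except ImportError:
    from distutils.sysconfig import get_python_version

print(get_python_version())  # e.g. "2.7"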
2 - My next step is to document this new stdlib module on
docs.python.org and to add 2.7 support in Distribute so people can
use 2.7 if they want/need to.
3 - dir_util, archive_util and file_util are going to be removed in
favor of calls to shutil.
shutil is going to host two new functions: make_archive and
unpack_archive (the latter was introduced by setuptools)
4 - a standalone version of distutils, backward compatible with 2.5
and 2.6, that can be installed by itself, through pypi, is coming.
I have no plan to support 2.4 unless someone strongly wants it.
1 is done already in trunk and is going to be merged into py3 in a few
days. 2, 3 and 4 should be done within the next week.
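A quick sketch of the two shutil helpers mentioned in item 3, using throwaway temp paths (the file names are illustrative):

```python
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "src")
os.makedirs(src)
with open(os.path.join(src, "hello.txt"), "w") as f:
    f.write("hello")

# Pack the src directory into a .tar.gz ...
archive = shutil.make_archive(os.path.join(workdir, "srcball"), "gztar",
                              root_dir=src)

# ... and unpack it somewhere else.
dest = os.path.join(workdir, "dest")
os.makedirs(dest)
shutil.unpack_archive(archive, dest)
print(sorted(os.listdir(dest)))  # ['hello.txt']
```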
Tarek Ziadé | http://ziade.org
I'm using distutils to package a tool, and I was thinking it would
be very useful to have the self.install_lib variable returned in some
way to the user in the setup.py script, after the call to setup().
In that way the user could automate the creation of the .pth file in
order to add the installed module folder to the python search path.
This is useful when the user wants to create a stand-alone package,
working out of the box when unzipping it, without installation in the
system site-packages/ folder.
The folder name used for the installation is written out on the screen
at line 615 of the HEAD of Install.py:

    log.debug(("modules installed to '%s', which is not in "
               "Python's module search path (sys.path) -- "
               "you'll have to change the search path yourself"),
              self.install_lib)
So in some ways we could grab it from there and return it to the user.
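The .pth automation described above could look roughly like this (a hypothetical sketch; the helper name and arguments are illustrative, not part of distutils):

```python
import os

def write_pth(site_dir, name, target_dir):
    """Write a .pth file so that target_dir ends up on sys.path.

    A .pth file dropped into a site directory lists one path per
    line; Python appends each listed path to sys.path at startup.
    """
    pth_path = os.path.join(site_dir, name + ".pth")
    with open(pth_path, "w") as f:
        f.write(target_dir + "\n")
    return pth_path
```

With the install_lib value returned from setup(), the script could call this helper to make the freshly installed package importable without touching site-packages.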
Many thanks for your kind attention,
I would like to report an issue with Python's distutils. I ran into
trouble with my numpy installation, which seems to be caused by
distutils. What I am trying to do is link my numpy shared objects
against Intel MKL (version 10.1).
I have the following in my site.cfg:
library_dirs = /share/apps/intel/mkl/lib/em64t:/share/apps/intel/lib/intel64
mkl_libs = mkl_gf_lp64, mkl_intel_thread, mkl_core, iomp5
This follows the linking instructions in the MKL 10 manual. The catch
is that the two library dirs each contain only part of the library
files to be linked in, that is:
/share/apps/intel/mkl/lib/em64t contains: mkl_gf_lp64,
/share/apps/intel/lib/intel64 contains: iomp5
It seems that this situation is not handled well by distutils. When I
traced the Python build script (by adding printouts to
distutils/system_info.py), I found that the search algorithm in the
distutils system_info class (method check_libs2, which in turn calls
_check_libs, which in turn calls _lib_list) requires that all five
libraries above be found together in one of the subdirs given
above (either /share/apps/intel/mkl/lib/em64t or
/share/apps/intel/lib/intel64). The method cannot handle libraries
that are scattered as in my situation described above.
Note that I could have fooled the build system by providing softlinks
to all these files in a single subdir, but it seems that system_info
could be fixed to handle situations like the above without such
workarounds.
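A per-library search of the kind suggested could be sketched like this (hypothetical code, not the actual _lib_list implementation; names and extensions are illustrative):

```python
import os

# Each library may live in a different directory, unlike _lib_list,
# which looks for the whole set in a single directory.
def find_libs(lib_dirs, libs, exts=(".so", ".a")):
    found = {}
    for lib in libs:
        for d in lib_dirs:
            candidates = [os.path.join(d, "lib" + lib + ext) for ext in exts]
            if any(os.path.exists(c) for c in candidates):
                found[lib] = d
                break
    missing = [lib for lib in libs if lib not in found]
    return found, missing
```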
The problem was first detected with numpy version 1.2.1 (which is old,
I know), but a quick inspection of the system_info module included
there shows that not much has changed since then compared to the most
recent distutils (included with my python 2.6.4 installation).
College of William and Mary
Williamsburg, VA 23187
I'm involved with a software project that makes extensive use of third
party modules with C extensions. To ease installation, we pre-build
them for some popular platforms and provide a distribution system for
the app to fetch them.
setuptools uses distutils.util.get_platform() to decide whether an egg
found on the path is compatible, with some special extra magic added in
for Mac OS X. Unfortunately, get_platform() as implemented does not
provide enough information to ensure binary compatibility.
Among the platforms we use, these are the issues:
1) On all platforms, interpreters compiled with UCS2 and UCS4 are binary
incompatible, so modules compiled on one will fail to import on the
other. We work around this by appending -ucsX to the platform string,
based on the value of sys.maxunicode.
2) On OS X, the modification to the value returned by
pkg_resources.get_platform() isn't correct for fat versions of Python
2.5, as detailed in setuptools issue 19. To solve that, we're using the
patch I submitted to the issue (with a couple of recent changes).
3) On Solaris (and likely other UNIXes where 32 and 64 bit user spaces
coexist), no distinction is made between a 32 bit and 64 bit
interpreter, and they are not binary compatible. We work around this by
checking sys.maxint, and right now in testing I'm appending -32 or -64
to the platform string (but before -ucsX). I haven't settled on this,
though, since I have a feeling maybe it should be part of the arch
(sun4u_32, i86pc_64, etc.) or something like 'solaris32' and 'solaris64'
instead of 'solaris'.
4) Also on Solaris, the OS is binary compatible with older releases, so
Solaris binary distributors will often build on the oldest reasonable
release. This is not possible with setuptools now, although extending
pkg_resources.compatible_platforms() in pretty much the same manner as
is done for OS X solves this (and I can provide a patch).
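The augmentations described in items 1 and 3 could be sketched as follows. This is hypothetical: sysconfig.get_platform() stands in for distutils.util.get_platform(), sys.maxsize stands in for the Python-2-only sys.maxint, and the tag spelling ("-32"/"-64", "-ucs2"/"-ucs4") is exactly the open question in this message:

```python
import sys
import sysconfig

def extended_platform():
    plat = sysconfig.get_platform()
    # 32- vs 64-bit interpreter (item 3)
    plat += "-64" if sys.maxsize > 2 ** 32 else "-32"
    # Narrow (UCS2) builds report sys.maxunicode == 0xFFFF (item 1)
    plat += "-ucs2" if sys.maxunicode == 0xFFFF else "-ucs4"
    return plat

print(extended_platform())  # e.g. "linux-x86_64-64-ucs4"
```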
It's not even crucial to me that these be fixed, but before I continue
to hack up the platform string, I wanted to ask the SIG to address these
issues and hopefully decide on a standard. That way, I can at least
implement patches in my app that will be compatible with whatever (if
anything) is decided.
I just released my first new piece of software using Distribute, and
I'm still confused about whether I have the packaging correct. I
think not, because a user has reported receiving a SandboxViolation
error when trying to install the package with easy_install (bug
report). The relevant portion of the error message seems to be:
Searching for distribute
Best match: distribute 0.6.10
Running distribute-0.6.10/setup.py -q bdist_egg --dist-dir /tmp/
Before install bootstrap.
Scanning installed packages
Setuptools installation detected at /System/Library/Frameworks/
Removing elements out of the way...
Extras/lib/python/setuptools-0.6c7-py2.5.egg-info into /System/Library/
error: SandboxViolation: mkdir('/System/Library/Frameworks/
That looks like the script is trying to install Distribute itself,
based I suppose on having it in the install_requires list in my
setup.py. I added distribute to install_requires after an apparent
misreading of the instructions on http://pypi.python.org/pypi/
So, my questions:
1. While researching the problem this morning, I came across the
instructions at http://packages.python.org/distribute/setuptools.html#using-setuptools-with…
, which seem like a much more sensible way to bootstrap the
installation of Distribute than install_requires. Should I have used
that instead of install_requires?
2. If I don't require Distribute for installation, will my package
work with the older setuptools (assuming I'm not using any Distribute-
specific features, which I don't think I am, but I'm not sure I know
what those are)? In a nutshell, do I need to force my users to install
Distribute at all before using my package? And if so, what is the
preferred way of doing that?
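The bootstrap approach mentioned in question 1 looks roughly like this (a sketch assuming distribute_setup.py is shipped alongside setup.py; the package metadata values are illustrative):

```python
# setup.py -- sketch; distribute_setup.py ships with the sdist,
# so Distribute is bootstrapped instead of being listed in
# install_requires.
from distribute_setup import use_setuptools
use_setuptools()  # installs/activates Distribute if it's missing

from setuptools import setup

setup(
    name="example-package",   # illustrative metadata
    version="0.1",
    packages=["example"],
)
```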
I've installed setuptools and I try to use easy_install, but it
doesn't work (I use Windows XP).
From IDLE (the GUI) I obtain this error:
>>> easy_install -f http://pypi.python.org/pypi/functest/0.8.8 functest
SyntaxError: invalid syntax
Can you tell me why, and where my mistake is?
*Intecs - Informatica e Tecnologia del Software
*via E. Giannessi,5
Loc. Ospedaletto I-56121 Pisa
Telephone (direct): +39 050 96 57 557
Switchboard: +39 050 96 57 411
Fax: +39 050 96 57 400
I am fiddling with a buildout.cfg for my project, but I don't
understand something about script generation.
It seems my console script is generated twice, and in that second
round it adds the pylint settings to the script of my program, not
just to the pylint script.
This is the config I have. My setup.py contains the console script
entry point:

    bank2ledger = bank2ledger.main:main

and my buildout.cfg is:

    [buildout]
    develop = .
    parts = bank2ledger pylint
    # pyxdg is not available in PyPI
    find-links = http://www.freedesktop.org/~lanius/pyxdg-0.18.tar.gz

    [bank2ledger]
    recipe = zc.recipe.egg:scripts

    [pylint]
    recipe = zc.recipe.egg
    entry-points = pylint=pylint.lint:Run
    arguments = sys.argv[1:]+[
Generated script '/home/olaf/Project/Bank2Ledger/bin/bank2ledger'.
Generated script '/home/olaf/Project/Bank2Ledger/bin/bank2ledger'.
When it creates my script the first time it is what I want:
sys.path[0:0] = [
# [snip] a lot more dependencies
if __name__ == '__main__':
However, this script is overwritten by the pylint part with:
sys.path[0:0] = [
# [snip], same list of dependencies as above
if __name__ == '__main__':
But I wanted the bin/pylint script to have those arguments, not the
main entry point of my program.
The bin/pylint script is what I expect it to be.
How come the bank2ledger script is generated twice?
Am I misunderstanding how to use zc.buildout?