I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed API, but have one question: will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages
are involved. Assuming it does, how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application, with links from the
unversioned package name to the versioned one. Then I just set
PYTHONPATH to this directory.
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under Windows anyway (does Windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
qad16:qad $ ls -l lib/python/
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
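The arrangement described above could be scripted roughly like this (a hypothetical sketch using a scratch directory; the package names are just the ones from the listing, and the real layout would live under lib/python/):

```shell
# Packages are installed under versioned names; a per-application
# directory of unversioned symlinks is built next to them, and
# PYTHONPATH points at that directory.
root=$(mktemp -d)
mkdir -p "$root/lib/python/app"
ln -sfn ../Fnorb_0_8 "$root/lib/python/app/Fnorb"
ln -sfn ../PIL_0_3   "$root/lib/python/app/PIL"
export PYTHONPATH="$root/lib/python/app"
```

Switching an application to a new package version is then just repointing one symlink.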
I am trying to install distribute-0.6.25 on a Windows 7 machine. I have
Python 2.7.3 in 32 bits, although the machine is 64 bits. I installed
the 32-bit Python version because I want to install IPython and there
are no 64-bit builds of the Windows installer for IPython.
In any case, upon installing distribute with
python setup.py install
I get the following error
No such file or directory
Any help will be greatly appreciated. Thanks,
Manuel López Mariscal
Depto. de Oceanografía Física/CICESE
This URL is redirecting to itself, causing bandersnatch to fail; thus
it's never updating its serial, so it's pulling gradually larger
and larger datasets.
Reusing existing connection to pypi.python.org:443.
HTTP request sent, awaiting response...
HTTP/1.1 301 Moved Permanently
Date: Tue, 25 Feb 2014 02:14:48 GMT
Cache-Control: max-age=86400, public
Keep-Alive: timeout=10, max=31
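A self-redirect like this can be detected without following the loop. The sketch below is a hypothetical standalone check (not bandersnatch's own code): it refuses to follow redirects so the 3xx surfaces as an HTTPError, then compares the Location header against the original URL.

```python
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # refuse to follow; the 3xx is raised as an HTTPError

def redirects_to_self(url):
    """Return True if `url` answers with a redirect pointing back at itself."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        opener.open(url)
    except urllib.error.HTTPError as e:
        if e.code in (301, 302, 307, 308):
            return e.headers.get("Location") == url
        raise
    return False
```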
Robert Collins <rbtcollins(a)hp.com>
HP Converged Cloud
A bit of context: we're running buildout 2.0.1, building Zope and Plone
applications (Zope 2.13.10), combined with plone.recipe.zope2instance. This
question only impacts the performance of starting an instance, not the
performance of a running instance. But starting an instance is something
developers do quite often in a day's development.
While debugging some instance startup performance issues, I came across the
following: the buildout-generated scripts prepend all the eggs to sys.path,
ahead of the existing entries.
In our setup this causes quite some delay, because imports of standard
Python modules also try to find the module in every egg directory before
finding it in the default Python location (because the eggs are
prepended). The eggs even go before the local folder, so even for importing
a local module, all eggs are probed first. If you are only working on local
disks, the startup performance difference is negligible (a matter of seconds),
but if the eggs are located on network disks, there is a performance
difference of about 30% in startup time (a matter of minutes), with the path
prepended vs. appended (appended being the faster startup).
Some numbers from strace: with eggs appended to sys.path:
% time seconds usecs/call calls errors syscall
------ ----------- ----------- --------- --------- ----------------
48.96 0.014658 0 142555 134820 open
And with eggs prepended to sys.path
% time seconds usecs/call calls errors syscall
------ ----------- ----------- --------- --------- ----------------
58.82 0.035995 0 199897 192150 open
As you can see, the number of calls, and consequently the number of errors,
is noticeably higher when prepending the eggs to sys.path. On local
disks, as in the numbers above, the difference is noticeable but still
fairly small (time-wise, not percentage-wise); add the bit of extra delay of
a network disk, and the differences become really noticeable.
As far as I've always understood, the default procedure when working with
paths should be to append, unless you have a good reason. The good reason in
this case, as I see it, could be that you want to prepend certain packages
and that way make sure you use your version instead of what's present in
site-packages. Now my expectation would be that that is a fairly limited set
of packages that need to be prepended. If there are many, options like
virtualenv exist to avoid taking site-packages at all (that's what we do).
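The shadowing behaviour that motivates prepending can be shown in a few lines (stand-in directories, not buildout's real egg paths): two directories each provide a module "dupe", and whichever directory comes first on sys.path wins.

```python
import os
import sys
import tempfile

dir_a, dir_b = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(dir_a, "dupe.py"), "w") as f:
    f.write("WHO = 'a'\n")
with open(os.path.join(dir_b, "dupe.py"), "w") as f:
    f.write("WHO = 'b'\n")

sys.path.insert(0, dir_b)
sys.path.insert(0, dir_a)  # prepended last, so dir_a shadows dir_b

import dupe
print(dupe.WHO)  # -> 'a'
```

Appending instead of prepending gives up exactly this guarantee, which is the trade-off discussed here.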
Am I missing a use case for the sys.path prepending, or has this never been
an issue before? Because if it is only the site-packages issue, or something
similar, I'm happy to have a look into splitting the Python path up into what
should be prepended (the whole Python path except site-packages?) and making
append the default (site-packages + eggs + ...), or into looking at how I could
provide a buildout syntax where append would be the default and one could
explicitly prepend some packages via buildout.cfg.
Another option I've investigated is to use the meta_path hook, providing my
own find_module and load_module and keeping a dictionary of module locations. I
don't have timings of the number of syscalls, like the ones above,
for this scenario. I'm reconstructing that setup at the moment to see if
it makes a difference. Timing-wise it's a bit slower (but still
acceptable: the instance starts in 30s instead of 15-20s), but I have no
measurements yet that indicate anything useful.
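The meta_path idea could look something like the sketch below, using the modern importlib API (find_spec) rather than the older find_module/load_module pair; the module map here is a hypothetical stand-in for one built once by scanning the egg directories at startup.

```python
import importlib.util
import os
import sys
import tempfile
from importlib.abc import MetaPathFinder

class CachedLocationFinder(MetaPathFinder):
    """Resolve imports from a precomputed {name: path} map,
    skipping the per-directory sys.path scan entirely."""
    def __init__(self, locations):
        self.locations = locations

    def find_spec(self, fullname, path=None, target=None):
        if fullname in self.locations:
            return importlib.util.spec_from_file_location(
                fullname, self.locations[fullname])
        return None  # unknown module: fall through to normal sys.path lookup

# Demo with a throwaway module standing in for an egg's contents:
moddir = tempfile.mkdtemp()
modpath = os.path.join(moddir, "cached_mod.py")
with open(modpath, "w") as f:
    f.write("X = 7\n")

sys.meta_path.insert(0, CachedLocationFinder({"cached_mod": modpath}))
import cached_mod
print(cached_mod.X)  # -> 7
```

The finder trades the open()/stat() storm for one dictionary lookup per import, at the cost of keeping the map up to date.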
Thanks for your feedback,
Hello packaging community,
I'm investigating ways of setting up Python projects at my workplace.
We're predominantly a Java shop, but we might be dipping our toes in
Python waters (finally!) due to a fortuitous project and my multi-year
insistence, so I'm contemplating how to set up our Python build system
to minimize workflow differences for other developers (well, and myself).
I've actually written up a lengthy description of Maven and why we use
it, but I'll spare you for now. :) To keep the story short, I'm
interested in options for setting up a multi-module Python project. By
'multi-module' I don't mean a single setuptools-based project with
several .py files inside, but a way of triggering a complex build
process with a single command that would build all sub-modules
(essentially sub-projects) and produce a number of end artifacts - just
like Maven. Imagine a repository containing 30 separate Django apps,
packaged independently, 10 utility libraries, 10 Django projects
combining those apps, and 10 RPM-building projects for packaging it all
up for deployment.
As far as I know, just using setuptools isn't adequate for a workflow
like this - setuptools deals with the build process (testing, packaging,
etc) of a single project only. Solutions that come to mind are: a
hierarchy of Makefiles, shell scripts, or maybe Twitter's Pants, which
sort of looks like Maven for Python but would probably need
contributions to do what we want, and looks predisposed to building PEX
files which, while very interesting, I'm not looking to do right now.
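The Makefile-hierarchy approach could be reduced to something as small as this driver (a hypothetical layout: one directory per sub-project, each with its own setup.py; "sdist" is just an example build step):

```python
import os
import subprocess
import sys

def build_all(root, step=("sdist",)):
    """Run the same setup.py step in every sub-project under `root`."""
    for name in sorted(os.listdir(root)):
        sub = os.path.join(root, name)
        if os.path.isfile(os.path.join(sub, "setup.py")):
            subprocess.check_call([sys.executable, "setup.py", *step], cwd=sub)
```

This obviously handles none of Maven's dependency ordering between modules, which is where the real complexity lives.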
None of these solutions are really ideal, especially if I want to
support development on Windows (which I absolutely want).
I've even thought about actually using Maven, but that's just a
Pandora's box of problems waiting to happen.
I'd appreciate insight on this from anyone who's thought about (and
maybe solved) problems like this. I'm also willing to engage and
contribute to improving the situation, especially if there's low hanging
fruit to be picked. How do other companies handle large Python
repositories with a lot of subcomponents?
I am trying to publish some binaries with wheel for the first time,
building PySide.
I would like to know how the OS version is determined and, even more,
how it can be controlled.
Without any argument, when building PySide, I get two different results
from two different Pythons.
For some reason, one Python is supported by Mavericks and one is not.
Also, I am confused by the different output from the two.
I would like to be able to control this and say where the stuff should
run. I also cannot figure out how mandatory these strings are.
Is a version enforced? Is it recommended?
What will PyPI say?
And how do I see which version PyPI uses when I "pip install xxx"?
How do I define that? I did not find it in the docs, but that may be due
to my impaired vision.
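Not an answer, but a way to poke at one of the moving parts: the platform string that ends up in the wheel filename comes from the Python that performs the build, and can be inspected directly (the example values are illustrative):

```python
import sysconfig

# This string becomes the platform tag portion of the wheel filename,
# e.g. 'macosx-10.9-x86_64' on Mavericks or 'linux-x86_64' on Linux.
print(sysconfig.get_platform())
```

If the two Pythons print different strings here, that would explain the two different wheel names.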
Any help/advice would be appreciated.
Cheers -- Chris
Christian Tismer :^) <mailto:email@example.com>
Software Consulting : Have a break! Take a ride on Python's
Karl-Liebknecht-Str. 121 : *Starship* http://starship.python.net/
14482 Potsdam : PGP key -> http://pgp.uni-mainz.de
phone +49 173 24 18 776 fax +49 (30) 700143-0023
PGP 0x57F3BF04 9064 F4E1 D754 C2FF 1619 305B C09C 5A3B 57F3 BF04
whom do you want to sponsor today? http://www.stackless.com/
Hi lovely distutils people,
I have a question, as I prepare for my "Python packaging simplified, for
end users, app developers, and open source contributors" talk. I'm sure
I'll have more; I'll end up probably making a few threads about them, since
they'll come to me at random times.
For years, I've been recommending:
$ python setup.py develop
as a standard way to make something hackable and available in a virtualenv.
I notice that "python setup.py develop --user" exists, which is great, as
it means that you don't even need to bother with the virtualenv.
Having said that, I also notice that:
$ pip install -e .
does the same thing.
Should I be recommending one over the other?
I'm going to lean toward "pip install -e ." even though I haven't been
using it much personally, as it makes the talk more consistent -- I would
then be able to say, "Always use pip for doing your installing." But I
thought I'd ask about this. It seems that "pip install -e ." is the same as
"python setup.py develop" except that pip runs setup.py with setuptools
available, which addresses a problem where, if the maintainer of a package's
setup.py doesn't "from setuptools import setup", then "python setup.py
develop" won't work, whereas "pip install -e ." will always work.
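For what it's worth, the mechanical effect of either command can be sketched in a few lines (a stripped-down stand-in, not what pip actually executes): the project's source directory ends up on sys.path (via a .pth or egg-link file), so edits are picked up without reinstalling.

```python
import os
import sys
import tempfile

project_dir = tempfile.mkdtemp()  # stand-in for your checkout
with open(os.path.join(project_dir, "mypkg.py"), "w") as f:
    f.write("VERSION = '0.1'\n")

sys.path.insert(0, project_dir)   # the net effect of an editable install
import mypkg
print(mypkg.VERSION)  # -> 0.1
```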
Unless I'm mistaken. So the question is -- can someone sanity-check the
above?
I'm hoping to pretend to be an outsider for the purpose of empathizing with
the audience, and yet be enough of an insider to ask people on this list if
what I'm saying is consistent with Modern PyPA Doctrine (which generally
I'm happy to promote).
I'm having trouble getting setuptools to properly extract the svn revision and
write it into the <package>.egg-info/PKG-INFO file in a new environment. It
has worked correctly for years in our previous environments, so I'm not sure
what I'm doing wrong at this point.
The environments where this works are virtualenvs based on Python 2.6 (and
previously to that, 2.5) on various versions of Ubuntu. The new environment
is a Python 2.7.6 virtualenv on what will become Ubuntu 14.04 "Trusty Tahr"
when that's finalized.
The system-wide Python used to bootstrap the virtualenv has setuptools 3.3
installed. I've also installed setuptools 3.4.3 in the virtualenv to see if
that helped, but it makes no difference. Our package's setup.cfg file
contains the usual:
tag_build = dev
tag_svn_revision = true
Running `python setup.py develop` (or egg_info directly, or one of the other
commands that call egg_info) results in the <package>.egg-info/PKG-INFO being
written without the svn revision (with all other values being UNKNOWN, which
is fine). In our other environments, the version properly gets the svn
revision number appended.
The upcoming Ubuntu release features Subversion 1.8.8 instead of the 1.6.x in
the last long-term-support release, so I thought that might be the source of
the problem -- but manually running the code from svn_utils.py shows it seems
to work:
>>> from setuptools.svn_utils import *
>>> working_copy = '/path/to/srcdir/subdir'
In case it's relevant, setup.py/setup.cfg and the package I'm dealing with are
in a subdirectory of the working copy, so the .svn/ dir and its files etc are
in the parent directory, not the directory setup.py runs in. `svnversion`
correctly fetches the revision number regardless, of course.
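As a workaround while debugging, one could shell out to `svnversion` directly, since it resolves the revision even from a subdirectory of the working copy. This is a fallback sketch, not setuptools' own code:

```python
import re
import subprocess

def svn_revision(path="."):
    """Return the working-copy revision via `svnversion`, or None if
    svn is not installed or the directory is unversioned."""
    try:
        out = subprocess.check_output(["svnversion", path])
    except (OSError, subprocess.CalledProcessError):
        return None
    m = re.match(rb"(\d+)", out.strip())  # e.g. b'1234' or b'1234:1240M'
    return int(m.group(1)) if m else None
```

The result could then be appended to the version manually until the setuptools behaviour is sorted out.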
I haven't found any docs/FAQs/etc that would seem to be related - all the
subversion-related issues seem to be from the earlier transition to 1.5 or
1.6, which is long past.
Am I doing something wrong here? Can anyone suggest why the revision tagging
is failing to work in my new environment?
Charles Cazabon <charlesc-distutils-python.org(a)pyropus.ca>
Software, consulting, and services available at http://pyropus.ca/