Hi all --
at long last, I have fixed two problems that a couple of people noticed:
* I folded in Amos Latteier's NT patches almost verbatim -- just
changed an `os.path.sep == "/"' to `os.name == "posix"' and added
some comments bitching about the inadequacy of the current library
installation model (I think this is Python's fault, but for now
Distutils is slavishly aping the situation in Python 1.5.x)
* I fixed the problem whereby running "setup.py install" without
doing anything else caused a crash (because 'build' hadn't yet
been run). Now, the 'install' command automatically runs 'build'
before doing anything; to make this bearable, I added a 'have_run'
dictionary to the Distribution class to keep track of which commands
have been run. So now not only are command classes singletons,
but their 'run' method can only be invoked once -- both restrictions
enforced by Distribution.
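The bookkeeping described above can be sketched roughly like this. This is a simplified, hypothetical model (the class and method names are modeled on Distutils but the real code differs): the Distribution hands out singleton command objects and uses a 'have_run' dictionary so each command's 'run' fires at most once.

```python
class Command:
    def __init__(self, dist):
        self.distribution = dist

class Build(Command):
    def run(self):
        print("building")

class Install(Command):
    def run(self):
        # 'install' makes sure 'build' has already happened
        self.distribution.run_command("build")
        print("installing")

class Distribution:
    command_classes = {"build": Build, "install": Install}

    def __init__(self):
        self._commands = {}   # singleton command objects, one per name
        self.have_run = {}    # which commands have already been run

    def get_command_obj(self, name):
        if name not in self._commands:
            self._commands[name] = self.command_classes[name](self)
        return self._commands[name]

    def run_command(self, name):
        if self.have_run.get(name):
            return            # run() is invoked at most once per command
        self.get_command_obj(name).run()
        self.have_run[name] = True
```

With this in place, `Distribution().run_command("install")` runs 'build' first, and running 'install' a second time is a no-op.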
The code is checked into CVS, or you can download a snapshot at
Hope someone (Amos?) can try the new version under NT. Any takers for
BTW, all parties involved in the Great "Where Do We Install Stuff?"
Debate should take a good, hard look at the 'set_final_options()' method
of the Install class in distutils/install.py; this is where all the
policy decisions about where to install files are made. Currently it
apes the Python 1.5 situation as closely as I could figure it out.
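As a rough illustration of the kind of policy such a method encodes (this is a hypothetical approximation, not the actual Distutils logic): the install directories on POSIX are derived from the interpreter's prefix and version, much as Python 1.5 laid them out.

```python
import os
import sys

def default_install_dirs(prefix=sys.prefix):
    # Approximation of the POSIX layout; the real set_final_options()
    # handles many more cases (prefix overrides, non-POSIX platforms, ...)
    version = "%d.%d" % sys.version_info[:2]
    lib = os.path.join(prefix, "lib", "python" + version)
    return {
        "install_lib": os.path.join(lib, "site-packages"),  # pure modules
        "install_scripts": os.path.join(prefix, "bin"),     # scripts
        "install_data": prefix,                             # data files
    }
```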
Obviously, this is subject to change -- I just don't know to *what* it will change.
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages are involved.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application with links from the
unversioned package name to the versioned one. Then I just set the
pythonpath to this directory.
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under windows anyway (does windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
qad16:qad $ ls -l lib/python/
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
I am trying to install distribute-0.6.25 on a Windows 7 machine. I have
the 32-bit build of Python 2.7.3, although the machine is 64-bit. I installed
the 32-bit Python version because I want to install IPython and there
are no 64-bit builds of the Windows installer for IPython.
In any case, upon installing distribute with
python setup.py install
I get the following error
No such file or directory
Any help will be greatly appreciated. Thanks,
Manuel López Mariscal
Depto. de Oceanografía Física/CICESE
hi, please cc me
i want to release some versions to pypi, but hide them from installing
by explicitly marking them as pre-releases. i want to preserve version
numbering as 0.1, 0.2 according to semver.org point 4. is that
possible?
I saw that people from this list are responsible for the Wheel-related PEPs.
I’m comparatively new to Python packaging and need some help understanding the recommended way of dealing with Python packages.
I’m trying to create a development infrastructure that would allow simple and unified ways of sharing, *deploying* and *reusing* code within a private entity. I can see that pip with virtual environments and requirements.txt is very similar to the dependency management provided by Maven or Apache Ivy. But there seems to be a disconnect: the egg format is importable and executable, yet is considered a deprecated format that is not fully supported by pip and virtualenv, while wheel doesn’t have those basic questions answered yet.
So, I would like to ask a few questions about it:
1. I’m coming from the Java world, so it is a bit hard to understand why the new packaging format is not importable. What is the reason behind that? Wouldn’t it be way easier to deal with dependencies and imports?
2. Why is unzipping and installing the wheel the way to go? There is no need to unzip a Java jar to import it into your code, nor to run your code with it. Why can’t a wheel be used in the same way?
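On the second question, it is worth noting that a *pure-Python* wheel actually can be imported directly, because a wheel is a zip archive and Python's zipimport machinery accepts any zip on sys.path. The sketch below demonstrates this with a tiny stand-in wheel (the file and module names are made up). The approach breaks down for wheels containing C extensions, which is part of why installation by unpacking is the supported path.

```python
import sys
import zipfile

# Build a minimal stand-in for a pure-Python wheel: just a zip
# containing one module (real wheels also carry *.dist-info metadata).
with zipfile.ZipFile("demo-0.1-py2.py3-none-any.whl", "w") as whl:
    whl.writestr("demo.py", "ANSWER = 42\n")

# zipimport lets the archive itself serve as a sys.path entry
sys.path.insert(0, "demo-0.1-py2.py3-none-any.whl")
import demo
print(demo.ANSWER)  # 42
```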
I would appreciate any insight about the development infrastructure the wheel designers have in mind.
Thanks in advance,
This is in response to Vinay's thread but since I wasn't subscribed to
distutils-sig, I couldn't easily respond directly to it.
Vinay's right, the technology here isn't revolutionary but what's notable
is that we've been using it in production for almost 3 years at Twitter.
It's also been open-sourced for a couple years at
not widely announced (it is, after all, just a small subdirectory in a
fairly large mono-repo, and was only recently published independently to
PyPI as twitter.common.python.)
PEX files are just executable zip files with hashbangs, containing a
carefully constructed __main__.py and a PEX-INFO, which is a json-encoded
dictionary describing how to scrub and bootstrap sys.path and the like.
They work equally well unpacked into a standalone directory.
In practice PEX files are simultaneously our replacement for virtualenv and
also our way of distributing Python applications to production. Now we
could use virtualenv to do this but it's hard to argue with a deployment
process that is literally "cp". Furthermore, most of our machines don't
have compiler toolchains or external network access, so hermetically
sealing all dependencies once at build time (possibly for multiple
platforms since all developers use Macs) has huge appeal. This is even
more important at Twitter where it's common to run a dozen different Python
applications on the same box at the same time, some using 2.6, some 2.7,
some PyPy, but all with varying versions of underlying dependencies.
Speaking to recent distutils-sig threads, we used to go way out of our way
to never hit disk (going so far as building our own .egg packager and pure
python recursive zipimport implementation so that we could import from eggs
within zips, write ephemeral .so's to dlopen and unlink) but we've since
moved away from that position for simplicity's sake. For practical reasons
we've always needed "not zip-safe" PEX files where all code is written to
disk prior to execution (ex: legacy Django applications that have
__file__-relative business logic) so we decided to just throw away the
magical zipimport stuff and embrace using disk as a pure cache. This seems
more compatible philosophically with the direction wheels are going.
Since there's been more movement in the PEP space recently, we've been
evolving PEX in order to be as standards-compliant as possible, which is
why I've been more visible recently re: talks, .whl support and the like.
I'd also love to chat about more about PEX and how it relates to things
like PEP 441 and/or other attempts like pyzzer.
On 30 January 2014 21:57, Vinay Sajip <vinay_sajip(a)yahoo.co.uk> wrote:
>> My one technical issue is with going beyond zipimport
>> behaviour to the point of extracting DLLs to the filesystem.
>> I remain -1 on that feature, and I believe I have explained why I think
>> there are issues (and why I think that any solution should be part of
>> zipimport and not added on in library or user code). But I'm happy to go
>> through the details again, if you like - or just to accept that I don't
> Yes please, let's get into some details. Of course I understand that you
> might not want to use the feature, but I don't understand the -1 on the feature
> per se - whether it is in distlib or in zipimport is a secondary consideration. I
> agree that zipimport is the logical place for it, but ISTM the reason why it can't
> go in there just yet is also the reason why one might have some reservations
> about the feature: binary compatibility. I accept that this is not yet a fully resolved
> issue in general (cf. the parallel discussion about numpy), but if we can
> isolate these issues, we can perhaps tackle them. But for me, that's the main
> reason why this part of the distlib API is experimental.
I actually think this is a useful thing to experiment with, I'm just
not sure distlib is the best place for that experiment. With
appropriately secure tempfile handling and the right sys.path (and
module __path__) manipulation it's not obviously *impossible* to
handle C extensions at arbitrary positions in the module namespace
this way, just difficult. zipimport itself is a bad place to
experiment though, since not only is it currently a complex ball of C
code, but adding such a feature without clear evidence of robust
support in a third party project would be irresponsible.
In the case of distlib, the potential complexity of ensuring that such
a scheme works consistently across multiple platforms and as part of
various complex package layouts is enough to make me nervous about
having it in the same library as the metadata 2.0 reference implementation.
Now, if you were to split that functionality out from distlib into a
separate "wheeltab" project (or a name of your choice), I'd be
substantially less nervous, because endorsing distlib as the metadata
2.0 reference implementation wouldn't carry any implications of
endorsing a feature I consider "potentially interesting but rather
challenging to implement in a robust manner". mount() would become
something I could explore when I had some additional free time (hah!),
rather than something I felt obliged to help get to a more robust
state before releasing metadata 2.0.
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
This is my first message to this mailing list. I saw a discussion about
construction of SOABI, and I thought that it might be tiresome for wheel
generators to keep code that produces it for every possible Python
implementation. After all, it's the Python implementation that knows best
in which version, and why, it broke its binary ABI.
Let's say that PyPy 2.2 has this tag set to pp22, but they might have
released PyPy 2.3 without breaking the ABI and prefer to advertise
PyPy 2.3 as pp22-compatible, thus reducing the number of wheels a Python
package maintainer has to produce. The ABI is separate from releases.
Also, new Python implementations coming into the ecosystem would benefit:
they could have a function in their stdlib that advertises their ABI
level, which would allow these new implementations to work with wheels
instantly.
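Something close to this already exists for CPython: `sysconfig` exposes the interpreter's ABI string directly, which is roughly the "implementation advertises its own ABI" idea proposed here (the exact tag-derivation rules used by wheel tooling are more involved, and the value varies by platform and implementation, possibly being absent entirely).

```python
import sysconfig

# On Linux CPython this looks like 'cpython-311-x86_64-linux-gnu';
# on some platforms/implementations it may be None.
soabi = sysconfig.get_config_var("SOABI")
print(soabi)
```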
Does it make sense, or is it a utopia?