Hi all --
at long last, I have fixed two problems that a couple of people noticed:
* I folded in Amos Latteier's NT patches almost verbatim -- just
changed an `os.path.sep == "/"' to `os.name == "posix"' and added
some comments bitching about the inadequacy of the current library
installation model (I think this is Python's fault, but for now
Distutils is slavishly aping the situation in Python 1.5.x)
* I fixed the problem whereby running "setup.py install" without
doing anything else caused a crash (because 'build' hadn't yet
been run). Now, the 'install' command automatically runs 'build'
before doing anything; to make this bearable, I added a 'have_run'
dictionary to the Distribution class to keep track of which commands
have been run. So now not only are command classes singletons,
but their 'run' method can only be invoked once -- both restrictions
enforced by Distribution.
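For the curious, the idea can be sketched roughly like this (a simplified illustration of the pattern described above, not the actual Distutils code -- class and method names here are hypothetical):

```python
class Distribution:
    """Tracks command objects and ensures each command runs at most once."""

    def __init__(self):
        self.command_obj = {}   # singleton command instances, keyed by name
        self.have_run = {}      # records which commands have already executed

    def get_command_obj(self, name):
        # Create each command object at most once (singleton per distribution).
        if name not in self.command_obj:
            self.command_obj[name] = COMMAND_CLASSES[name](self)
        return self.command_obj[name]

    def run_command(self, name):
        # Skip commands whose 'run' method has already been invoked.
        if self.have_run.get(name):
            return
        self.get_command_obj(name).run()
        self.have_run[name] = True


class Build:
    def __init__(self, dist):
        self.dist = dist

    def run(self):
        print("building")


class Install:
    def __init__(self, dist):
        self.dist = dist

    def run(self):
        # 'install' transparently runs 'build' first.
        self.dist.run_command("build")
        print("installing")


COMMAND_CLASSES = {"build": Build, "install": Install}

dist = Distribution()
dist.run_command("install")   # runs build, then install
dist.run_command("build")     # no-op: build has already run
```

Because 'have_run' lives on the Distribution, every command can safely demand its prerequisites without worrying about double execution.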
The code is checked into CVS, or you can download a snapshot at
Hope someone (Amos?) can try the new version under NT. Any takers for
BTW, all parties involved in the Great "Where Do We Install Stuff?"
Debate should take a good, hard look at the 'set_final_options()' method
of the Install class in distutils/install.py; this is where all the
policy decisions about where to install files are made. Currently it
apes the Python 1.5 situation as closely as I could figure it out.
Obviously, this is subject to change -- I just don't know to *what* it
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages
come into play.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application with links from the
unversioned package name to the versioned one. Then I just set the
pythonpath to this directory.
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under Windows anyway (does Windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
qad16:qad $ ls -l lib/python/
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
Following up on some IRC discussion with other folks:
There is precedent (Plone) for PyPI trove classifiers corresponding to
particular versions of a framework. So I'd like to get feedback on the idea
of expanding that, particularly in the case of Django.
The rationale here is that the ecosystem of Django-related packages is
quite large, but -- as I know all too well from a project I'm working on
literally at this moment -- it can be difficult to ensure that all of one's
dependencies are compatible with the version of Django one happens to be using.
Adding trove classifier support at the level of individual versions of
Django would, I think, greatly simplify this: tools could easily analyze
which packages are compatible with an end user's chosen version, there'd be
far less manual guesswork, etc., and the rate of creation of new
classifiers would be relatively low (we tend to have one X.Y release/year
or thereabouts, and that's the level of granularity needed).
Assuming there's consensus around the idea of doing this, what would be the
correct procedure for getting such classifiers set up and maintained?
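To make the idea concrete, here is a hedged sketch of what the *proposed* per-version classifiers might look like in a package's metadata, and how a tool could match against them. The "Framework :: Django :: X.Y" strings are the proposal under discussion, not classifiers that exist on PyPI today, and the package name is made up:

```python
# Classifiers a hypothetical Django add-on might declare in its setup.py
# under this proposal (the X.Y entries do not exist yet).
CLASSIFIERS = [
    "Framework :: Django",
    "Framework :: Django :: 1.8",   # proposed version-level classifiers
    "Framework :: Django :: 1.9",
]


def supports_django(classifiers, version):
    """Check whether a package declares compatibility with a Django version."""
    return "Framework :: Django :: %s" % version in classifiers


print(supports_django(CLASSIFIERS, "1.9"))    # True
print(supports_django(CLASSIFIERS, "1.10"))   # False
```

The point is that a simple exact-match lookup like this would replace the current manual guesswork about which packages track which Django release.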
Prompted by a few posts I read recently about the current state of the
Python packaging ecosystem, I figured it made sense to put together an
article summarising my own perspective on the current state of things:
It's pretty long, so the short version tailored specifically for the
distutils-sig audience would be:
* restating the point that pip & conda solve different problems, so
while there's some overlap in their core capabilities, neither is a
substitute for the other
* we've actually managed to put some pretty hard problems behind us in
the last few years, so my thanks to everyone who's played a part in that
* some of the thorniest problems that still remain really do require
proper funding of the core ecosystem infrastructure (most notably
PyPI), which means that either commercial redistributors need to step
up and handle the problem on behalf of their customers, or else the
PSF needs to figure out alternative sources of funding (with "let's do
both!" really being my preferred outcome on that front)
P.S. For the Twitter users amongst you, feel free to pass along my
link to the article:
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
I've just released version 0.2.4 of distlib on PyPI. For newcomers,
distlib is a library of packaging functionality which is intended to be
usable as the basis for third-party packaging tools.
The main changes in this release are as follows:
* Updated to not fail during import if SSL is not available.
* Changed project name comparisons to follow PEP 503.
* Changed manifest and resources logic to work correctly under (upcoming) Python 3.6.
* Updated Windows launchers with fixes to bugs related to argument-passing in shebang lines.
A more detailed change log is available at .
Please try it out, and if you find any problems or have any suggestions for
improvements, please give some feedback using the issue tracker! 
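As a side note on the PEP 503 change mentioned above: the normalization rule comes straight from the PEP itself (this is the canonical rule, shown here for illustration rather than distlib's internal code):

```python
import re


def normalize(name):
    # PEP 503: runs of -, _ and . collapse to a single hyphen,
    # and the result is lowercased.
    return re.sub(r"[-_.]+", "-", name).lower()


print(normalize("Distlib"))             # distlib
print(normalize("Flask_SQLAlchemy"))    # flask-sqlalchemy
print(normalize("my.package__Name"))    # my-package-name
```

Two project names refer to the same project exactly when their normalized forms are equal.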
I'm not sure how to interpret PEP 425 when it comes to packages containing C extension modules that use the limited Python API (PEP 384).
The Python v3.4 version of the limited API is used, so should the compatibility tag be...
That would mean that the 'cp34' Python tag would have to be interpreted as a minimum (rather than exact) requirement.
Grateful for any clarification.
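For what it's worth, here is a sketch of the "minimum version" interpretation suggested above -- my own reading of PEP 425 and PEP 384, not an authoritative answer: a wheel tagged cp34-abi3 should be installable on any CPython >= 3.4, which an installer could implement by expanding its list of supported tags to include cpYY-abi3 for every YY from the running version down to 3.2 (where the stable ABI was introduced):

```python
def abi3_tags(major, minor, platform):
    """Supported (python, abi, platform) triples for stable-ABI wheels,
    newest first, on CPython major.minor (sketch, not pip's actual code)."""
    tags = []
    for m in range(minor, 1, -1):  # cp3X, cp3(X-1), ..., down to cp32
        tags.append(("cp%d%d" % (major, m), "abi3", platform))
    return tags


# On CPython 3.6, a wheel tagged cp34-abi3 would be among the matches:
supported = abi3_tags(3, 6, "linux_x86_64")
print(("cp34", "abi3", "linux_x86_64") in supported)   # True
```

Under this reading, 'cp34' in a cp34-abi3 wheel is indeed a minimum rather than an exact requirement.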
The packaging tools generally support 2.6+ and 3.(2|3)+ and that's sort of been
where they've been at for a while now. I would like to think about what it would
take to start considering Python 2.6 as "too old" to support. In pip we
generally follow a usage based deprecation/removal of supported Pythons but we
don't have any real guidelines for when something is at a low enough usage to
consider it no longer supported and we instead just sort of wait until someone
makes a case that it's "low enough".
This issue tends to impact more than just pip, because once pip drops support
for something people tend to start dropping it across the entire ecosystem and
use pip's no longer supporting it as justification for doing so.
I would like to take a look at Python 2.6 and try and figure out if we're at a
point that we can deprecate and drop it, and if not what is such a point.
Looking at pure usage numbers for "modern" versions of pip (6, 7, and 8)
downloading from PyPI, I see that ~3% of downloads come via Python 2.6.
The only thing lower than Python 2.6 that is still supported is Python 3.3.
Python 2.6 itself has been EOL since 2013-10-29 which is now just about 3 years
ago. Its SSL module is not secure by default and requires additional
installed modules to make it so. I believe the only place to get a
Python 2.6 that is "supported" is through the enterprise-y Linux distributions.
Do we think that a ~3% usage of Python 2.6 and being end-of-life'd for ~3 years
is enough to start deprecating and dropping 2.6? If not what sort of threshold
do we think is enough? It'd be nice to get the albatross of Python 2.6 support
from around our necks, but I'm not sure how others feel. Obviously all of
the existing versions of all of the tooling will still be fully functional so
Python 2.6 users will simply need to not upgrade their tooling to continue to
work, *but* it also means that they will be left out of new packaging features
(and likewise, people can't rely on them if they still wish to support 2.6).
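On that last point, a hedged aside: with sufficiently recent tooling (setuptools 24.2+ and pip 9+), a project that drops 2.6 can declare that in its metadata, so older pips on old interpreters never select incompatible releases. A minimal sketch, with a made-up project name:

```python
# setup.py fragment (hypothetical project) declaring that Python 2.6
# and early 3.x are unsupported, via the python_requires metadata field.
from setuptools import setup

setup(
    name="example-project",
    version="1.0",
    python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*",
)
```

That way the "left out of new features" cutoff degrades gracefully instead of breaking installs outright.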
Thanks so much for all of your work on this.
I know it's an old thread, but I've built a tool that generates some
statistics from this information. It queries data for a given list of
projects, or for all of a specific user's projects, caches the data locally
on disk, and generates both an HTML report with a bunch of graphs, as well
as download badges. For my own use, I'm running it cron'ed from my desktop
once a night, and uploading the reports and badges to a public Amazon S3 bucket.
The project is: https://pypi.org/project/pypi-download-stats/
It's a bit rough around the edges, and currently doesn't have any unit
tests - my hope is that this will be an interim solution until Warehouse
has built-in stats, but I'd be happy to polish it up a bit as time allows
if anyone finds it useful.
Side note for Donald: It appears that the dataset currently contains data
for 2016-01-22 to 2016-03-06 and 2016-05-22 to current. Is there any plan
or possibility of backfilling either the 2016-03-07 to 2016-05-21 gap, or
the older data?
I just recently downloaded Python 3.5 and cannot seem to install any packages like Numpy, etc. I have tried all the instructions on the website and keep getting errors:
For example, when I type "python -m pip install Numpy" it returns a Syntax Error. I am completely new to Python so I must be missing something here - I haven't altered any files since installing it the other day. Do I use the Python IDLE Shell? Are there other packages I need to install first? Any help would be greatly appreciated.
Quantitative Risk Analyst, Banking Officer
Capital Analytics & Stress Testing
2000 McKinney Avenue, Suite 700
Dallas, TX 75201