I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages are taken into account.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application with links from the
unversioned package name to the versioned one. Then I just set the
PYTHONPATH to this directory.
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under Windows anyway (does Windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Tim Docker timd@macquarie.com.au
Quantitative Applications Division
qad16:qad $ ls -l lib/python/
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
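For reference, here is a rough sketch of how a link farm like the one above
could be generated automatically (the name-to-version mapping below is purely
illustrative, taken from the listing):

    # Rough sketch: build an application-specific directory of symlinks that
    # map unversioned package names onto particular versioned installs.
    import os

    APP_DIR = "lib/python/1.1"    # per-application link directory

    # unversioned name -> versioned directory (illustrative mapping)
    WANTED = {
        "chart": "chart_1_1",
        "Fnorb": "Fnorb_0_8",
        "OpenGL": "OpenGL_1_5_1",
        "PIL": "PIL_0_3",
        "Pmw": "Pmw_0_7",
    }

    def build_links(app_dir, wanted):
        if not os.path.isdir(app_dir):
            os.makedirs(app_dir)
        for name, versioned in wanted.items():
            link = os.path.join(app_dir, name)
            if os.path.islink(link):
                os.remove(link)
            # relative link, exactly as in the listing above
            os.symlink(os.path.join("..", versioned), link)

    build_links(APP_DIR, WANTED)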
I am attempting to build statically linked distributions.
I am using Docker containers to ensure the
deployment environment matches the build environment, so there is no mismatch between the two.
Is there any way to force static linking so that wheels can be installed
into a virtual env without requiring specific packages on the host?
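To illustrate the kind of thing I'm after, something roughly like this is what
I have in mind (library names and paths are placeholders, and I don't know
whether this is the intended mechanism):

    # Illustrative only: point the extension at static archives directly so
    # the resulting wheel does not need the shared libraries on the host.
    from setuptools import setup, Extension

    ext = Extension(
        "mypkg._native",                  # hypothetical extension module
        sources=["src/native.c"],
        extra_objects=[
            "/opt/deps/lib/libfoo.a",     # static archive instead of -lfoo
            "/opt/deps/lib/libbar.a",
        ],
        extra_link_args=["-static-libgcc", "-static-libstdc++"],
    )

    setup(name="mypkg", version="0.1", ext_modules=[ext])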
In order to claim a package as being abandoned it should undergo a
formal process that includes:
* Placement on a PUBLIC list of packages under review for a grace
period to be determined by this discussion
* Formal attempts via email and social media (twitter, github, et al)
to contact the maintainer.
* Investigation of the claimant for the rights to the package. The
parties attempting to claim a package may not be the best
representatives of the community behind that package, or the Python
community in general.
* Non-reply does not equal consent.
* Access to a commonly (or uncommonly) used package poses security and trust risks to the community.
Could I claim ownership of the redis package, providing a
certain-to-fail email address for the maintainers of PyPI to investigate?
Right now the process leads me to think I would succeed in gaining
access. If successful, I would gain complete access to a package used
by hundreds of projects for persistent storage.
I could claim ownership of the redis package, while Andy McCurdy
(maintainer) was on vacation for two weeks, or sabbatical for six
weeks. Again, I would gain access because under the current system
non-reply equals consent.
In ticket #407 (https://sourceforge.net/p/pypi/support-requests/407/)
someone who does not appear to be vetted managed to gain control of
the (arguably) abandoned but still extremely popular
django-registration on PyPI. They run one of several HUNDRED forks of
django-registration, one that is arguably not the most commonly used.
My concern is that as django-registration is the leading package for
handling user registration for Python's most popular web framework,
handing it over without a full investigation of not just the current
maintainer but also the candidate maintainer is risky.
I was curious what others do for the following packaging tasks, or if
you have any recommendations otherwise. There is also a code
organization question at the end.
1) For starters, it's very easy to make mistakes in one's MANIFEST.in,
so I hacked the sdist command in my setup.py to list the differences
between one's project repo and the generated sdist each time you run
"python setup.py sdist". Are there better solutions for this out
there so I don't have to rely on my own hack?
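For concreteness, my hack amounts to something like the following (simplified,
and it assumes a git checkout):

    # Simplified version of the hack: subclass sdist and report differences
    # between what git tracks and what ended up in the generated manifest.
    import subprocess
    from distutils.command.sdist import sdist as _sdist

    class sdist(_sdist):
        def run(self):
            _sdist.run(self)
            tracked = set(subprocess.check_output(
                ["git", "ls-files"]).decode().splitlines())
            shipped = set(self.filelist.files)
            for path in sorted(tracked - shipped):
                self.warn("in repo but not in sdist: " + path)
            for path in sorted(shipped - tracked):
                self.warn("in sdist but not in repo: " + path)

    # and in setup(): cmdclass={"sdist": sdist}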
2) Secondly, like many, my README files are in markdown, so I hacked
a command in my setup.py to use Pandoc to convert README.md to a .rst
file for use as the long_description argument to setup(). I also
check in the resulting file for troubleshooting purposes, etc. Are
there more elegant solutions for this that people know of?
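The core of my current approach is roughly this (simplified; it assumes pandoc
is on the PATH and falls back to the raw markdown otherwise):

    # Simplified: convert README.md to reST for long_description via pandoc,
    # falling back to the raw markdown if pandoc is not available.
    import subprocess

    def long_description():
        try:
            return subprocess.check_output(
                ["pandoc", "--from=markdown", "--to=rst", "README.md"]).decode()
        except (OSError, subprocess.CalledProcessError):
            with open("README.md") as f:
                return f.read()

    # setup(..., long_description=long_description(), ...)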
Also, for commands like the latter, is it better to define them in
one's setup.py, or simply to have separate scripts in one's repo?
Lastly, as these setup-related tasks grow larger and more complicated,
I found it helped to break them out into a separate setup package that
sits alongside my project's main package library (and even adding
tests in some cases). Is this normal? Have other people run into similar situations?
I was wondering what the recommended approach is to bundling runtime DLL dependencies when using wheels.
We are migrating from eggs to wheels for the installation of our environments and of various Python dependencies.
Some of those have extension modules, and some have extension modules that depend on the presence
of a third-party DLL (in our situation, libzmq-v100-mt-4_0_3.dll).
Up to now, these DLLs have been installed via the scripts parameter in the setup command of setup.py, but
this has been pointed out as not being a good idea.
But the only way to get a dependent DLL found on Windows is to have it on PATH, and the scripts directory on
Windows is on PATH when a virtualenv is activated.
I have observed two situations:
1) If we use pip wheel to build the wheel, the scripts parameter is ignored and the DLLs do not even make it into the archive.
2) If we use setup.py bdist_wheel, the DLL gets into the archive, but this relies on the undocumented behaviour of packaging DLLs as script data.
What is the correct approach at this time?
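To make the question concrete, the workaround we are currently leaning towards
looks roughly like this, although we are not at all sure it is the blessed
approach (the package name is invented, the DLL name is from our case):

    # setup.py (sketch): ship the DLL as package data next to the extension
    # module instead of abusing the scripts parameter.
    from setuptools import setup

    setup(
        name="ourpkg",                    # hypothetical package name
        version="1.0",
        packages=["ourpkg"],
        package_data={"ourpkg": ["libzmq-v100-mt-4_0_3.dll"]},
        # ext_modules=[...] as before
    )

    # ourpkg/__init__.py (sketch): preload the DLL from the package directory
    # so the extension module can find it without it being on PATH.
    import ctypes, os
    ctypes.WinDLL(os.path.join(os.path.dirname(__file__),
                               "libzmq-v100-mt-4_0_3.dll"))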
I'll post this on the various other lists later, but I promised distutils-sig first taste, especially since the discussion has been raging for a few days (if you're following the setuptools repo, you may already know, but let me take the podium for a few minutes anyway :) )
Microsoft has released a compiler package for Python 2.7 to make it easier for people to build and distribute their C extension modules on Windows. The Microsoft Visual C++ Compiler for Python 2.7 (a.k.a. VC9) is available from: http://aka.ms/vcpython27
This package contains all the tools and headers required to build C extension modules for Python 2.7 32-bit and 64-bit (note that some extension modules require 3rd party dependencies such as OpenSSL or libxml2 that are not included). Other versions of Python built with Visual C++ 2008 are also supported, so "Python 2.7" is just advertising - it'll work fine with 2.6 and 3.2.
You can install the package without requiring administrative privileges and, with the latest version of setuptools (from the source repo - there's no release yet), use tools such as pip, wheel, or a setup.py file to produce binaries on Windows.
The license prevents redistribution of the package itself (obviously you can do what you like with the binaries you produce) and IANAL but there should be no restriction on using this package on automated build systems under the usual one-developer rule (http://stackoverflow.com/a/779631/891 - in effect, the compilers are licensed to one user who happens to be using it on a remote machine).
My plan is to keep the download link stable so that automated scripts can reference and install the package. I have no idea how long that will last... :)
Our intent is to heavily focus on people using this package to produce wheels rather than trying to get this onto every user machine. Binary distribution is the way Windows has always worked and we want to encourage that, though we do also want people to be able to unblock themselves with these compilers.
I should also point out that VC9 is no longer supported by Microsoft. This means there won't be any improvements or bug fixes coming, and there's no official support offered. Feel free to contact me directly <steve.dower@microsoft.com> if there are issues with the package.
Right now the “canonical” page for a particular project on PyPI is whatever the
author happened to name their package (e.g. Django). This requires PyPI to have
some "smarts" so that it can redirect things like /simple/django/ to
/simple/Django/; otherwise someone doing ``pip install django`` would fall back
to a much worse behavior.
If this redirect doesn't happen, then pip will issue a request for just
/simple/ and look for a link that, when both sides are normalized, compares
equal to the name it's looking for. It will then follow the link, get
/simple/Django/ and everything works... Except it doesn't. The problem here
comes from the external link classification that we have now. Pip sees the
link to /simple/Django/ as an external link (because it lacks the required
rels) and the installation finally fails.
The /simple/ case rarely happens when installing from PyPI itself because of
the redirect; however, it happens quite often when someone is attempting to
install from a mirror instead. Even when everything works correctly, the penalty
for not knowing exactly what name to type results in at least one extra HTTP
request, one of which (/simple/) requires pulling down a 2.1MB file.
To fix this I'm going to modify PyPI so that it uses the normalized name in
the /simple/ URL and redirects everything else to the non-normalized name. I'm
also going to submit a PR to bandersnatch so that it will use normalized names
for its directories and such as well. These two changes will make it so that
the client side will know ahead of time exactly what form the server expects
any given name to be in. This will allow a change in pip to pre-normalize all
names, which will make the interaction with mirrors better and reduce the number
of HTTP requests that a single ``pip install`` needs to make.
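For clarity, the normalization I'm talking about is essentially the following
(illustrative; the authoritative version is whatever pip and PyPI agree on):

    # Runs of ".", "-" and "_" collapse to a single "-", and the result is
    # lowercased, so "Django" and "django" normalize to the same thing.
    import re

    def normalize(name):
        return re.sub(r"[-_.]+", "-", name).lower()

    normalize("Django")          # 'django'
    normalize("zope.interface")  # 'zope-interface'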
I'd like to discuss the idea of moving PyPI to having immutable files. This
would mean that once you publish a particular file you can never reupload that
file again with different contents. This would still allow deleting the file or
reuploading it if the checksums match what was there prior.
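As a strawman, the check itself could be as simple as the following (md5 used
purely for illustration):

    # A re-upload is only accepted when the new file is byte-for-byte
    # identical to the file already on record for that filename.
    import hashlib

    def upload_allowed(existing_digest, new_content):
        if existing_digest is None:      # no file on record yet
            return True
        return hashlib.md5(new_content).hexdigest() == existing_digest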
This would be good for a few reasons:
* It represents "best practices" for version numbers. Ideally if two people
have version "2.1" of a project, they'll have the same code, however as it
stands two people installing at two different times could have two very
different codebases.
* This will make improving the PyPI infrastructure easier, in particular it
will make it simpler to move away from using a glusterfs storage array and
switch to a redundant set of cloud object stores.
In the past this was brought up and a few points were raised against it, those being:
1. That authors could simply change files that were hosted outside of PyPI anyway,
so it didn't really do much.
2. That it was too hard to test a release prior to uploading it due to the
nature of distutils requiring you to build the release in the same command
as the upload.
With the fact that pip no longer hits external URLs by default, I believe that
the first item is no longer that large of a factor. People can do whatever they
want on external URLs of course; however, if something is coming from PyPI,
end users can now know that it is immutable.
Now that there is twine, which allows uploading already created packages, I
also believe that the second item is no longer a concern. People can easily
create a distribution using ``setup.py sdist``, test it, and then upload that
exact thing they tested using ``twine upload <path to sdist>``.
Can anyone give me some advice, please? I am trying to build
extensions on Windows 64-bit, using the free Windows SDK compilers.
But I can't find any official documentation on how to do this, and
everything I have tried so far has failed in frustrating ways. I'm now
at the point where I appear to be hitting the following bug -
http://bugs.python.org/issue7511 which has stumped me completely.
Sadly, as is typical with distutils issues, this one seems to have
been round for years and there is little or no sign that anyone is
willing to fix it.
Two questions, really:
* Is building extensions with the SDK compilers intended to be
supported?
* How do I do it, if so?
Personally, this is of limited relevance, as I have the full version
of MSVC available. But I'm trying to put together some documentation
for package developers on how to build Windows wheels, in particular
using Appveyor to automate the process, with the intention that people
shouldn't have to jump through hoops to provide wheels, but should
rather be able to simply use a prebuilt recipe to automate the whole thing.
As an alternative, I wonder whether Microsoft would be willing to
support Appveyor by providing them with access to the full version of
MSVC (2008 and 2010) for the build workers? Steve - do you know if
there's any possibility of something like that?