Hi all --
at long last, I have fixed two problems that a couple of people noticed a while ago:
* I folded in Amos Latteier's NT patches almost verbatim -- just
changed an `os.path.sep == "/"' to `os.name == "posix"' and added
some comments bitching about the inadequacy of the current library
installation model (I think this is Python's fault, but for now
Distutils is slavishly aping the situation in Python 1.5.x)
* I fixed the problem whereby running "setup.py install" without
doing anything else caused a crash (because 'build' hadn't yet
been run). Now, the 'install' command automatically runs 'build'
before doing anything; to make this bearable, I added a 'have_run'
dictionary to the Distribution class to keep track of which commands
have been run. So now not only are command classes singletons,
but their 'run' method can only be invoked once -- both restrictions
enforced by Distribution.
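A minimal sketch of that run-once bookkeeping (class and method names here are hypothetical, not the actual Distutils code):

```python
# Hypothetical sketch of singleton commands plus a 'have_run' map, with
# 'install' pulling in 'build' first, as described above.
class Command:
    def __init__(self, name):
        self.name = name

    def run(self):
        print("running", self.name)


class Distribution:
    def __init__(self):
        self._commands = {}
        self.have_run = {}

    def get_command(self, name):
        # singleton: at most one instance per command name
        return self._commands.setdefault(name, Command(name))

    def run_command(self, name):
        if self.have_run.get(name):
            return                      # enforce run-at-most-once
        if name == "install":
            self.run_command("build")   # install depends on build
        self.get_command(name).run()
        self.have_run[name] = True
```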
The code is checked into CVS, or you can download a snapshot at
Hope someone (Amos?) can try the new version under NT. Any takers for
BTW, all parties involved in the Great "Where Do We Install Stuff?"
Debate should take a good, hard look at the 'set_final_options()' method
of the Install class in distutils/install.py; this is where all the
policy decisions about where to install files are made. Currently it
apes the Python 1.5 situation as closely as I could figure it out.
Obviously, this is subject to change -- I just don't know to *what* it will change.
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages are involved.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application with links from the
unversioned package name to the versioned one. Then I just set the
pythonpath to this directory.
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under windows anyway (does windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
qad16:qad $ ls -l lib/python/
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
As most people are aware, there has been an effort under way to rewrite PyPI in
order to solve a lot of long standing problems. For those who aren't aware, that
is currently available at https://pypi.org/ and it uses the same database that
"Legacy" PyPI does, so the two are essentially just different views over the same data.
For a while now, Python, setuptools, and twine have all defaulted to using
this new code base for uploading artifacts to PyPI. Now that we've gotten some
testing of that code base, the infrastructure team and I feel comfortable
directing everyone to the new endpoint, and we're planning on shutting
down uploads to Legacy PyPI.
If you're using the latest versions of Python, setuptools, or twine and you
have not put an explicit URL in your ~/.pypirc file, then there's nothing you
should need to do. If you are not, then you should ideally upgrade to the latest
version of whatever tool you're using to upload (the preferred tool is twine)
and edit your ~/.pypirc to remove any explicit mention of a URL. Thus
it should look something like:
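A minimal example (the username is a placeholder, and this assumes the default index name):

```ini
[distutils]
index-servers =
    pypi

[pypi]
username: your_pypi_username
```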
If for some reason you're not able to update to the newest version of your
upload tool, then you can configure it to upload to the new code base by
switching the URL to use https://upload.pypi.org/legacy/ instead of
https://pypi.python.org/pypi. Thus your ~/.pypirc would then become:
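Something like this (the username is again a placeholder):

```ini
[distutils]
index-servers =
    pypi

[pypi]
repository: https://upload.pypi.org/legacy/
username: your_pypi_username
```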
For those of you who are using TestPyPI, that will also be affected, and the
required URL for the new upload endpoint for TestPyPI is
We plan to disable uploads to legacy PyPI on July 3rd, 2017 so any configuration
change will need to be made before that date. In addition, we plan to have a
"brownout" on June 29th where we will shut the legacy endpoint down for that day.
For TestPyPI the change to disable uploads to legacy will be made in the next
couple of days, likely this weekend.
As part of the error message that users will get when attempting to upload to
legacy PyPI, we will include a link to a page that details how to ensure that
they are using the new code base and not the legacy code base.
Is it possible to get a documentation link into the “Project Links” section on the page describing a release on PyPI.org? That is, can I add some metadata to an sdist or wheel that would add an entry to that list, and if so how? I’m using setuptools to create the distribution artefacts.
The reason I ask is that I currently use an index.html on pythonhosted.org that redirects to the real documentation, because that adds a link to the documentation when looking at the release on pypi.python.org. That doesn’t work on pypi.org.
Some time ago, I started the process  of adjusting how
distutils-sig uses the PEP process so that the reference
specifications will live on packaging.python.org, and we use the PEP
process to manage *changes* to those specifications, rather than
serving as the specifications themselves (that is, adopting a process
closer to the URL-centric way the Python language reference is
managed, rather than using the RFC-style PEP-number-centric model the
way we do now).
I never actually finished that work, and as a result, it's currently
thoroughly unclear that Description-Content-Type and
Provides-Extra are defined at
https://packaging.python.org/specifications/#core-metadata rather than
in a PEP.
I'm currently at the CPython core development sprint in San Francisco,
and I'm thinking that finalising that migration and updating the
affected PEPs accordingly (most notably, PEP 345) is likely to be a
good use of my time.
However, I'm also wondering if it may still be worthwhile writing a
metadata 1.3 PEP that does the following things:
1. Explicitly notes the addition of the two new fields
2. Describes the process change for packaging interoperability specifications
3. Defines a canonical transformation between the human-readable
key:value format and a more automation-friendly JSON format
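As one illustration of the kind of transformation item 3 is about, here is a rough sketch (my own convention, not any blessed spec): it reads the RFC 822 style key:value headers with the stdlib email parser and emits JSON, turning multi-use fields into lists.

```python
# Sketch of a key:value -> JSON metadata transformation. METADATA/PKG-INFO
# files use RFC 822 style headers, so the stdlib email parser can read them.
import json
from email.parser import HeaderParser

RAW = """\
Metadata-Version: 2.1
Name: example
Version: 0.1
Provides-Extra: docs
Provides-Extra: tests
"""


def metadata_to_json(raw):
    msg = HeaderParser().parsestr(raw)
    result = {}
    for key in set(msg.keys()):
        values = msg.get_all(key)
        # multi-use fields become JSON lists; single-use fields stay scalars
        result[key.lower().replace("-", "_")] = (
            values if len(values) > 1 else values[0]
        )
    return json.dumps(result, sort_keys=True)


print(metadata_to_json(RAW))
```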
That PEP would then essentially be the first one to use the new
process: it would supersede PEP 345 as the latest metadata
specification, but it would *also* redirect readers to the relevant
URL on packaging.python.org as the canonical source of the
specification, rather than being the reference documentation in its own right.
P.S. Daniel, if you're currently thinking "I proposed defining an
incremental metadata 1.3 tweak years ago!", aye, you did. My
subsequent additions to PEP 426 were a classic case of second-system
syndrome: https://en.wikipedia.org/wiki/Second-system_effect (which we
suspected long ago, hence that PEP's original deferral)
Fortunately, the disciplining effect of working with a primarily
volunteer contributor base has prevented my over-engineered
version from becoming reality ;)
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
I'm going to ask some questions about Reproducible Builds; a previous
thread was started in March, but it does not cover some of the
questions I have.
In particular I'm interested in the reproducible build of an _sdist_.
That is to say the process of going from a given commit to the
corresponding TGZ file. It is my understanding that setting
SOURCE_DATE_EPOCH (SDE for short) should allow a reproducible building
of an Sdist;
And by reproducible I mean that the tgz itself is the same byte for
byte (the unpacked content being the same is a weaker form I'm less
interested in). Is this assumption correct?
In particular, I cannot seem to do that without unpacking and
repacking the tgz myself, because the copy_tree/tarring and the
gzipping by default embed the current timestamp of when these functions
were run. Am I missing something?
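For what it's worth, here is a rough sketch (the helper name is mine, and this is not what the sdist command currently does) of producing a byte-for-byte identical .tar.gz by clamping every timestamp, including the gzip header's, to SOURCE_DATE_EPOCH:

```python
# Sketch: deterministic .tar.gz creation under a fixed SOURCE_DATE_EPOCH.
import gzip
import io
import os
import tarfile


def repro_targz(paths, out_path, source_date_epoch):
    """Hypothetical helper: pack paths into out_path reproducibly."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w",
                      format=tarfile.USTAR_FORMAT) as tf:
        for p in sorted(paths):                      # deterministic order
            info = tf.gettarinfo(p, arcname=os.path.basename(p))
            info.mtime = source_date_epoch           # clamp member timestamps
            info.uid = info.gid = 0                  # normalize ownership
            info.uname = info.gname = ""
            with open(p, "rb") as fh:
                tf.addfile(info, fh)
    # filename="" keeps the original-name field out of the gzip header;
    # mtime= pins the header timestamp instead of using time.time()
    with open(out_path, "wb") as out:
        with gzip.GzipFile(filename="", fileobj=out, mode="wb",
                           mtime=source_date_epoch) as gz:
            gz.write(buf.getvalue())
```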
Second, is there a convention for storing the SDE value? I don't seem to
be able to find one. It is nice to have reproducible builds, but if
it's a pain for reproducers to find the SDE value, that greatly decreases
the value of an SDE build.
Also, congrats on PEP 517, and thanks to everyone who participated.
This was a comment by @zooba (Steve Dower):
> (FWIW, I think it makes *much* more sense for setuptools to fix this by
simply forking all of distutils and never looking back. But since we don't
live in that world yet, it went into distutils.)
And here is my response:
> Since you mention it, I agree with that proposal. But currently we have
core developers contributing to distutils and @jaraco contributing to
setuptools. @jaraco is quite competent, but I doubt that he would be able
to maintain an independent fork of distutils by himself.
> In short, I think your proposal is a good one, but how can we allocate
(issue31595 on bugs.python.org)
So what do others think of this? My sense of things is that people are open
to the idea, but there isn't a plan to make it happen.
Discussion on PEP 517 seems to have settled down, and I believe that
Nick said he was about ready to accept it. Is everyone involved
satisfied with the current state? Or is there anything else you think
should be considered before accepting it?