Hi again --
[cc'd to Paul Dubois: you said you weren't following the distutils sig
anymore, but this directly concerns NumPy and I'd like to get your
input!]
here's that sample setup.py for NumPy. See below for discussion (and
questions!).
------------------------------------------------------------------------
#!/usr/bin/env python
# Setup script example for building the Numeric extension to Python.
# This does successfully compile all the .dlls. Nothing happens
# with the .py files currently.
# Move this file to the Numerical directory of the LLNL numpy
# distribution and run as:
# python numpysetup.py --verbose build_ext
#
# created 1999/08 Perry Stoll
__rcsid__ = "$Id: numpysetup.py,v 1.1 1999/09/12 20:42:48 gward Exp $"
from distutils.core import setup
setup (name = "numerical",
       version = "0.01",
       description = "Numerical Extension to Python",
       url = "http://www.python.org/sigs/matrix-sig/",
       ext_modules = [
           ('_numpy', {'sources': ['Src/_numpymodule.c',
                                   'Src/arrayobject.c',
                                   'Src/ufuncobject.c'],
                       'include_dirs': ['./Include'],
                       'def_file': 'Src/numpy.def'}),
           ('multiarray', {'sources': ['Src/multiarraymodule.c'],
                           'include_dirs': ['./Include'],
                           'def_file': 'Src/multiarray.def'}),
           ('umath', {'sources': ['Src/umathmodule.c'],
                      'include_dirs': ['./Include'],
                      'def_file': 'Src/umath.def'}),
           ('fftpack', {'sources': ['Src/fftpackmodule.c', 'Src/fftpack.c'],
                        'include_dirs': ['./Include'],
                        'def_file': 'Src/fftpack.def'}),
           ('lapack_lite', {'sources': ['Src/lapack_litemodule.c',
                                        'Src/dlapack_lite.c',
                                        'Src/zlapack_lite.c',
                                        'Src/blas_lite.c',
                                        'Src/f2c_lite.c'],
                            'include_dirs': ['./Include'],
                            'def_file': 'Src/lapack_lite.def'}),
           ('ranlib', {'sources': ['Src/ranlibmodule.c',
                                   'Src/ranlib.c',
                                   'Src/com.c',
                                   'Src/linpack.c'],
                       'include_dirs': ['./Include'],
                       'def_file': 'Src/ranlib.def'}),
       ])
------------------------------------------------------------------------
First, what d'you think? Too clunky and verbose? Too much information
for each extension? I kind of think so, but I'm not sure how to reduce
it elegantly. Right now, the internal data structures needed to compile
a module are pretty obviously exposed: is this a good thing? Or should
there be some more compact form for setup.py that will be expanded later
into the full glory we see above?
I've already made one small step towards reducing the amount of cruft by
factoring 'include_dirs' out and supplying it directly as a parameter to
'setup()'. (But that needs code not in the CVS archive yet, so I've
left the sample setup.py the same for now.)
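For concreteness, here's roughly what that buys (a hypothetical sketch --
it relies on that not-yet-committed code, and shows only two of the six
modules):

    from distutils.core import setup

    # Hypothetical compact form: 'include_dirs' given once to setup()
    # and applied to every extension -- NOT working code yet.
    setup (name = "numerical",
           version = "0.01",
           include_dirs = ['./Include'],
           ext_modules = [
               ('umath', {'sources': ['Src/umathmodule.c'],
                          'def_file': 'Src/umath.def'}),
               ('fftpack', {'sources': ['Src/fftpackmodule.c',
                                        'Src/fftpack.c'],
                            'def_file': 'Src/fftpack.def'}),
           ])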
The next thing I'd like to do is get that damn "def_file" out of there.
To support it in MSVCCompiler, there's already an ugly hack that
unnecessarily affects both the UnixCCompiler and CCompiler classes, and
I want to get rid of that. (I refer to passing the 'build_info'
dictionary into the compiler classes, if you're familiar with the code
-- that dictionary is part of the Distutils extension-building system,
and should not propagate into the more general compiler classes.)
But I don't want to give these weird "def file" things the same standing
as source files, object files, libraries, etc., because they seem
to me to be a bizarre artifact of one particular compiler, rather than
something present in a wide range of C/C++ compilers.
Based on the NumPy model, it seems like there's a not-too-kludgy way to
handle this problem. Namely:
    if building extension "foo":
        if file "foo.def" found in same directory as "foo.c":
            add "/def:foo.def" to the MSVC command line
this will of course require some platform-specific code in the build_ext
command class, but I figured that was coming eventually, so why put it
off? ;-)
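In build_ext, that check might look something like this (a rough sketch;
the function name and where it hooks in are illustrative, not actual
Distutils code):

    import os

    def find_def_file(ext_name, sources):
        # if "<ext_name>.def" sits beside the extension's first source
        # file, return the extra MSVC linker argument; otherwise nothing
        src_dir = os.path.dirname(sources[0])
        def_file = os.path.join(src_dir, ext_name + ".def")
        if os.path.isfile(def_file):
            return ["/def:" + def_file]
        return []

    # e.g. find_def_file("_numpy", ["Src/_numpymodule.c"]) would yield
    # ["/def:Src/_numpy.def"] once the file is renamed as proposed below.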
To make this hack work with NumPy, one change would be necessary: rename
Src/numpy.def to Src/_numpy.def to match Src/_numpy.c, which implements
the _numpy module. Would this be too much to ask of NumPy? (Paul?)
What about other module distributions that support MSVC++ and thus ship
with "def" files? Could they be made to accomodate this scheme?
Thanks for your feedback --
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
Hi all --
at long last, I found the time to hack in the ability to compile
extension modules to the Distutils. Mainly, this meant adding a
'build_ext' command which uses a CCompiler instance for all its dirty
work. I also had to add a few methods to CCompiler (and, of course,
UnixCCompiler) to make this work.
And I added a new module, 'spawn', which takes care of running
sub-programs more efficiently and robustly than os.system (no shell
involved). That's needed, obviously, so we can run the compiler!
If you're in the mood for grubbing over raw source code, then get the
latest from CVS or download a current snapshot. See
http://www.python.org/sigs/distutils-sig/implementation.html
for a link to the code snapshot.
I'm still waiting for more subclasses of CCompiler to appear. At the
very least, we're going to need MSVCCompiler to build extensions on
Windows. Any takers? Also, someone who knows the Mac, and how to run
compilers programmatically there, will have to figure out how to write a
Mac-specific concrete CCompiler subclass.
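To give a feel for what's wanted, here's a bare skeleton (the method
names are my assumption about the interface -- check the CCompiler
source for the real list):

    from distutils.ccompiler import CCompiler

    class MSVCCompiler(CCompiler):
        # purely illustrative skeleton -- consult ccompiler.py for
        # the actual abstract interface before filling this in

        def compile(self, sources, macros=None, include_dirs=None):
            # would invoke "cl.exe /c" on each source, producing .obj files
            raise NotImplementedError

        def link_shared_object(self, objects, output_filename,
                               libraries=None, library_dirs=None):
            # would invoke "link.exe /DLL" to produce the extension DLL
            raise NotImplementedError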
The spawn module also needs a bit of work to be portable. I suspect
that _win32_spawn() (the intended analog to my _posix_spawn()) will be
easy to implement, if it even needs to go in a separate function at all.
It looks, from the Python Library documentation for 1.5.2, as though the
os.spawnv() function is all we need, but it's a bit hard to figure out
just what's needed. Windows wizards, please take a look at the
'spawn()' function and see if you can make it work on Windows.
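For the Windows wizards, here's the shape of what I have in mind (the
POSIX half is roughly what _posix_spawn() does now; the Windows half is
untested guesswork based on the os.spawnv() docs):

    import os

    def _posix_spawn(cmd):
        # cmd is a list: program name first, then its arguments.
        # fork/exec directly -- no shell involved.
        pid = os.fork()
        if pid == 0:                      # in the child
            try:
                os.execvp(cmd[0], cmd)    # replaces the child process
            finally:
                os._exit(127)             # only reached if exec failed
        (pid, status) = os.waitpid(pid, 0)
        if status != 0:
            raise RuntimeError("command %r failed (status %d)"
                               % (cmd[0], status))

    def _win32_spawn(cmd):
        # untested: os.spawnv() with P_WAIT runs the program, waits
        # for it, and returns its exit code. Note spawnv() does no
        # PATH search, so cmd[0] may need to be a full path.
        rc = os.spawnv(os.P_WAIT, cmd[0], cmd)
        if rc != 0:
            raise RuntimeError("command %r failed (rc %d)" % (cmd[0], rc))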
As for actually compiling extensions: well, if you can figure out the
build_ext command, go ahead and give it a whirl. It's a bit cryptic
right now, since there's no documentation and no example setup.py. (I
have a working example at home, but it's not available online.) If you
feel up to it, though, see if you can read the code and figure out
what's going on. I'm just hoping *I'll* be able to figure out what's
going on when I get back from the O'Reilly conference next week... ;-)
Enjoy --
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
Hi all --
at long last, I have fixed two problems that a couple of people noticed
a while ago:
* I folded in Amos Latteier's NT patches almost verbatim -- just
changed an `os.path.sep == "/"' to `os.name == "posix"' and added
some comments bitching about the inadequacy of the current library
installation model (I think this is Python's fault, but for now
Distutils is slavishly aping the situation in Python 1.5.x)
* I fixed the problem whereby running "setup.py install" without
doing anything else caused a crash (because 'build' hadn't yet
been run). Now, the 'install' command automatically runs 'build'
before doing anything; to make this bearable, I added a 'have_run'
dictionary to the Distribution class to keep track of which commands
have been run. So now not only are command classes singletons,
but their 'run' method can only be invoked once -- both restrictions
enforced by Distribution.
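The bookkeeping is roughly this (a simplified sketch, not the code as
checked in):

    class Distribution:
        # simplified sketch of the "run once" bookkeeping

        def __init__(self, command_classes):
            self.command_classes = command_classes  # name -> command class
            self.command_obj = {}                   # name -> singleton
            self.have_run = {}                      # name -> already run?

        def get_command_obj(self, command):
            # command objects are singletons: created on first request
            if command not in self.command_obj:
                klass = self.command_classes[command]
                self.command_obj[command] = klass(self)
            return self.command_obj[command]

        def run_command(self, command):
            # each command's 'run' method is invoked at most once
            if self.have_run.get(command):
                return
            self.get_command_obj(command).run()
            self.have_run[command] = 1

So 'install' just calls run_command('build') at the top of its own
'run()', and nothing happens if build was already done.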
The code is checked into CVS, or you can download a snapshot at
http://www.python.org/sigs/distutils-sig/distutils-19990607.tar.gz
Hope someone (Amos?) can try the new version under NT. Any takers for
Mac OS?
BTW, all parties involved in the Great "Where Do We Install Stuff?"
Debate should take a good, hard look at the 'set_final_options()' method
of the Install class in distutils/install.py; this is where all the
policy decisions about where to install files are made. Currently it
apes the Python 1.5 situation as closely as I could figure it out.
Obviously, this is subject to change -- I just don't know to *what* it
will change!
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
Hi all,
I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages
are considered.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application, with links from the
unversioned package name to the versioned one. Then I just set the
PYTHONPATH to this directory.
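To make the arrangement concrete, here's roughly how one of those
per-application directories gets built (illustrative paths only):

    import os

    libdir = "lib/python"
    app_dir = os.path.join(libdir, "1.1")    # one directory per application
    wanted = {"Fnorb": "Fnorb_0_8",          # unversioned -> versioned
              "PIL": "PIL_0_3"}

    os.makedirs(app_dir)
    for name, versioned in wanted.items():
        os.symlink(os.path.join("..", versioned),
                   os.path.join(app_dir, name))
    # then point PYTHONPATH at lib/python/1.1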
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under Windows anyway (does Windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Cheers,
Tim
--------------------------------------------------------------
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
Macquarie Bank
--------------------------------------------------------------
qad16:qad $ ls -l lib/python/
total 110
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
total 30
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
Hello All!
As most people are aware, there has been an effort under way to rewrite PyPI in
order to solve a lot of long-standing problems. For those who aren't aware, it
is currently available at https://pypi.org/ and it uses the same database that
"Legacy" PyPI does, so the two are essentially just different views over the
same data.
For a while now, Python, setuptools, and twine have all defaulted to using
this new code base for uploading artifacts to PyPI. Now that we've gotten some
testing of that code base, the infrastructure team and I feel comfortable
directing everyone to use the new endpoint, and we're planning on shutting
down uploads to Legacy PyPI.
If you're using the latest versions of Python, setuptools, or twine and you
have not put an explicit URL in your ~/.pypirc file, then there's nothing you
should need to do. If you are not, then you should ideally upgrade to the latest
version of whatever tool you're using to upload (the preferred tool is twine)
and edit your ~/.pypirc so that it no longer contains any explicit mention of
a URL. It should then look something like:
[distutils]
index-servers =
    pypi

[pypi]
username:yourusername
password:yourpassword
If for some reason you're not able to update to the newest version of your
upload tool, then you can configure it to upload to the new code base by
switching the URL to use https://upload.pypi.org/legacy/ instead of
https://pypi.python.org/pypi. Thus your ~/.pypirc would then become:
[distutils]
index-servers =
    pypi

[pypi]
repository:https://upload.pypi.org/legacy/
username:yourusername
password:yourpassword
For those of you who are using TestPyPI, that will also be affected, and the
required URL for the new upload endpoint for TestPyPI is
https://test.pypi.org/legacy/.
We plan to disable uploads to legacy PyPI on July 3rd, 2017 so any configuration
change will need to be made before that date. In addition, we plan to have a
"brownout" on June 29th where we will shut the legacy endpoint down for that
day.
For TestPyPI the change to disable uploads to legacy will be made in the next
couple of days, likely this weekend.
As part of the error message that users will get when attempting to upload to
legacy PyPI, we will include a link to a page that details how to ensure that
they are using the new code base and not the legacy code base.
Thanks everyone!
—
Donald Stufft
I know it was deprecated long ago in favor of readthedocs, but I kept
postponing the migration and my doc is still hosted on
https://pythonhosted.org/psutil/.
I just took a look at:
https://pypi.python.org/pypi?:action=pkg_edit&name=psutil
...and the form to upload the doc in zip format is gone.
The only available option is the "destroy documentation" button.
I would like to at least upload a static HTML page redirecting to the new
doc, which I will soon move to readthedocs. Is that possible?
Thanks in advance.
--
Giampaolo - http://grodola.blogspot.com
Hi,
Is it possible to get a documentation link into the “Project Links” section on the page describing a release on PyPI.org? That is, can I add some metadata to an sdist or wheel that would add an entry to that list, and if so how? I’m using setuptools to create the distribution artefacts.
The reason I ask is that I currently use an index.html on pythonhosted.org that redirects to the real documentation, because that adds a link to the documentation when looking at the release on pypi.python.org. That doesn’t work on pypi.org.
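From what I can tell, something like the following may be what produces
those entries (a sketch only, assuming a setuptools version recent
enough to support the 'project_urls' keyword):

    from setuptools import setup

    setup(
        name="mypackage",
        version="1.0",
        # 'project_urls' becomes Project-URL metadata entries, which
        # pypi.org appears to render under "Project links" -- assuming
        # the installed setuptools accepts this keyword at all.
        project_urls={
            "Documentation": "https://mypackage.readthedocs.io/",
        },
    )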
Ronald
Some time ago, I started the process [1] of adjusting how
distutils-sig uses the PEP process so that the reference
specifications will live on packaging.python.org, and we use the PEP
process to manage *changes* to those specifications, rather than having
the PEPs serve as the specifications themselves (that is, adopting a process
closer to the URL-centric way the Python language reference is
managed, rather than using the RFC-style PEP-number-centric model the
way we do now).
I never actually finished that work, and as a result, it's currently
thoroughly unclear [2] that Description-Content-Type and
Provides-Extra are defined at
https://packaging.python.org/specifications/#core-metadata rather than
in a PEP.
I'm currently at the CPython core development sprint in San Francisco,
and I'm thinking that finalising that migration [3] and updating the
affected PEPs accordingly (most notably, PEP 345) is likely to be a
good use of my time.
However, I'm also wondering if it may still be worthwhile writing a
metadata 1.3 PEP that does the following things:
1. Explicitly notes the addition of the two new fields
2. Describes the process change for packaging interoperability specifications
3. Defines a canonical transformation between the human-readable
key:value format and a more automation-friendly JSON format (see the
sketch after this list)
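For item 3, the transformation could be as simple as this (illustrative
only -- the PEP would pin down the actual mapping, especially for the
multiple-use fields):

    import json
    from email.parser import Parser

    # metadata fields that may legitimately appear more than once
    MULTIPLE_USE = {"Classifier", "Provides-Extra", "Project-URL",
                    "Requires-Dist", "Platform"}

    def metadata_to_json(pkg_info_text):
        msg = Parser().parsestr(pkg_info_text)
        data = {}
        for key, value in msg.items():
            if key in MULTIPLE_USE:
                data.setdefault(key, []).append(value)
            else:
                data[key] = value
        return json.dumps(data, indent=2, sort_keys=True)

    print(metadata_to_json(
        "Metadata-Version: 2.1\n"
        "Name: example\n"
        "Provides-Extra: test\n"
        "Classifier: Programming Language :: Python\n"))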
That PEP would then essentially be the first one to use the new
process: it would supersede PEP 345 as the latest metadata
specification, but it would *also* redirect readers to the relevant
URL on packaging.python.org as the canonical source of the
specification, rather than being the reference documentation in its
own right.
Cheers,
Nick.
[1] https://github.com/pypa/pypa.io/issues/11#issuecomment-173833061
[2] https://github.com/python/peps/issues/388
P.S. Daniel, if you're currently thinking "I proposed defining an
incremental metadata 1.3 tweak years ago!", aye, you did. My
subsequent additions to PEP 426 were a classic case of second-system
syndrome: https://en.wikipedia.org/wiki/Second-system_effect (which we
suspected long ago, hence that PEP's original deferral)
Fortunately, the disciplining effect of working with a primarily
volunteer contributor base has prevented my over-engineered
change-for-change's-sake-rather-than-to-solve-actual-user-problems
version from becoming reality ;)
--
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
Hello there,
I'm going to ask some questions about Reproducible Builds; a previous
thread was started in March [1], but it does not cover some of the
questions I have.
In particular I'm interested in the reproducible build of an _sdist_.
That is to say the process of going from a given commit to the
corresponding TGZ file. It is my understanding that setting
SOURCE_DATE_EPOCH (SDE for short) should allow reproducible building
of an sdist. By reproducible I mean that the tgz itself is the same
byte for byte (the unpacked content being the same is a weaker form
I'm less interested in).
Is this assumption correct?
In particular, I cannot seem to do that without unpacking and
repacking the tgz myself, because the copy_tree/tar-ing and the
gzip-ing by default embed the current timestamp of when these
functions were run. Am I missing something?
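For reference, this is the kind of repacking I end up doing by hand (a
sketch only; "mypkg-1.0" stands in for the real unpacked sdist
directory):

    import gzip
    import io
    import os
    import tarfile

    sde = int(os.environ.get("SOURCE_DATE_EPOCH", "0"))

    def normalize(info):
        # clamp every timestamp (and owner) so the tar is deterministic
        info.mtime = sde
        info.uid = info.gid = 0
        info.uname = info.gname = ""
        return info

    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        # note: on older Pythons, directory entries may also need sorting
        tar.add("mypkg-1.0", filter=normalize)

    with open("mypkg-1.0.tar.gz", "wb") as f:
        # GzipFile's mtime argument keeps the gzip header deterministic
        with gzip.GzipFile(fileobj=f, mode="wb", mtime=sde) as gz:
            gz.write(buf.getvalue())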
Second: is there a convention for storing the SDE value? I don't seem
to be able to find one. It is nice to have reproducible builds, but if
it's a pain for reproducers to find the SDE value, that greatly
decreases the value of SDE builds.
Also, congrats on PEP 517, and thanks to everyone who participated;
Thanks
--
Matthias
1: https://mail.python.org/pipermail/distutils-sig/2017-March/030284.html