Some packages we maintain currently provide largely identical
side-by-side implementations of features: one implementation is written
in C, the other implementation is written in Python. The C module is
just an optimized version of the Python code.
There is logic at module scope within modules in the same package which
attempts to import the C version. If the C version cannot be imported
(an ImportError is raised), the logic falls back to importing the Python
version:

    try:
        from . import cmodule as module
    except ImportError:
        from . import pymodule as module

This means that the package can be used whether or not the C
implementation is compiled. It will run more slowly, but that's OK in
lots of cases.
In our setup.py for these kinds of distributions, we have code which
under Py2 uses a setuptools "feature" and under Py3 registers a
different kind of build_ext which, at install time, will:
- Attempt to compile the C code if suitable build tools seem to exist on
  the target system.
- If suitable build tools don't seem to exist on the target system,
  print a message and continue.
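The build-time half of this pattern can be sketched as follows. This is a
sketch assuming setuptools, not the actual setup.py from these
distributions; the class name and warning text are illustrative:

```python
from setuptools.command.build_ext import build_ext


class optional_build_ext(build_ext):
    """Try to build C extensions; on failure, warn and continue."""

    def run(self):
        try:
            build_ext.run(self)
        except Exception as exc:  # e.g. DistutilsPlatformError
            self._unavailable(exc)

    def build_extension(self, ext):
        try:
            build_ext.build_extension(self, ext)
        except Exception as exc:  # e.g. CCompilerError, DistutilsExecError
            self._unavailable(exc)

    def _unavailable(self, exc):
        # Don't abort the install; the pure-Python fallback will be used.
        print('*' * 60)
        print('WARNING: the C extension could not be compiled; the')
        print('pure-Python implementation will be used instead.')
        print(exc)
        print('*' * 60)
```

setup() would then be called with cmdclass={'build_ext': optional_build_ext}
so a missing compiler downgrades the build instead of failing it.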
How can we support such a feature in the brave new declarative packaging
world?
I propose an extension to PEP 376 to support the development of projects
in place ("develop mode"):
- Installation tools should expect to need to cope with files
present in purelib named "foo.dist-link". Each of these files
will be of the form "/path/containing/foo.dist-info\nreldir"
For example, the contents of such a file might be:

    /home/user/projects/zope.interface
    src
Presence of such files would indicate a link to a directory
which contains a .dist-info directory for the project as well
as the subdirectory in the project which actually contains
packages and modules.
- Installation tools should expect to need to cope with a "develop.pth"
file in purelib. Its presence will indicate that there are
projects in development mode in this environment.
Its absence indicates that there are no projects in development
mode in this environment.
- Installer tools should place dist-link files into purelib as the
result of a notional "develop" command e.g.
pysetup develop .
If "pysetup develop ." were run in zope.interface,
zope.interface.dist-link would be installed into purelib.
- Installer tools should mutate "develop.pth" as the result of
"develop", e.g. "pysetup develop ." would create this
file if it didn't already exist.
After a "pysetup develop" of zope.interface it might look like:

    import pkgutil.develop; pkgutil.develop.save_path_order()
    import pkgutil.develop; pkgutil.develop.reorder_sys_path()
The "import" lines would ensure that the paths mentioned in the
.pth file would precede stdlib sys.path entries but would come after
paths mentioned in the PYTHONPATH envvar. Some sort of hack like
this anyway; maybe Python could offer better support for
this without hackiness of relying on executing code via
a .pth file at startup time.
- "pysetup undevelop ." of zope.interface would undo the changes
made via "pysetup develop"; it would also remove develop.pth
if no other developed projects existed anymore after
zope.interface was undeveloped.
- It will be permissible to have both the "develop" version and
the "install" version of a library installed in the same purelib.
Installer tools should be able to cope with this and display
both versions when they are asked to list installed distributions.
That's the basics.
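As a sketch of the tool side, an installation tool might read one of the
proposed dist-link files like this; read_dist_link and the example paths
are my assumptions based on the format described above:

```python
import os


def read_dist_link(path):
    # Parse a hypothetical foo.dist-link file of the form described
    # above: first line is the absolute path of the directory containing
    # foo.dist-info, second line is the relative directory that actually
    # contains the packages and modules.
    with open(path) as f:
        container, reldir = f.read().splitlines()[:2]
    return container, os.path.join(container, reldir)
```

An installer listing installed distributions would then look for the
.dist-info directory under the first returned path and import code from
the second.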
I was digging through PEP 386 & PEP 345 today, and I noticed something odd
about the wording of PEP 345. It says:

    When a version is provided, it always includes all versions that
    starts with the same value. For example the "2.5" version of Python
    will include versions like "2.5.2" or "2.5.3". Pre and post releases
    in that case are excluded. So in our example, versions like "2.5a1"
    are not included when "2.5" is used. If the first version of the
    range is required, it has to be explicitly given. In our example, it
    will be "2.5.0".

It also states:

    In that case, "2.5.0" will have to be explicitly used to avoid any
    confusion between the "2.5" notation that represents the full range.
    It is a recommended practice to use schemes of the same length for a
    series to completely avoid this problem.
This effectively translates to an inability to pin to an exact version. Even in the case of specifying
== it checks that the version "starts with" the value you selected. So if you pin to "2.5", and the
author then releases "2.5.1", that will count as ==2.5. If you try to then pin to "2.5.0", and the
author releases "2.5.0.1", then that will count as ==2.5.0.
Essentially this translates to:
==2.5 -> >=2.5<2.6
==2.5.0 -> >=2.5.0<2.5.1
==2.5.0.0 -> >=2.5.0.0<2.5.0.1
Which means that version specifiers are _always_ ranges and are never exact versions. The PEP
as written relies on authors to decide beforehand how many digits they are going to use in their
versions, and for them to never increase or decrease that number.
I also checked to see if Distutils2/packaging implemented VersionPredicates that way or if they
allowed specifying an exact version. It turned out that it implements the PEP as written:
>>> from distutils2 import version
>>> predicate = version.VersionPredicate("foo (==2.5)")
>>> print predicate
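The prefix-matching behaviour the PEP describes can be sketched in a few
lines of plain Python; matches_eq is a hypothetical helper for
illustration, not the distutils2 API, and it simplifies pre/post-release
detection to "any letter in the version string":

```python
import re


def matches_eq(spec, candidate):
    # "==2.5" accepts any final release whose version starts with "2.5"
    # (so "2.5.1" matches), but excludes pre/post releases like "2.5a1".
    if candidate == spec:
        return True
    if re.search(r"[a-zA-Z]", candidate):  # pre/post/dev marker present
        return False
    return candidate.startswith(spec + ".")
```

Under these semantics "==2.5" is really the range ">=2.5, <2.6", which is
exactly the problem described above.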
Why does 1.0a1 sort before 1.0.dev1? It appears to me that common
usage in the wild of dev1 releases is that they are used for the
development version before any sort of alpha, beta, rc, or final has
been released.
The latest upload of z3c.recipe.tag appears to be broken. I'm sending this
message to zope-dev@zope.org because that's the maintainer_email from the
setup.py, and I can't find any indication of where to submit bug reports. I'm
also CC'ing distutils-sig, just 'cause. :)
Doing a buildout of the Mailman 3 trunk crashes today:
Got z3c.recipe.scripts 1.0.1.
Getting distribution for 'zc.recipe.egg>=1.3.0'.
Got zc.recipe.egg 1.3.2.
Getting distribution for 'z3c.recipe.tag'.
error: /tmp/easy_install-ganUUb/z3c.recipe.tag-0.5/CHANGES.txt: No such file or directory
An error occurred when trying to install z3c.recipe.tag 0.5. Look above this message for any errors that were output by easy_install.
Getting section tags.
Initializing section tags.
Installing recipe z3c.recipe.tag.
Getting distribution for 'z3c.recipe.tag'.
Error: Couldn't install: z3c.recipe.tag 0.5
PyPI says a new version of z3c.recipe.tag was uploaded today (2012-09-06).
It looks like a file was missing from the zip.
I've made what I think is exciting progress on the digital signatures
design for wheel (updated built/binary packages for Python; intended
to replace egg). The insight is that we can overload the "extras"
syntax as a convenient way to mention the public key we expect:
    package[extra,ed25519=ouBJlTJJ4SJXoy8Bi1KRlewWLU6JW7HUXTgvU1YRuiA]

This line in a pip-style requires.txt specifies that we want to
install package, the normal optional dependency "extra", and that we
expect it to have a valid signature made with the mentioned ed25519
public key.
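A sketch of how an installer might pull the key back out of the
overloaded extras syntax; extract_signing_key is a hypothetical helper,
not part of pip or wheel:

```python
def extract_signing_key(requirement):
    # Split "package[extra,ed25519=KEY]" into the project name, the real
    # extras, and the urlsafe-base64 ed25519 public key (or None).
    name, _, extras_part = requirement.partition("[")
    extras, key = [], None
    for item in extras_part.rstrip("]").split(","):
        item = item.strip()
        if item.startswith("ed25519="):
            key = item[len("ed25519="):]
        elif item:
            extras.append(item)
    return name, extras, key
```

A tool that understands signatures would verify against the extracted
key; one that doesn't would simply treat ed25519=... as an unknown extra.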
Distribute your application, assemble its requirements in this format,
sign that file, and Bob's your uncle -- your users can install from
that file and know that the requirements they download have the same
publishers as the packages you developed with. This is far more
powerful than a file hash because it is valid for more than one
version of the package.
For backwards compatibility, packages can say they provide the extra
ed25519=ouBJlTJJ4SJXoy8Bi1KRlewWLU6JW7HUXTgvU1YRuiA (adding no
dependencies) and easy_install should parse and ignore it, installing
from an unsigned egg or source distribution because it doesn't know
anything about signatures.
bdist_wheel will be updated to always sign the wheels it generates if
possible, making up a new signing key if one is not stored on the
building machine. That way signed wheels will be commonplace, and the
remaining problem is simply to decide which signing keys you would
like to trust.
wheel now includes a command line tool `wheel sign wheelfile.whl` that
adds a digital signature to a wheel file.
I have only recently joined this list --- hopefully my first mail to this
list is not out of place.

*First*, is it possible to place comments in the distutils.cfg file? If yes,
what is the syntax?
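On the distutils.cfg comment question: distutils reads its config files
with ConfigParser, which ignores lines beginning with "#" or ";", so
comments are possible. For example (the compiler value here is just an
illustration):

```ini
# distutils.cfg -- lines starting with "#" or ";" are comments
[build]
compiler = mingw32
```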
*Second*, I have been unable to install yappi (included in the PyPI list) on my
Windows 7 (64-bit) platform with Python 2.7 (32-bit) and would greatly
appreciate help on this installation.
Note, I have installed many other PyPI packages on this same system with no
problems where I have cygwin, minGW (32-bit) and minGW (64-bit) gcc and g++
compilers installed and working properly. I have installed yappi on my Windows
Vista (32-bit) platform with Python 2.7 and there were no problems. I have also
studied the information posted by Greg Ward at
Here are some of the results of my attempts to install yappi on the Windows 7
platform.
I looked at what compilers are available for building from my setup.py
c:\Software\Python\yappi\yappi-0.62>python setup.py build --help-compiler
List of available compilers:
--compiler=bcpp Borland C++ Compiler
--compiler=cygwin Cygwin port of GNU C Compiler for Win32
--compiler=emx EMX port of GNU C Compiler for OS/2
--compiler=mingw32 Mingw32 port of GNU C Compiler for Win32
--compiler=msvc Microsoft Visual C++
--compiler=unix standard UNIX-style compiler
I then tried the following:
c:\Software\Python\yappi\yappi-0.62>python setup.py build --compiler=cygwin
building '_yappi' extension
gcc -mcygwin -mdll -O -Wall -IC:\Python27\include -IC:\Python27\PC -c
_yappi.c -o build\temp.win32-2.7\Release\_yappi.o
error: command 'gcc' failed: No such file or directory
I then tried to be more specific,
c:\Software\Python\yappi\yappi-0.62>python setup.py build
building '_yappi' extension
error: don't know how to compile C/C++ code on platform 'nt' with
I also tried many other possible (at least IMHO) solutions --- these also
failed. For example,
c:\Software\Python\yappi\yappi-0.62>python setup.py build --compiler=msvc
building '_yappi' extension
error: Unable to find vcvarsall.bat
I then modified the c:\python27\Lib\distutils\distutils.cfg file and set
different compilers in it --- these also failed.
The bottom line --- all my trials (attempts) failed for one reason or another.
Any suggestions as to how to proceed? I will be glad to provide any additional
information on my trials.
Does site.py provide a reliable means to reinitialize sys.path of a
running process? I see that Python 2.7 added site.getsitepackages(),
which gives the paths needed to reinitialize explicitly, but it appears
that virtualenv's site.py does not include this change.
To give some context, I'm interested in code reloading inside
virtualenvs on production systems where I would like to reduce
downtime to 0. Packages installed as non-zip eggs are added to a .pth
file, and attempts to reload(module) or even __import__ in a child
process do not get the new code since sys.path is still pointing at
the old paths given at the time the .pth file was read.
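One approach worth considering for this (a sketch only; refresh_site_dir
is a hypothetical helper, and virtualenv's bundled site.py may behave
differently from the stdlib's) is to re-run site.addsitedir so that .pth
files written after interpreter startup get re-read:

```python
import site
import sys


def refresh_site_dir(sitedir):
    # Re-scan sitedir so .pth files written after interpreter startup
    # are honoured; returns the sys.path entries that were added.
    before = set(sys.path)
    site.addsitedir(sitedir)  # re-reads every *.pth file in sitedir
    return [p for p in sys.path if p not in before]
```

This only appends new entries; it does not remove the stale paths already
on sys.path, so modules imported from the old locations would still need
an explicit reload.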