New submission from Chris Nehren:
I'm trying to package Clearsilver for OmniOS (an illumos derivative). I built it incorrectly the first time around (causing a symbol not to be found in the Python neo_cgi.so) and then fixed it. I fixed and reinstalled the package on the target machine, but the .egg containing the broken neo_cgi.so was still around and wasn't regenerated when I ran the code again. This caused Trac to reference the old, broken .so inside the old, broken .egg, and to fail despite a perfectly serviceable neo_cgi.so sitting right where it expects to find it.
I would expect make(1)-like logic for regeneration of .egg files so that this sort of thing does not happen.
Alternatively, if I've misunderstood something, I would be very grateful to be pointed at TFM so I can read it. Thank you!
title: .egg files not regenerated when .so file changes
I'd like to take this moment to note how awesome importing from
zipfile is. Did you know you can execute a zip file with Python, and
if it has a __main__.py in the root it will be executed, with the
zipfile's contents on the PYTHONPATH?
python file.zip # runs!
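For anyone who hasn't tried it, here is a minimal sketch of building such
a runnable zip (the file name and message are just examples):

import zipfile

# Create an archive whose root contains a __main__.py.
with zipfile.ZipFile('app.zip', 'w') as zf:
    zf.writestr('__main__.py', 'print("hello from inside the zip")')

Running "python app.zip" then executes that __main__.py, with the archive
itself placed on sys.path.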
I'm eagerly looking forward to the improvements that distutils2/packaging
will (hopefully) bring, and I very much appreciate the efforts of those involved.
However, I'm a bit confused about the status. On
http://pypi.python.org/pypi/Distutils2 it says packaging is included in
Python 3.3, and on http://docs.python.org/dev/packaging/ there is guidance
for how to write a .cfg file. On the other hand, on
http://python.org/dev/peps/pep-0398/ (the 3.3 release schedule) it says
that the inclusion of packaging is deferred.
Which is it? Maybe it's a good idea to make the different docs consistent.
Almar Klein, PhD
[moved from python-dev]
> Possibly also useful (if only to standardise the approach) would be a
> method to check hashes against installed files.
Added as Distribution.check_installed_files() in the distlib repo. Docstring:
Checks that the hashes and sizes of the files in ``RECORD`` are
matched by the files themselves. Returns a (possibly empty) list of
mismatches. Each entry in the mismatch list will be a tuple consisting
of the path, 'exists', 'size' or 'hash' according to what didn't match
(existence is checked first, then size, then hash), the expected
value and the actual value.
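A minimal usage sketch, assuming distlib's DistributionPath API and a
hypothetical installed project name:

from distlib.database import DistributionPath

dist = DistributionPath().get_distribution('example-project')  # hypothetical
if dist is not None:
    for path, what, expected, actual in dist.check_installed_files():
        # 'what' is 'exists', 'size' or 'hash', per the docstring above
        print(path, what, expected, actual)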
Noticed that there is no >= etc. in the environment markers spec, but
both environment marker implementations allow it. Shall I fix the spec?
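For example, the kind of requirement both implementations currently accept
(the package name is hypothetical):

quux; python_version >= '2.6'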
As of wheel 0.9.5, you can add lines to your setup.cfg to be included
as Requires-Dist: in the output METADATA. They supplement setup.py's
values. The only problem is that they are only resolved during an install
from a wheel, and not during an install directly from the sdist.
requires_dist = beaglevote; python_version == 2.7
quux; sys.platform == 'win32'
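In the generated METADATA those lines would show up roughly as follows
(a sketch of the expected output, not captured from an actual build):

Requires-Dist: beaglevote; python_version == 2.7
Requires-Dist: quux; sys.platform == 'win32'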
Proposed edits to https://bitbucket.org/dholth/python-peps/changeset/9c26fa50
In wheel I use urlsafe_b64encode_nopad(), which omits the trailing =
characters but, although very easy to implement, isn't included in the
stdlib. In this spec I use the stdlib urlsafe_b64encode().
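A sketch of such a helper (not necessarily wheel's exact code):

import base64

def urlsafe_b64encode_nopad(data):
    # urlsafe base64 with the trailing '=' padding characters removed
    return base64.urlsafe_b64encode(data).rstrip(b'=')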
diff -r 23f9640c2020 -r 9c26fa508424 pep-0376.txt
--- a/pep-0376.txt Thu Sep 06 12:09:58 2012 -0400
+++ b/pep-0376.txt Thu Sep 06 12:24:40 2012 -0400
@@ -218,11 +218,16 @@
- an absolute path, using the local platform separator
-- the **MD5** hash of the file, encoded in hex. Notice that `pyc` and `pyo`
- generated files don't have any hash because they are automatically produced
- from `py` files. So checking the hash of the corresponding `py` file is
- enough to decide if the file and its associated `pyc` or `pyo` files have
+- a hash of the file's contents.
+ Notice that `pyc` and `pyo` generated files don't have any hash because
+ they are automatically produced from `py` files. So checking the hash
+ of the corresponding `py` file is enough to decide if the file and
+ its associated `pyc` or `pyo` files have changed.
+ The hash is either the empty string, the **MD5** hash of
+ the file, encoded in hex, or the hash algorithm as named in
+ ``hashlib.algorithms``, followed by the equals character ``=``,
+ followed by the hash digest as encoded with ``urlsafe_b64encode``.
- the file's size in bytes
@@ -391,9 +396,9 @@
And following methods:
-- ``get_installed_files(local=False)`` -> iterator of (path, md5, size)
+- ``get_installed_files(local=False)`` -> iterator of (path, hash, size)
- Iterates over the `RECORD` entries and return a tuple ``(path, md5, size)``
+ Iterates over the `RECORD` entries and return a tuple ``(path, hash, size)``
for each line. If ``local`` is ``True``, the path is transformed into a
local absolute path. Otherwise the raw value from `RECORD` is returned.
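To illustrate the proposed hash field, a sketch of producing a RECORD value
in the algorithm=digest form (the file path is hypothetical):

import base64, hashlib

with open('example/module.py', 'rb') as f:  # hypothetical path
    digest = hashlib.sha256(f.read()).digest()

# e.g. 'sha256=...': algorithm name, '=', then the urlsafe_b64encode'd digest
hash_value = 'sha256=' + base64.urlsafe_b64encode(digest).decode('ascii')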
Bento's documentation and source code should be required reading for
anyone doing Python packaging work. It is actually well designed,
which comes as something of a shock if you are accustomed to hacking
on nested-subcommands-that-initialize-and-call-each-other. Bento cleanly
separates the build and install phases; it includes an intermediate JSON
metadata file, "ipkg.info", holding the PEP metadata, install paths, and
file lists; and it is easy to hack on. It needs a better lexer.
https://gist.github.com/3715068 - the intermediate (not human
editable) package info.
>On Wed Sep 12 19:47:39 CEST 2012, Donald Stufft wrote:
>On Wednesday, September 12, 2012 at 1:43 PM, Erik Bray wrote:
>> That said, this doesn't match my workflow at all. After releasing
>> "1.0" the next version is going to be "1.1", and any development
>> pre-release will be "1.1.devX". "1.1a" might not ever even exist. I
>> think others brought up this critique at the time PEP 386 was being
>> discussed, but then nothing was ever done about it >_>
> Yeah, this concerned me, because 1.1.devX < 1.1a1 < 1.1b1 < 1.1c1 < 1.1
> is how I've seen it used in the wild. It looks like most everyone I've seen
> using it so far has been doing it wrong. I don't think I've seen a single
> person do it right.
Hi, just yesterday I got bitten by this issue. FYI:
# verlib "pep386" (from https://bitbucket.org/tarek/distutilsversion)
>>> from verlib import NormalizedVersion as V
>>> V("0.2a1") < V("0.2.dev0") < V("0.2")
Also there is a bug in that version of verlib, because it contradicts the expected ordering:
>>> V("0.2rc1") < V("0.2")
# setuptools ¿?
>>> from pkg_resources import parse_version as V
>>> V("0.2.dev0") < V("0.2a1") < V("0.2")
>>> distutils.version.LooseVersion("0.2.dev0") < distutils.version.LooseVersion("0.2")
PS: I wasn't subscribed to the list, sorry if this mail breaks the "thread".
Some packages we maintain currently provide largely identical
side-by-side implementations of features: one implementation is written
in C, the other implementation is written in Python. The C module is
just an optimized version of the Python code.
There is logic at module scope within modules in the same package which
attempts to import the C version. If the C version cannot be imported
(an ImportError is raised), the logic falls back to importing the Python
version:

try:
    from . import cmodule as module
except ImportError:
    from . import pymodule as module
This means that the package can be used whether or not the C implementation
has been compiled. It will run more slowly without it, but that's OK in lots
of cases.
In our setup.py for these kinds of distributions, we have code which
under Py2 uses a setuptools "feature" and under Py3 registers a
different kind of build_ext which, at install time, will:
- Attempt to compile the C extension if suitable build tools seem to exist
  on the target system.
- If suitable build tools don't seem to exist on the target system, print a
  message and continue.
How can we support such a feature in the brave new declarative packaging world?
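For reference, a minimal sketch of the kind of build_ext override described
above (not the actual setup.py in question; the class name and message are
illustrative):

from distutils.command.build_ext import build_ext
from distutils.errors import (CCompilerError, DistutilsExecError,
                              DistutilsPlatformError)

class optional_build_ext(build_ext):
    """Build the C extension if possible, but carry on if we can't."""

    def run(self):
        try:
            build_ext.run(self)
        except DistutilsPlatformError:
            self._skip()

    def build_extension(self, ext):
        try:
            build_ext.build_extension(self, ext)
        except (CCompilerError, DistutilsExecError, DistutilsPlatformError):
            self._skip()

    def _skip(self):
        print('Could not build the C extension; using the pure Python version.')

setup() would then be given cmdclass={'build_ext': optional_build_ext}
alongside the usual ext_modules.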