On Tue, Sep 14, 2010 at 7:36 AM, raymond.hettinger wrote:
> + # Can't use functools.wraps() here because of bootstrap issues
> + wrapper.__module__ = getattr(user_function, '__module__')
> + wrapper.__doc__ = getattr(user_function, '__doc__')
> + wrapper.__name__ = getattr(user_function, '__name__')
> + return wrapper
Perhaps add __wrapped__ as well?
(I assume that, similar to _collections before it was made a builtin,
the bootstrap issue is that _functools is an extension module rather
than a builtin, but reprlib is needed while building the extension
modules.)
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
I'd like to propose that the Python community standardize on a
"deferred" object for asynchronous return values, modeled after the
well-thought-out Twisted Deferred class.
With more and more Python libraries implementing asynchronicity (for
example Futures -- PEP 3148), it's crucial to have a standard deferred
object in place so that code using a single asynchronous reactor can
interoperate with different asynchronous libraries.
I think a lot of people don't realize how much cooler and more elegant
it is to return a deferred object from an asynchronous function rather
than using a generic callback approach (where you pass a function
argument to the asynchronous function, telling it what to call when the
asynchronous operation completes).
While asynchronous systems have been shown to have excellent scalability
properties, the callback-based programming style often used in
asynchronous programming has been criticized for breaking up the
sequential readability of program logic.
This problem is elegantly addressed by using Deferred Generators. Since
Python 2.5 added enhanced generators (i.e. the capability for "yield" to
return a value), the infrastructure is now in place to allow an
asynchronous function to be written in a sequential style, without the
use of explicit callbacks.
See the following blog article for a nice write-up on the capability:
Mekk's Twisted Deferred example (the snippet lives inside a function
decorated with Twisted's @defer.inlineCallbacks):

    @defer.inlineCallbacks
    def someFunction():
        a = 1
        b = yield deferredReturningFunction(a)
        c = yield anotherDeferredReturningFunction(a, b)
        defer.returnValue(c)
What's cool about this is that between the two yield statements, the
Twisted reactor is in control, meaning that other pending asynchronous
tasks can be attended to or the thread's remaining time slice can be
yielded to the kernel, yet this is all accomplished without the use of
multi-threading. Another interesting aspect of this approach is that
since it leverages Python's enhanced generators, an exception thrown
inside either of the deferred-returning functions will be propagated
through to someFunction() where it can be handled with try/except.
Think about what this means -- this sort of emulates the "stackless"
design pattern you would expect in Erlang or Stackless Python without
leaving standard Python. And it's made possible under the hood by
Python Enhanced Generators.
Needless to say, it would be great to see this coolness be part of the
standard Python library, instead of having every Python asynchronous
library implement its own ad-hoc callback system.
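To make the mechanism concrete, here is a toy illustration of the idea (this is NOT Twisted's actual API; the `Deferred` class, `run_deferred_generator` driver, and helper names are all invented for this sketch, and the deferreds fire synchronously to keep it short):

```python
class Deferred:
    # Toy deferred: holds a result and a list of callbacks.
    def __init__(self):
        self._callbacks = []
        self._fired = False
        self._result = None

    def add_callback(self, fn):
        if self._fired:
            fn(self._result)
        else:
            self._callbacks.append(fn)

    def callback(self, result):
        self._fired = True
        self._result = result
        for fn in self._callbacks:
            fn(result)

def run_deferred_generator(gen):
    # Drive a generator that yields Deferreds: resume it with each
    # result, and fire the returned Deferred with the final value.
    done = Deferred()
    def step(value):
        try:
            d = gen.send(value)
        except StopIteration as exc:
            done.callback(exc.value)
            return
        d.add_callback(step)
    step(None)
    return done

def async_incr(x):
    d = Deferred()
    d.callback(x + 1)        # fires synchronously in this toy version
    return d

def some_function():
    a = 1
    b = yield async_incr(a)  # sequential style, no explicit callbacks
    c = yield async_incr(b)
    return (b, c)

results = []
run_deferred_generator(some_function()).add_callback(results.append)
```

In a real reactor the driver would return control to the event loop between yields instead of resuming immediately; the generator code, however, would look the same.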
I have now started an initial patch for PEP 384, in the pep-0384 branch.
This has the following features:
- modules can be compiled under Py_LIMITED_API
- Tools/scripts/abitype.py converts C code containing static
PyTypeObject definitions to use the new API for type definitions.
The following aspects are still missing:
- there is no support for generating python3.dll on Windows yet
- there has been no validation whether the API is actually feasible
to use in extension modules.
I started looking into porting the sqlite extension, and ran into the
following issues:
- certain fields of PyTypeObject are accessed directly
- PyObject_Print is used, but can't be supported, as it uses a FILE*
For the first issue, it would be possible to provide a generic
accessor function that fetches fields from a type object. Alternatively,
each case could be considered individually, suggesting alternative code
for each particular usage.
I'll be off the net for the next two weeks most of the time, so
I might not be able to respond quickly.
Anybody interested in advancing that patch, feel free to commit
changes into the branch.
My name is Prashant Kumar and I wish to contribute to the Python development
process by helping convert certain existing Python libraries
over to python3k.
Is there any way I could obtain a list of libraries which need to be ported
over to python3k, sorted by importance? (By importance I mean
packages which serve as a dependency for a larger number of packages being
ported.)
Victor changed this from "return NULL" to "goto fail" in r84730, claiming
that it would fix a reference leak. Is the leak somewhere else then?
On 12.09.2010 18:40, benjamin.peterson wrote:
> Author: benjamin.peterson
> Date: Sun Sep 12 18:40:53 2010
> New Revision: 84744
> use return NULL; it's just as correct
> Modified: python/branches/py3k/Objects/unicodeobject.c
> --- python/branches/py3k/Objects/unicodeobject.c (original)
> +++ python/branches/py3k/Objects/unicodeobject.c Sun Sep 12 18:40:53 2010
> @@ -769,7 +769,7 @@
> "PyUnicode_FromFormatV() expects an ASCII-encoded format "
> "string, got a non-ASCII byte: 0x%02x",
> (unsigned char)*f);
> - goto fail;
> + return NULL;
> /* step 2: allocate memory for the results of
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.
This is a follow up to PEP 3147. That PEP, already implemented in Python 3.2,
allows for Python source files from different Python versions to live together
in the same directory. It does this by putting a magic tag in the .pyc file
name and placing the .pyc file in a __pycache__ directory.
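The PEP 3147 naming scheme can be inspected from current Pythons (a small sketch; importlib.util.cache_from_source computes the tagged cache path, though in the 3.2 era the equivalent lived in the imp module):

```python
import importlib.util

# Where the interpreter would place the cached bytecode for foo.py:
# a per-version tagged name inside a __pycache__ subdirectory.
cache_path = importlib.util.cache_from_source("foo.py")
```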
Distros such as Debian and Ubuntu will use this to greatly simplify
deploying Python, and Python applications and libraries. Debian and Ubuntu
usually ship more than one version of Python, and currently have to play
complex games with symlinks to make this work. PEP 3147 will go a long way to
eliminating the need for extra directories and symlinks.
One more thing I've found we need, though, is a way to handle shared libraries
for extension modules. Just as we can get name collisions on foo.pyc, we can
get collisions on foo.so. We obviously cannot install foo.so built for Python
3.2 and foo.so built for Python 3.3 in the same location. So symlink
nightmare's mini-me is back.
I have a fairly simple fix for this. I'd actually be surprised if this hasn't
been discussed before, but teh Googles hasn't turned up anything.
The idea is to put the Python version number in the shared library file name,
and extend .so lookup to find these extended file names. So for example, we'd
see foo.3.2.so instead, and Python would know how to dynload both that and the
traditional foo.so file too (for backward compatibility).
(On file naming: the original patch used foo.so.3.2 and that works just as
well, but I thought there might be tools that expect exactly a '.so' suffix,
so I changed it to put the Major.Minor version number to the left of the
extension. The exact naming scheme is of course open to debate.)
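The lookup order described above can be sketched as follows (the helper name is invented, and the foo.Major.Minor.so scheme is the one proposed in this post, not what any shipped Python does):

```python
import sys

def candidate_so_names(modname):
    # Versioned name first (e.g. foo.3.2.so on Python 3.2), then the
    # traditional foo.so as the backward-compatible fallback.
    major, minor = sys.version_info[:2]
    return ["%s.%d.%d.so" % (modname, major, minor), "%s.so" % modname]
```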
This is a much simpler patch than PEP 3147, though I'm not 100% sure it's the
right approach. The way this works is by modifying the configure and
Makefile.pre.in to put the version number in the $SO make variable. Python
parses its (generated) Makefile to find $SO and it uses this deep in the
bowels of distutils to decide what suffix to use when writing shared libraries
built by 'python setup.py build_ext'.
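The plumbing described here can be observed from Python itself (a sketch; 'SO' was the config variable's name at the time, while modern Pythons expose it as EXT_SUFFIX):

```python
import sysconfig

# The suffix distutils appends to extension modules comes from the
# generated Makefile, surfaced through this configuration variable.
ext_suffix = (sysconfig.get_config_var("EXT_SUFFIX")
              or sysconfig.get_config_var("SO"))
```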
This means the patched Python only writes versioned .so files by default. I
personally don't see that as a problem, and it does not affect the test suite,
with the exception of one easily tweaked test. I don't know if third party
tools will care. The fact that traditional foo.so shared libraries will still
satisfy the import should be enough, I think.
The patch is currently Linux only, since I need this for Debian and Ubuntu and
wanted to keep the change narrow.
Other possible approaches:
* Extend the distutils API so that the .so file extension can be passed in,
instead of being essentially hardcoded to what Python's Makefile contains.
* Keep the dynload_shlib.c change, but modify the Debian/Ubuntu build
environment to pass in $SO to make (though the configure.in warning and
sleep is a little annoying).
* Add a ./configure option to enable this, which Debuntu's build would use.
The patch is available here:
and my working branch is here:
Please let me know what you think. I'm happy to just commit this to the py3k
branch if there are no objections <wink>. I don't think a new PEP is in
order, but an update to PEP 3147 might make sense.
On Sun, 12 Sep 2010 06:12:42 +0200 (CEST)
raymond.hettinger <python-checkins(a)python.org> wrote:
> - # Each link is stored as a list of length three: [PREV, NEXT, KEY].
> + # The back links are weakref proxies (to prevent circular references).
Are you sure this prevents anything? Since your list is circular,
forward links are enough to build a reference cycle.
Actually, this is exemplified here:
> + self.__root = root = _Link() # sentinel node for the doubly linked list
> + root.prev = root.next = root
`root.next = root` is enough to create a cycle, even if you make
root.prev a weakref.
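The point is easy to check directly: with a weakref proxy only on prev, the node is still kept alive by its own strong next link until the cycle collector runs (a small standalone sketch, not the OrderedDict code itself):

```python
import gc
import weakref

class Link:
    __slots__ = ("prev", "next", "key", "__weakref__")

root = Link()
root.prev = weakref.proxy(root)   # weak back link, as in the patch
root.next = root                  # strong forward link closes a cycle

probe = weakref.ref(root)
del root
alive_before_collect = probe() is not None   # cycle keeps it alive
gc.collect()
alive_after_collect = probe() is not None    # only the GC reclaims it
```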
What you need is to make both prev and next weakrefs. All list nodes
are held by the __map dict anyway.
If you are bothered about speed, an alternative would be a classic
finalization scheme, using a class-wide set of OrderedDict weakrefs that
remove themselves and clean up the doubly-linked list when the
OrderedDict becomes dead.
Date: Wed Sep 8 12:43:45 2010
New Revision: 84619
I guess you had to asciify Łukasz’ name because developers.txt is in
Latin-1, which cannot encode the first character. I think the file
should be recoded to UTF-8 so that we have no artificial restrictions on
people’s names. I’ll wait some days and if nobody disagrees I’ll recode
the file and fix the name.
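The restriction is easy to demonstrate (a small sketch using the name in question):

```python
# Latin-1 cannot represent "Ł" (U+0141), but UTF-8 can encode any
# Unicode character, so a UTF-8 developers.txt has no such limit.
text = "Łukasz"
try:
    text.encode("latin-1")
    latin1_can_encode = True
except UnicodeEncodeError:
    latin1_can_encode = False

utf8_bytes = text.encode("utf-8")
round_tripped = utf8_bytes.decode("utf-8")
```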
On Sat, Sep 11, 2010 at 7:39 AM, amaury.forgeotdarc wrote:
> There is no need to bump the PYC magic number: the new opcode is used
> for code that did not compile before.
If the magic number doesn't change for 3.2, how will 3.1 know it can't
run pyc and pyo files containing these opcodes?
The magic number needs a bump or this change may cause SystemErrors
when older versions attempt to run affected pyc files.
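For context, the magic number is the first four bytes of every pyc file; the interpreter compares them against its own and rejects a mismatch. In modern Pythons it is exposed via importlib (a minimal sketch; at the time the equivalent was imp.get_magic()):

```python
import importlib.util

# Four bytes: a version-specific word followed by b"\r\n" (the trailing
# CRLF helps detect text-mode corruption of pyc files).
magic = importlib.util.MAGIC_NUMBER
```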
Nick Coghlan | ncoghlan(a)gmail.com | Brisbane, Australia
The execution model section of the Python reference manual defines free
variables as follows:
"If a variable is used in a code block but not defined there, it is
a free variable"
This makes sense and fits the academic definition. The documentation of the
symtable module supports this definition - it says about is_free(): "return
True if the symbol is referenced in its block but not assigned to".
However, it appears that in the CPython front-end source code (in particular
the parts dealing with the symbol table), a free variable has a somewhat
stricter meaning. For example, in this chunk of code:

    cc = 25
    def myfunc(myparam):
        def internalfunc():
            return cc * myparam
        return internalfunc
CPython infers that in 'internalfunc', while 'myparam' is free, 'cc' is
global because 'cc' isn't bound in the enclosing scope, although according
to the definitions stated above, both should be considered free. The
bytecode generated for loading cc and myparam is different, of course.
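The distinction can be checked with the symtable module itself (a sketch reusing the names from the example above):

```python
import symtable

src = """
cc = 25
def myfunc(myparam):
    def internalfunc():
        return cc * myparam
    return internalfunc
"""
top = symtable.symtable(src, "<example>", "exec")
myfunc_ns = top.lookup("myfunc").get_namespace()
internal_ns = myfunc_ns.lookup("internalfunc").get_namespace()

# myparam is bound in the enclosing function scope, so it is "free";
# cc is only bound at module level, so it is classified as global.
myparam_is_free = internal_ns.lookup("myparam").is_free()
cc_is_free = internal_ns.lookup("cc").is_free()
cc_is_global = internal_ns.lookup("cc").is_global()
```

So is_free() already reflects the stricter compiler-level meaning, not the broader definition from the reference manual.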
Is there a (however slight) inconsistency of terms here, or is it my
misunderstanding?
Thanks in advance,