Working on the PEP 342/343 generator enhancements, I've got working
send/throw/close() methods, but am not sure how to deal with getting
__del__ to invoke close(). Naturally, I can add a "__del__" entry to its
methods list easily enough, but the 'has_finalizer()' function in
gcmodule.c only checks for a __del__ attribute on instance objects, and for
tp_del on heap types.
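For concreteness, the behavior being wired up can be seen from Python today (spelled with the built-in next(); the name `log` is just illustrative): when the last reference to a suspended generator goes away, close() should raise GeneratorExit at the yield so pending finally blocks run.

```python
import gc

log = []

def gen():
    try:
        yield 1
    finally:
        # close() raises GeneratorExit at the yield, so this block runs
        log.append('finalized')

g = gen()
next(g)        # suspend the generator inside the try block
del g          # drop the last reference: __del__ should invoke close()
gc.collect()   # belt-and-braces; CPython's refcounting already fired
assert log == ['finalized']
```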
It looks to me like the correct fix would be to check for tp_del always,
not just on heap types. However, when I tried this, I started getting
warnings from the tests, saying that 22 uncollectable objects were being
created (all generators, in test_generators).
It seems that the tests create cycles via globals(), since they define a
bunch of generator functions and then call them, saving the generator
iterators (or objects that reference them) in global variables.
After investigating this a bit, it seems to me that either has_finalizer()
The principal use case was largely met by enumerate(). From PEP 276:
A common programming idiom is to take a collection of objects and apply
some operation to each item in the collection in some established
sequential order. Python provides the "for in" looping control
structure for handling this common idiom. Cases arise, however, where
it is necessary (or more convenient) to access each item in an "indexed"
collection by iterating through each index and accessing each item in
the collection using the corresponding index.
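For reference, the idiom described above is exactly what enumerate() already covers:

```python
colors = ['red', 'green', 'blue']

# Index-based access, the style PEP 276 tries to streamline:
pairs_old = []
for i in range(len(colors)):
    pairs_old.append((i, colors[i]))

# The enumerate() spelling handles the same use case directly:
pairs_new = [(i, color) for i, color in enumerate(colors)]

assert pairs_old == pairs_new
```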
Also, while some nice examples are provided, the proposed syntax allows
and encourages some horrid examples as well:
>>> for i in 3: print i
The backwards compatibility section lists another problematic
consequence; the following would stop being an error and would
silently succeed:
x, = 1
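To spell out the concern (a sketch; range(1) stands in for what iterating 1 would do under the proposal):

```python
# Current behavior: an int is not iterable, so the unpacking fails.
try:
    x, = 1
    failed = False
except TypeError:
    failed = True
assert failed

# Under PEP 276, iterating 1 would act like range(1),
# so the same unpacking would quietly bind x to 0:
x, = range(1)
assert x == 0
```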
The proposal adds iterability to all integers but silently does nothing
for negative values.
A minor additional concern is that floats are not given an equivalent
capability (for obvious reasons), but this breaks symmetry with
range/xrange, which still accept float args.
Introducing a new set of duplicate type names and deprecating old ones
causes a certain amount of disruption. Given the age of the types
module, the disruption is likely to be greater than any potential
benefit that could be realized. Plenty of people will have to incur the
transition costs, but no one is likely to find the benefit worth it.
Suggest rejecting this PEP and making a note for Py3.0 to either sync-up
the type names or abandon the types module entirely.
Do we have *any* known use cases where we would actually run bytecode
that was suspicious enough to warrant running a well-formedness check?
In assessing security risks, the PEP notes, "Practically, it would be
difficult for a malicious user to 'inject' invalid bytecode into a PVM
for the purposes of exploitation, but not impossible."
Can that ever occur without there being a far greater risk of malicious,
but well-formed bytecode?
If you download a file, foo.pyc, from an untrusted source and run it in
a susceptible environment, does its well-formedness give you *any*
feeling of security? I think not.
There isn't anything wrong with having a verifier module, but I can't
think of any benefit that would warrant changing the bytecode semantics
just to facilitate one of the static stack checks.
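As a point of comparison, a purely structural check can already be sketched with the dis module; this toy function (names invented here) only verifies that jump targets land on instruction boundaries, which is far weaker than full static stack checking:

```python
import dis

def jumps_well_formed(code):
    """Rough structural check: every jump target must be the offset
    of an actual instruction. A sketch only, not a real verifier."""
    instructions = list(dis.get_instructions(code))
    offsets = {ins.offset for ins in instructions}
    jump_ops = set(dis.hasjabs) | set(dis.hasjrel)
    for ins in instructions:
        if ins.opcode in jump_ops and ins.argval not in offsets:
            return False
    return True

def sample(n):
    total = 0
    for i in range(n):
        if i % 2:
            total += i
    return total

assert jumps_well_formed(sample.__code__)
```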
The attached PEP (pep.txt) is for RFE 46738, which you can view here:
It provides a safe, documented class for serialization of simple Python types.
A sample implementation is also attached (gherkin.py).
Criticism and comments on the PEP and the implementation are appreciated.
While the majority of Python users deem this to be a nice-to-have
feature, the community has been unable to reach a consensus on the
proper syntax after more than two years of intensive debate (the PEP was
introduced in early April 2003).
Most agree that there should be only-one-way-to-do-it; however, the
proponents are evenly split into two camps, with the modernists
preferring IX for nine and the classicists preferring VIIII which was
the most likely spelling in ancient Rome.
The classicists not only rely on set-in-stone tradition, they point to
pragmatic issues such as avoidance of subtraction, ease of coding,
easier mental parsing (much less error prone), and ease of teaching to
beginners. They assert that the modernists have introduced unnecessary
algorithmic complexity just to save two keystrokes.
The modernists point to compatible Java implementations and current
grade school textbooks. They believe that users from other languages
will expect the IX form. Note, however, that not all the modernists agree
on whether MXM would be a well-formed spelling of 1990; most, but not
all, prefer MCMXC despite its likelihood of being mis-parsed on a first
reading.
There is also a small but vocal user group demanding that lowercase
forms be allowed. Their use cases fall into four categories: (i)
academia, (ii) the legal profession, (iii) research paper writing, and
(iv) PowerPoint slideshows. Reportedly, this is also a common
convention among Perl programmers.
PyPI seems to be out of space:
% ./setup.py register --show-response
Using PyPI login from /home/niemeyer/.pypirc
There's been a problem with your request
psycopg.ProgrammingError: ERROR: could not extend relation "releases":
No space left on device
HINT: Check free disk space.
I've just submitted patch 1223381 (http://python.org/sf/1223381), which
implements code and test changes for:
* yield expressions
* bare yield (short for yield None)
* yield in try/finally
* generator.send(value) (send a value into the generator)
* generator.throw(typ[,val[,tb]]) (raise error in generator)
* GeneratorExit built-in exception type
* generator.__del__ (well, the C equivalent)
* All necessary mods to the compiler, parser module, and Python 'compiler'
package to support these changes.
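Roughly, the new methods behave like this (shown in present-day spelling, with next(g) in place of g.next(); the generator names are invented for illustration):

```python
def accumulator():
    total = 0
    while True:
        # yield as an *expression*: evaluates to whatever send() passes in
        value = yield total
        if value is not None:
            total += value

acc = accumulator()
next(acc)                  # prime the generator to its first yield
assert acc.send(10) == 10  # send() resumes it and returns the next yield
assert acc.send(5) == 15
acc.close()                # raises GeneratorExit inside the generator

def guarded():
    try:
        yield 'ok'
    except ValueError:
        yield 'caught'

g = guarded()
assert next(g) == 'ok'
assert g.throw(ValueError) == 'caught'  # throw() raises at the yield
```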
It was necessary to change a small part of the eval loop (well, the
initialization, not the loop) and the gc module's has_finalizer() logic in
order to support a C equivalent to __del__. Specialists in these areas
should probably scrutinize this patch!
There is one additional implementation detail that was not contemplated in
either PEP. In order to prevent used-up generators from retaining
unnecessary references to their frame's contents, I set the generator's
gi_frame member to None whenever the generator finishes normally or with an
error. Thus, an exhausted generator cannot be part of a cycle, and it
releases its frame object sooner than in previous Python versions. For
generators used only in a direct "for" loop, this makes no difference, but
for generators used with the iterator protocol (i.e. "gen.next()") from
Python, this avoids stranding the generator's frame in a traceback cycle.
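That detail is easy to observe in a current Python, where this behavior stuck:

```python
def countdown():
    yield 2
    yield 1

g = countdown()
assert g.gi_frame is not None   # frame exists while the generator is live
for _ in g:                     # exhaust it normally
    pass
assert g.gi_frame is None       # frame reference dropped at exhaustion
```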
Anyway, your comments/questions/feedback/bug reports are welcome.