[Python-ideas] Possible PEP 380 tweak

Jacob Holm jh at improva.dk
Tue Oct 26 14:22:11 CEST 2010

On 2010-10-26 05:14, Guido van Rossum wrote:
> On Mon, Oct 25, 2010 at 6:35 PM, Jacob Holm <jh at improva.dk> wrote:
>> On 2010-10-25 17:13, Guido van Rossum wrote:
>>> Can you dig up the link here?
>> Well here is a recap of the end of the discussion about how to handle
>> generator return values and g.close().
> Thanks, very thorough!

I had to read through it myself to remember what actually happened, and
thought you (and the rest of the world) might as well benefit from the
notes I made.

>> The latest draft (#13) I have been able to find was announced in
>> http://mail.python.org/pipermail/python-ideas/2009-April/004189.html
>> And can be found at
>> http://mail.python.org/pipermail/python-ideas/attachments/20090419/c7d72ba8/attachment-0001.txt
> Hmm... It does look like the PEP editors dropped the ball on this one
> (or maybe Greg didn't mail it directly to them). It doesn't seem there
> are substantial differences with the published version at
> http://www.python.org/dev/peps/pep-0380/ though, close() still doesn't
> return a value.

IIRC, there are a few minor semantic differences in how non-generators
are handled.  I haven't made a detailed comparison.

>> I had some later suggestions for how to change the expansion, see e.g.
>> http://mail.python.org/pipermail/python-ideas/2009-April/004195.html  (I
>> find that version easier to reason about even now 1½ years later)
> Hopefully you & Greg can agree on a new draft. I like this to make
> progress and I really want this to appear in 3.3. But I don't have the
> time to do the editing and reviewing of the PEP.

IIRC, this was just a presentation issue - the two expansions were
supposed to be equivalent.  It might become relevant if we want to
change something in the definition, because we need a common base to
discuss from.  My version is (intended to be) simpler to reason about in
the sense that things that should be handled the same are only written once.

>> What killed the proposal last time was the question of what should
>> happen when you call g.close() on an exhausted generator.  My preferred
>> solution was (and is) that the generator should save the value from the
>> terminating StopIteration (or None if it ended by some other means) and
>> that g.close() should return that value each time and g.next(), g.send()
>> and g.throw() should raise a StopIteration with the value.
>> Unless you have changed your position on storing the return value, that
>> solution is dead in the water.
> I haven't changed my position. Closing a file twice doesn't do
> anything the second time either.
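
For context, the value being argued over is the one PEP 380 attaches to the
terminating StopIteration.  A minimal sketch (Python 3 semantics; the names
are illustrative) of fishing that value out by hand:

```python
def averager():
    # Accumulate sent values; on GeneratorExit, report the average.
    total = count = 0
    try:
        while True:
            total += yield
            count += 1
    except GeneratorExit:
        return total / count if count else 0.0

g = averager()
next(g)          # prime the generator
g.send(10)
g.send(20)
try:
    g.throw(GeneratorExit)
except StopIteration as e:
    result = e.value      # 15.0 -- the value close() would discard
```

Calling g.close() instead of g.throw(GeneratorExit) runs the same code path
but swallows the StopIteration, which is exactly the loss the proposal is
trying to avoid.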


>> Here's a stupid idea... let g.close take an optional argument that it
>> can return if the generator is already exhausted and let it return the
>> value from the StopIteration otherwise.
>> def close(self, default=None):
>>    if self.gi_frame is None:
>>        return default
>>    try:
>>        self.throw(GeneratorExit)
>>    except StopIteration as e:
>>        return e.args[0] if e.args else None
>>    except GeneratorExit:
>>        return None
>>    else:
>>        raise RuntimeError('generator ignored GeneratorExit')
> You'll have to explain why None isn't sufficient.

It is not really necessary, but seemed "cleaner" somehow.  Think of
"g.close(default)" as "get me the result if possible, and this default
otherwise".  Then think of dict.get()...
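
As a rough sketch of that reading, the proposed close(default) can be
emulated from the outside today (close_with_default is a hypothetical
helper, not a real API, and e.value assumes Python 3 semantics):

```python
def close_with_default(gen, default=None):
    # Emulation of the proposed g.close(default): return the
    # generator's result if it is still running, or the default if
    # it is already exhausted -- the dict.get() pattern.
    if gen.gi_frame is None:        # already exhausted or closed
        return default
    try:
        gen.throw(GeneratorExit)
    except StopIteration as e:
        return e.value              # value from "return" in the generator
    except GeneratorExit:
        return None                 # generator did not catch it
    else:
        raise RuntimeError('generator ignored GeneratorExit')

def counter():
    # Count how many times we are resumed; report it on close.
    n = 0
    try:
        while True:
            yield
            n += 1
    except GeneratorExit:
        return n

g = counter()
next(g)
next(g)
first = close_with_default(g, -1)    # the stored count
second = close_with_default(g, -1)   # exhausted now, so the default
```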

An even cleaner solution might be Nick's "g.finish()" proposal, which I
will comment on separately.

>> I think these things (at least priming and close()) are mostly an issue
>> when using coroutines from non-coroutines.  That means it is likely to
>> be common in small examples where you write the whole program, but less
>> common when you are writing small(ish) parts of a larger framework.
>> Throwing and catching GeneratorExit is not common, and according to some
>> shouldn't be used for this purpose at all.
> Well, *throwing* it is close()'s job. And *catching* it ought to be
> pretty rare. Maybe this idiom would be better:
> def sum():
>   total = 0
>   try:
>     while True:
>       value = yield
>       total += value
>   finally:
>     return total

This is essentially the same as a bare except, because the return in the
finally clause discards whatever exception is in flight.  I think there
is general agreement that that is a bad idea.
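
A small sketch of the hazard (names are illustrative): a return inside a
finally clause discards whatever exception was propagating, exactly as a
bare except would.

```python
def summer():
    total = 0
    try:
        while True:
            total += yield
    finally:
        return total    # discards ANY exception, not just GeneratorExit

g = summer()
next(g)
g.send(1)
g.send(2)
try:
    g.throw(ValueError("bug in the caller"))   # silently swallowed
except StopIteration as e:
    result = e.value     # 3 -- the ValueError never surfaced
```

The caller's ValueError vanishes without a trace, which is why the idiom
is no safer than a bare except.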

>>> So, it is clear that generators are extremely versatile, and PEP 380
>>> deserves several good use cases to explain all the API subtleties.
>> I like your example because it matches the way I would have used
>> generators to solve it.  OTOH, it is not hard to rewrite parallel_reduce
>> as a traditional function.  In fact, the result is a bit shorter and
>> quite a bit faster so it is not a good example of what you need
>> generators for.
> I'm not sure I understand. Maybe you meant to rewrite it as a class?
> There's some state that wouldn't have a good place to live without
> either a class or a (generator) stackframe to survive.

See the reply by Peter Otten (and my reply to him).

You mentioned some possible extensions though.  At a guess, at least
some of these would benefit greatly from the use of generators.  Maybe
such an extension would be a better example?

- Jacob
