[Python-ideas] Possible PEP 380 tweak

Jacob Holm jh at improva.dk
Wed Oct 27 22:22:09 CEST 2010

On 2010-10-27 00:14, Nick Coghlan wrote:
> Jacob's "implications for PEP 380" exploration started to give me some
> doubts, but I think there are actually some flaws in his argument.

I'm not sure I made much of an argument.  I showed an example that
assumed the change I was suggesting and explained what the problem would
be without the change.  Let me try another example:

def filesum(fn):
    s = 0
    with open(fn) as fd:
        for line in fd:
            s += int(line)
            yield   # be cooperative..
    return s

def multifilesum():
    a = yield from filesum('fileA')
    b = yield from filesum('fileB')
    return a+b

def main():
    g = multifilesum()
    for i in range(10):
        try:
            next(g)
        except StopIteration as e:
            r = e.value
            break
    else:
        r = g.finish()

This tries to read at most 10 lines from 'fileA' + 'fileB' and return
the sum of their lines, interpreting each line as an integer.  It works fine if
there are at most 10 lines but is broken if 'fileA' has more than 10
lines.  What's more, assuming latest PEP 380 + your "finish" and no
other changes I don't see a simple way of fixing it.
With my modification of your "finish" proposal you can add a few
try...except blocks to the code and it will "just work (tm)"...

> Accordingly, I would like to make one more attempt at explaining why I
> think throwing in a separate exception for this use case is valuable
> (and *doesn't* require any changes to PEP 380).

I am convinced that it does, at least if you want it to be usable with
yield-from.  But the same goes for any version that uses GeneratorExit.

> As I see it, there's a bit of a disconnect between many PEP 380 use
> cases and any mechanism or idiom which translates a thrown in
> exception into an ordinary StopIteration. If you expect your thrown in
> exception to always terminate the generator in some fashion, adopting
> the latter idiom in your generator will make it potentially unsafe to
> use in a "yield from" expression that isn't the very last yield
> operation in any outer generator.

Right.  This is the problem I'm trying to address by modifying the PEP
380 expansion.
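Concretely, the hazard looks like this (a minimal runnable sketch; the
GeneratorReturn exception and the generator names are made up for
illustration, not part of any PEP):

```python
class GeneratorReturn(Exception):
    """Hypothetical 'give me your return value' exception."""

def inner():
    try:
        yield 1
    except GeneratorReturn:
        return "inner result"   # converted to StopIteration

def outer():
    # The yield-from is *not* the last yield operation...
    val = yield from inner()
    yield val                   # ...so outer keeps running

g = outer()
next(g)                          # suspend inside inner()
# inner() converts the thrown-in exception to StopIteration, so
# outer() resumes and yields again instead of terminating:
print(g.throw(GeneratorReturn))  # -> inner result
```

The throw() call returns a yielded value rather than raising, which is
exactly the "doesn't terminate the generator" surprise being discussed.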

> Consider the following:
> def example(arg):
>   try:
>     yield arg
>   except GeneratorExit:
>     return "Closed"
>   return "Finished"
> def outer_ok1(arg):  # close() after next() returns "Closed"
>   return yield from example(arg)
> def outer_ok2(arg): # close() after next() returns None
>   yield from example(arg)
> def outer_broken(arg): # close() after next() gives RuntimeError
>   val = yield from example(arg)
>   yield val
> # All 3 cases: close() before next() returns None
> # All 3 cases: close() after 2x next() returns None

Actually, AFAICT outer_broken will *not* give a RuntimeError on close()
after next().  This is due to the special-casing of GeneratorExit in PEP
380.  That special-casing is also the basis for both my suggested
changes to the expansion.

In fact, in all 3 cases close() after next() would give None because the
"inner" return value is discarded and the GeneratorExit reraised.  Only
when called directly would the inner "example" function return "Closed"
on close() after next().
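This can be checked directly against an implementation of PEP 380's
expansion (current CPython behaves this way):

```python
def example(arg):
    try:
        yield arg
    except GeneratorExit:
        return "Closed"
    return "Finished"

def outer_broken(arg):
    val = yield from example(arg)
    yield val

g = outer_broken(42)
next(g)     # suspend inside example()
# PEP 380 special-cases GeneratorExit: the yield-from closes the
# subgenerator, discards its "Closed" return value, and reraises
# GeneratorExit in outer_broken -- so close() succeeds quietly.
g.close()   # no RuntimeError, and the value is lost
```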

> Using close() to say "give me your return value" creates the risk of
> hitting those runtime errors in a generator's __del__ method, 

Not really.  Returning a value from close with no other changes does not
change the risk of that happening.  Of course I *do* think other changes
are necessary, but then we'll need to look at those before concluding
they are a problem...

> and
> exceptions in __del__ are always a bit ugly.

That they are.

> Keeping the "give me your return value" and "clean up your resources"
> concerns separate by adding a new method and thrown exception means
> that close() is less likely to unpredictably raise RuntimeError (and
> when it does, will reliably indicate a genuine bug in a generator
> somewhere that is suppressing GeneratorExit).
> As far as PEP 380's semantics go, I think it should ignore the
> existence of anything like GeneratorReturn completely. Either one of
> the generators in the chain will catch the exception and turn it into
> StopIteration, or they won't. If they convert it to StopIteration, and
> they aren't the last generator in the chain, then maybe what actually
> needs to happen at the outermost level is something like this:
> class GeneratorReturn(Exception): pass
> def finish(gen):
>   try:
>     gen.throw(GeneratorReturn) # Ask generator to wrap things up
>   except StopIteration as err:
>     if err.args:
>       return err.args[0]
>   except GeneratorReturn:
>     pass
>   else:
>     # Asking nicely didn't work, so force resource cleanup
>     # and treat the result as if the generator had already
>     # been exhausted or hadn't started yet
>     gen.close()
>   return None

This version of finish(), I don't like.  If we have a distinct method
for "finishing" a
value was arrived at in some other way.  Preferably with an exception,
as in:

def finish(self):
    if self.gi_frame is None:
        raise RuntimeError('finish() on exhausted/closed generator')
    try:
        self.throw(GeneratorReturn)
    except StopIteration as err:
        if err.args:
            return err.args[0]
    except GeneratorReturn:
        raise RuntimeError('generator ignored GeneratorReturn')
    return None
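Rendered as a standalone helper (we can't add methods to the built-in
generator type), and with made-up example generators, the difference
from the earlier finish() is the loud failure:

```python
class GeneratorReturn(Exception):
    pass

def finish(gen):
    # Standalone rendition of the method sketched above.
    if gen.gi_frame is None:
        raise RuntimeError('finish() on exhausted/closed generator')
    try:
        gen.throw(GeneratorReturn)
    except StopIteration as err:
        if err.args:
            return err.args[0]
    except GeneratorReturn:
        # The generator did not convert the exception into a result.
        raise RuntimeError('generator ignored GeneratorReturn')
    return None

def counter():
    # Made-up cooperative generator: counts resumptions.
    n = 0
    try:
        while True:
            yield
            n += 1
    except GeneratorReturn:
        return n

def oblivious():
    # Made-up generator that never handles GeneratorReturn.
    yield 1

g = counter()
next(g); next(g); next(g)
print(finish(g))          # -> 2

g = oblivious()
next(g)
try:
    finish(g)
except RuntimeError as e:
    print(e)              # -> generator ignored GeneratorReturn
```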

The point of "finish" as I see it is not the "closing" part, but the
"give me a result" part.

Anyway, I am (probably) not going to argue much further for this.  The
only new thing that is on the table here is the "finish" function, and
using a new exception.  The use of a new exception solves some of the
issues that you and Greg had earlier, but leaves the problem of using a
value-returning close/finish with yield-from. (And Guido doesn't like
it).  Since no one seems interested in even considering a change to the
PEP 380 expansion to fix this, I don't really see any more I can
contribute at this point.

- Jacob
