[Python-ideas] Yield-From: GeneratorExit?
jh at improva.dk
Sun Mar 22 15:35:46 CET 2009
Greg Ewing wrote:
> I'm having trouble making up my mind how GeneratorExit
> should be handled.
> My feeling is that GeneratorExit is a peculiarity of
> generators that other kinds of iterators shouldn't have
> to know about.
They don't, see below.
> So, if you close() a generator, that
> shouldn't imply throwing GeneratorExit into the
> subiterator -- rather, the subiterator should simply
> be dropped and then the delegating generator finalized
> as usual.
> If the subiterator happens to be another generator,
> dropping the last reference to it will cause it to
> be closed, in which case it will raise its own
> GeneratorExit.
This is only true in CPython, which has reference counting; that
shouldn't be a problem, though. If you really need the subiterator to be
closed at that point, wrapping the yield-from in the appropriate
try...finally... or with... block will do that.
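To illustrate the wrapping pattern (a sketch in Python 3 yield-from syntax; note that in the yield-from semantics Python 3.3 eventually shipped, close() is forwarded automatically, so the explicit close below is only needed under the proposal being discussed here):

```python
log = []

def inner():
    try:
        yield 1
        yield 2
    finally:
        log.append("inner closed")   # cleanup runs when inner() is closed

def outer():
    it = inner()
    try:
        yield from it
    finally:
        it.close()   # explicitly tie the subiterator's lifetime to ours

g = outer()
next(g)    # advances into inner(), which yields 1
g.close()  # finalizes outer(); the finally block closes inner() as well
```
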
> Other kinds of iterators can finalize
> themselves however they see fit, and don't need to
> pretend they're generators and understand
> GeneratorExit.
They don't have to understand GeneratorExit at all. As long as they know
how to clean up after themselves when thrown an exception they cannot
handle, things will just work. GeneratorExit is no different from
SystemExit or KeyboardInterrupt in that regard.
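A small sketch of that point in Python 3 syntax: a generator's finally clause runs the same way whether the unhandled exception is a KeyboardInterrupt thrown in from outside or the GeneratorExit raised by close():

```python
log = []

def worker():
    try:
        while True:
            yield "item"
    finally:
        log.append("cleanup")  # same cleanup path for any unhandled exception

g = worker()
next(g)
try:
    g.throw(KeyboardInterrupt)  # cleanup runs, then the exception propagates
except KeyboardInterrupt:
    log.append("propagated")

h = worker()
next(h)
h.close()  # raises GeneratorExit inside worker(); cleanup runs the same way
```
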
> For consistency, this implies that a GeneratorExit
> explicitly thrown in using throw() shouldn't be
> forwarded to the subiterator either, even if it has
> a throw() method.
I agree that if close() doesn't throw the GeneratorExit to the
subiterator, then throw() shouldn't either.
> To do otherwise would require making a distinction that
> can't be expressed in the Python expansion. Also, it
> seems elegant to preserve the property that if g is a
> generator then g.close() and g.throw(GeneratorExit) are
> exactly equivalent.
Not exactly equivalent, but related in the simple way described in PEP 342.
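PEP 342 defines close() in terms of throw(GeneratorExit): the generator must either exit or raise GeneratorExit/StopIteration, and yielding another value is an error. A rough sketch of that relationship, in Python 3 syntax:

```python
def close_via_throw(gen):
    # Approximates PEP 342's definition of close() using throw().
    try:
        gen.throw(GeneratorExit)
    except (GeneratorExit, StopIteration):
        return  # the generator finished cleanly
    # If throw() returned, the generator yielded after GeneratorExit.
    raise RuntimeError("generator ignored GeneratorExit")

def counter():
    yield 1
    yield 2

g = counter()
next(g)
close_via_throw(g)  # behaves like g.close() here
```
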
> What do people think about this?
If I understand you correctly, what you want can be described by the
following expansion:
_i = iter(EXPR)
try:
    _u = _i.next()
    while 1:
        try:
            _v = yield _u
        except GeneratorExit:
            raise
        except BaseException, _e:
            _m = getattr(_i, 'throw', None)
            if _m is not None:
                _u = _m(_e)
            else:
                raise
        else:
            if _v is None:
                _u = _i.next()
            else:
                _u = _i.send(_v)
except StopIteration, _e:
    RESULT = _e.value
finally:
    _i = _u = _v = _e = _m = None
    del _i, _u, _v, _e, _m
(except for minor details like the possible method caching). I like this
version because it makes it easier to share subiterators if you need to.
The explicit close in the earlier proposals meant that as soon as one
generator delegating to the shared iterator was closed, the shared one
would be as well. No, I don't have a concrete use case for this, but I
think it is the least surprising behavior we could choose for closing
shared subiterators. As mentioned above, you can still explicitly
request that the subiterator be closed with the delegating generator by
wrapping the yield-from in a try...finally... or with... block.
If I understand Nick correctly, he would like to drop the "except
GeneratorExit: raise" part, and possibly change BaseException to
Exception. I don't like the idea of just dropping the "except
GeneratorExit: raise", as that brings us back in the situation where
shared subiterators are less useful. If we also change BaseException to
Exception, the only difference is that it will no longer be possible to
throw exceptions like SystemExit and KeyboardInterrupt that don't
inherit from Exception to a subiterator. Again, I don't have a concrete
use case, but I think putting an arbitrary restriction like that in a
language construct is a bad idea. One example where this would cause
surprises is if you split part of a generator function (one that for some
reason needs to handle these exceptions) into a separate generator and
call it using yield-from. Throwing such an exception to the refactored
generator could then have a different meaning than before the
refactoring, and there would be no easy way to fix this.
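To illustrate the refactoring concern: under the BaseException version of the expansion above (and in the yield-from semantics Python 3.3 eventually shipped), throwing KeyboardInterrupt at the delegating generator reaches the subgenerator's handler; restricting forwarding to Exception would silently break this. A sketch in Python 3 syntax:

```python
def handler():
    # The refactored-out part that needs to see KeyboardInterrupt itself.
    try:
        while True:
            yield "working"
    except KeyboardInterrupt:
        yield "interrupted"

def delegator():
    yield from handler()

g = delegator()
next(g)                              # yields "working"
result = g.throw(KeyboardInterrupt)  # forwarded to handler()'s except clause
# result == "interrupted"
```
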
Just my 2 cents...