Nick Coghlan wrote:
Greg Ewing wrote:
I've had another idea about this. Suppose the close() method of a generator didn't complain about reaching a yield after GeneratorExit is raised, but simply raised it again, and continued doing so until either a return occurred or an exception propagated out.
Seems to me this couldn't do any harm to a well-behaved generator, since it has to be prepared to deal with a GeneratorExit arising from any of its yield points.
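Just to make sure we are talking about the same thing, here is roughly how I read that proposal, written as a pure-Python helper on top of throw(). The helper name and the StopIteration value handling are only illustrative, not anything from the PEP:

    def close_as_proposed(gen):
        # Keep throwing GeneratorExit instead of raising RuntimeError
        # when the generator yields again.
        while True:
            try:
                gen.throw(GeneratorExit)
            except StopIteration as e:
                # The generator returned; with the return-value idea this
                # is where close() would get its result from.
                return getattr(e, 'value', None)
            except GeneratorExit:
                # The exception propagated out; the generator is closed.
                return None
            # A value was yielded after GeneratorExit -- loop and throw again.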
It solves the return-value issue, but introduces a change for existing generators. Well-behaved generators would not be affected, but there might be generators in real use that relied on the ability to ignore close, or code using such generators that relied on getting the RuntimeError.

<sidetrack>
If there is a use case for ignoring close, that would be better served by another new idea I just had, the "yield raise" expression. The purpose of this would be to raise an exception in the caller of "next", "send", "throw" or "close" *without* finalizing the generator. Extending my "averager" example a bit:

    def averager(start=0):
        count = 0
        exc = None
        sum = start
        while 1:
            try:
                val = (yield) if exc is None else (yield raise exc)
            except GeneratorExit:
                return sum/count
            try:
                sum += val
            except BaseException as e:
                exc = e   # will be reraised by the "yield raise" above
            else:
                exc = None
                count += 1

    avg = averager()
    avg.next()        # start coroutine
    avg.send(1.0)
    try:
        avg.send('')  # raises a TypeError at the "sum += val" line,
                      # which is rerouted here by the "yield raise"
    except TypeError:
        pass
    avg.send(2.0)
    print avg.close() # still prints 1.5

The above code would be the main use for the feature. However, a side benefit would be that a generator that wanted to raise an exception instead of closing could use a "yield raise OtherException" as its response to GeneratorExit. I am not saying we should add the "yield raise" feature to the PEP, just that I think this would be a better way to handle the "don't close me" cases. (I am not sure how it would fit into the PEP anyway.)
</sidetrack>
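(As an aside to the aside: something vaguely similar can be faked today by having the generator yield the exception instance instead of raising it, and re-raising it on the calling side. The little wrapper below is purely illustrative, and its clumsiness is part of why a real "yield raise" appeals to me:)

    def send_or_reraise(gen, value):
        # Illustrative stand-in for "yield raise": if the generator yields
        # an exception instance, re-raise it here in the caller without
        # touching (or finalizing) the generator.
        result = gen.send(value)
        if isinstance(result, BaseException):
            raise result
        return result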
Greg Ewing wrote:
Yield-from would then no longer have the potential to create broken generators, we wouldn't have to treat GeneratorExit differently from any other exception, and Jacob could have his subgenerators that return values when you close them.
Only true because you have redefined it so that no generators are broken. If I understand you correctly, you are arguing that this change lets us throw GeneratorExit to the subiterator without trying to reraise it (my #2 from several mails back). That is clearly a plus in my book because it adheres to the inlining principle, but I don't think you need the loop in close for it to be better.

Nick Coghlan wrote:
I think I'd prefer to see some arbitrary limit (500 seems like a nice round number) on the number of times that GeneratorExit would be thrown before giving up and raising RuntimeError, just so truly broken generators that suppressed GeneratorExit in an infinite loop would eventually trigger an exception rather than just appearing to hang.
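(Expressed the same way as the sketch above, I take that variant to mean something like the following; the limit and the helper name are of course just placeholders:)

    def close_with_limit(gen, limit=500):
        # Like close_as_proposed() above, but give up after a fixed
        # number of attempts.
        for _ in range(limit):
            try:
                gen.throw(GeneratorExit)
            except StopIteration as e:
                return getattr(e, 'value', None)
            except GeneratorExit:
                return None
            # yielded again -- try once more
        raise RuntimeError('generator ignored GeneratorExit')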
Right. The possibility of turning a call that used to raise a RuntimeError into an infinite loop bothers me a bit. I also don't really see the use for it. GeneratorExit is an unambiguous signal to close, so I would expect the generator to handle it by closing (possibly with a final return value), or by raising an exception. Not doing so *should* be an error. There have been requests for a function that loops over the generator and returns the final result, but this version of close doesn't fit that use case because it uses throw(GeneratorExit) instead of next().
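What I mean by that use case is something like the helper below, which drives the generator with next() until it returns and picks the final value out of StopIteration (assuming the PEP's StopIteration.value); the name is just for illustration:

    def run_to_completion(gen):
        # Exhaust the generator with next() -- not throw(GeneratorExit) --
        # and return its final value.
        while True:
            try:
                gen.next()
            except StopIteration as e:
                return getattr(e, 'value', None)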
The basic idea seems sound though (Jacob's averager example really was nicer than mine).
Thank you, Nick. I am glad you think so.

To summarize, I am only +0.75 on this proposal. I think it would be better not to loop, still return the final value from close, and still just throw GeneratorExit to subiterators without trying to reraise.

Cheers - Jacob