Possible extension to how the new StopIteration is handled (Python >= 3.3)

Hi all,

I just tried the new possibilities for writing generators that were included in the Python standard since Python 3.3. Concretely, using "return 1" in a generator is equivalent to "raise StopIteration(1)". Thus the following works:

    def f():
        yield 0
        return 1

    >>> g = f()
    >>> next(g)
    0
    >>> next(g)
    Traceback (most recent call last):
      ...
    StopIteration: 1

However, calling next(g) once again gives the following:

    >>> next(g)
    Traceback (most recent call last):
      ...
    StopIteration

Mind the missing "StopIteration: 1". In my impression this is not how I intuitively think about a generator: if there is a StopIteration exception connected to a generator, then there should be only ONE such exception. This is not only theoretical, but would have a real application when using such a generator in for loops. At the moment the following happens:

    >>> g = f()
    >>> for x in g:
    ...     print(x)
    0
    >>> next(g)
    Traceback (most recent call last):
      ...
    StopIteration

Note again the missing "StopIteration: 1": the for loop already consumed the StopIteration carrying the return value. If the same StopIteration were raised at every call while g is empty, one could actually extract the return value after the loop (without using some possible but more complicated "return_value = yield from" workarounds).

Any feedback? Are there others who think this would be a straightforward extension to the newly introduced raise StopIteration(return_value) feature?

Looking forward to your responses,
with best wishes,
Stephan

On Fri, Oct 3, 2014 at 3:39 AM, Stephan Sahm <Stephan.Sahm@gmx.de> wrote:
So basically, what you're saying is that the rule should change from "once StopIteration has been raised, any call to next() should also raise StopIteration" to "... raise StopIteration with the same payload". I think that's not unreasonable in the simple form; but it would mean that the generator would have to retain a reference to its return value, which is contrary to what most people will expect of function return values. It might make for unnecessary object longevity. Is there a backward compatibility issue here? It's theoretically possible for code to actively expect that a repeated StopIteration won't have a payload (e.g. to distinguish between the initial return and any attempt to re-next() the generator), but is that at all a reasonable thing to have done? ChrisA
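The distinction Chris describes, between the first StopIteration (which carries the payload) and repeated ones (which don't), can be observed directly. A minimal sketch of the current behaviour:

```python
def f():
    yield 0
    return 1  # equivalent to raise StopIteration(1)

g = f()
next(g)  # -> 0, the single yielded value

try:
    next(g)
except StopIteration as exc:
    first = exc.value   # 1: the first exhaustion carries the return value

try:
    next(g)
except StopIteration as exc:
    again = exc.value   # None: repeated next() calls raise a bare StopIteration
```

Code relying on this difference (first == 1 but again is None) is exactly the kind of code that Stephan's proposed change would break.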

Chris Angelico wrote:
This was debated during the yield-from discussions and decided against. If I remember rightly, unexpected longevity of return values was one of the main objections. -- Greg

On Fri, Oct 3, 2014 at 4:36 PM, Stephan Sahm <Stephan.Sahm@gmx.de> wrote:
In the general case, it is. But if you want it just for your own generators, it ought to be possible to write a decorator that yields-from your original function, retains the return value, and then reraises the exception repeatedly. Here's a simple version:

    def gen():
        yield 0
        return 1

    class repeater:
        def __init__(self):
            self.g = gen()
        def __iter__(self):
            return self
        def __next__(self):
            if hasattr(self, "done"):
                raise StopIteration(self.done)
            try:
                return next(self.g)
            except StopIteration as exc:
                self.done = exc.value
                raise

Generalizing this is left as an exercise for the reader.

ChrisA
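One possible way to do the generalization Chris leaves as an exercise is a decorator that wraps any generator function. This is a sketch under the same idea as his repeater class; the names sticky_return and _Repeater are made up here:

```python
import functools

class _Repeater:
    # Wraps a generator so that, once exhausted, every further next()
    # call re-raises StopIteration carrying the original return value.
    _unset = object()  # sentinel, since None is a valid return value

    def __init__(self, gen):
        self.g = gen
        self.done = self._unset

    def __iter__(self):
        return self

    def __next__(self):
        if self.done is not self._unset:
            raise StopIteration(self.done)
        try:
            return next(self.g)
        except StopIteration as exc:
            self.done = exc.value  # remember the payload for later calls
            raise

def sticky_return(genfunc):
    # Decorator: calling the decorated function yields a _Repeater
    # instead of a plain generator.
    @functools.wraps(genfunc)
    def wrapper(*args, **kwargs):
        return _Repeater(genfunc(*args, **kwargs))
    return wrapper

@sticky_return
def f():
    yield 0
    return 1
```

With this, next(f_instance) keeps raising StopIteration(1) after exhaustion, which is the behaviour Stephan asked for, at the cost Chris and Greg point out: the return value is kept alive as long as the wrapper is.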

Thank you very much, Chris! Works like a charm. I already knew a bit about generators, but I hadn't thought that such a thing was also possible. Impressively powerful language. Best, Stephan On 3 October 2014 09:01, Chris Angelico <rosuav@gmail.com> wrote:

participants (3)
- Chris Angelico
- Greg Ewing
- Stephan Sahm