[Python-ideas] Generators are iterators

Oscar Benjamin oscar.j.benjamin at gmail.com
Fri Dec 12 13:42:55 CET 2014

On 12 December 2014 at 10:34, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 12 December 2014 at 01:14, Oscar Benjamin <oscar.j.benjamin at gmail.com> wrote:
>> I think the PEP would be clearer if it properly acknowledged that the
>> problem is a problem for all iterators. The question then is why the
>> fix is only targeted at generators and what should be done about the
>> same problem that occurs in many other forms. The PEP rationale avoids
>> these issues by falsely claiming that generators are special.
> I believe you're misunderstanding the problem being solved.

And I believe you are too. :)

> The
> specific problem deemed worthy of being fixed is that the presence of
> "yield" in a function body can implicitly suppress StopIteration
> exceptions raised elsewhere in that function body (or in functions it
> calls).

The yield causes the function to become a generator function. The
frame for a generator function (like that of any other function)
allows uncaught exceptions to propagate to the frame above. The
difference between generator functions and other functions is that
the code in the body of a generator function is executed when someone
calls the generator's __next__ method. Since the caller (the iterator
consumer) is expecting StopIteration, it treats the exception as
signalling the end of iteration. The yield suppresses nothing; it is
the iterator consumer, e.g. the for-loop or the list() function,
that catches the StopIteration and treats it as termination.
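To make this concrete, here is a minimal hand-written iterator (the
class and names are mine, purely for illustration): its __next__ raises
StopIteration, and it is list(), the consumer, that catches it.

```python
class Countdown:
    """A hand-written iterator: __next__ raises StopIteration when done."""
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):
        if self.n <= 0:
            raise StopIteration  # propagates up to the consumer
        self.n -= 1
        return self.n + 1

# list() is the consumer: it calls __next__ repeatedly and catches
# the StopIteration itself -- nothing in Countdown suppresses it.
print(list(Countdown(3)))  # -> [3, 2, 1]
```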

> The difference in behaviour between comprehensions and
> generator expressions when it comes to embedded function calls that
> trigger StopIteration is a special case of that more general
> difference.

I don't know what you mean by this.

> This is a problem unique to generators, it does not affect any other
> iterator (since explicit __next__ method implementations do not use
> yield).

Incorrect. The problem is not unique to generators and the yield is
irrelevant. The problem (insofar as it is one) is a problem for all
iterators, since all iterators interact with iterator-consumers and it
is the iterator-consumer that catches the StopIteration. Here is an
example using map:

>>> def func(x):
...     if x < 0:
...         raise StopIteration
...     return x ** 2
>>> it = map(func, [1, 2, 3, -1, 2])
>>> list(it)
[1, 4, 9]
>>> list(it)  # map continues to yield values...
[4]

Ordinarily map terminates when the underlying iterable is exhausted
and raises StopIteration. Since map simply allows a StopIteration
raised anywhere to bubble up, it will also allow the StopIteration
raised by func to bubble up. The implicit effect is that map
cooperatively terminates iteration when any part of the code it
executes raises StopIteration. Note that it is the list() function
(the iterator-consumer) that actually catches the StopIteration and
terminates, just as it would if map were a generator:

>>> def map(func, iterable):
...     for item in iterable:
...         yield func(item) # func could raise StopIteration
>>> it = map(func, [1, 2, 3, -1, 2])
>>> list(it)
[1, 4, 9]
>>> list(it) # The exception terminated the generator frame
[]

In principle that could be a useful feature in some cases. In practice
it is more likely to be something that masks a bug manifesting as a
stray StopIteration, e.g.:

>>> def first_line(filename):
...     with open(filename) as fin:
...         return next(fin)  # raises StopIteration if the file is empty!
>>> for line in map(first_line, filenames):
...     print(line)

In this case it is the for-loop (the iterator-consumer) that catches
the StopIteration. Whether map is a generator or some other iterator
is irrelevant.
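For what it's worth, one defensive fix (my sketch, not something from
the thread) is to give next() a default, so an empty file produces a
sentinel value instead of leaking StopIteration into the consumer:

```python
def first_line(filename):
    """Return the first line of a file, or '' if the file is empty."""
    with open(filename) as fin:
        # Passing a default to next() means it never raises
        # StopIteration; '' is an arbitrary sentinel of my choosing.
        return next(fin, '')
```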

> The change in the PEP is to change that side effect such that
> those exceptions are converted to RuntimeError rather than silently
> suppressed - making generator function bodies behave less like
> __next__ method implementations.

Fine, but generator function bodies are __next__ method
implementations. You can claim that the generator function implements
__iter__, but its body implements the __next__ method of the generator
it returns. It is precisely because it is a __next__ method that it
finds itself being called by an iterator-consumer, and so finds that a
loose StopIteration is suppressed by the consumer. This is the same
for all iterators and there is nothing special about generators in
this regard (although the PEP will change that a little).
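The claim is easy to demonstrate: a generator function and an explicit
iterator class running the same body are indistinguishable to a
consumer (the names below are mine, for illustration):

```python
def squares_gen(items):
    # Generator function: the body runs when __next__ is called.
    for x in items:
        yield x ** 2

class SquaresIter:
    # The same body written out as an explicit __next__ method.
    def __init__(self, items):
        self._it = iter(items)
    def __iter__(self):
        return self
    def __next__(self):
        return next(self._it) ** 2

# A consumer such as list() cannot tell the two apart.
print(list(squares_gen([1, 2, 3])))   # -> [1, 4, 9]
print(list(SquaresIter([1, 2, 3])))   # -> [1, 4, 9]
```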

To summarise succinctly: a loose StopIteration can have an unintended
interaction with the iterator protocol, and more specifically with
iterator-consumers. The effect is as if the iterator had been
exhausted, since the consumer stops iterating. Because generators are
iterators they (like all iterators) are affected by this.
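As a sketch of what the PEP proposes (and which later became the
default behaviour from Python 3.7 onwards): a StopIteration escaping a
generator body is converted to RuntimeError instead of silently ending
iteration.

```python
def gen():
    yield 1
    raise StopIteration  # a "loose" StopIteration in the body

it = gen()
print(next(it))  # -> 1
try:
    next(it)  # under PEP 479 the loose StopIteration is not
              # swallowed by the consumer; it becomes RuntimeError
except RuntimeError as exc:
    print('RuntimeError:', exc)
```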

