[Python-ideas] Propagating StopIteration value

Guido van Rossum guido at python.org
Tue Oct 9 01:47:23 CEST 2012

On Mon, Oct 8, 2012 at 4:24 PM, Oscar Benjamin
<oscar.j.benjamin at gmail.com> wrote:
> On 8 October 2012 00:36, Guido van Rossum <guido at python.org> wrote:
>> On Sun, Oct 7, 2012 at 3:43 PM, Oscar Benjamin
>> <oscar.j.benjamin at gmail.com> wrote:
>>> I think what Serhiy is saying is that although pep 380 mainly
>>> discusses generator functions it has effectively changed the
>>> definition of what it means to be an iterator for all iterators:
>>> previously an iterator was just something that yielded values but now
>>> it also returns a value. Since the meaning of an iterator has changed,
>>> functions that work with iterators need to be updated.
>> I think there are different philosophical viewpoints possible on that
>> issue. My own perspective is that there is no change in the definition
>> of iterator -- only in the definition of generator. Note that the
>> *ability* to attach a value to StopIteration is not new at all.
> I guess I'm viewing it from the perspective that an ordinary iterator
> is simply an iterator that happens to return None just like a function
> that doesn't bother to return anything. If I understand correctly,
> though, it is possible for any iterator to return a value that yield
> from would propagate, so the feature (returning a value) is not
> specific to generators.

Substitute "pass a value via StopIteration" and I'll agree that it is
possible for any iterator, not just generators.

I still don't think it is all that useful, nor that it should be
encouraged (outside the use case of coroutines).
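The point being debated here can be made concrete. The sketch below (not part of the original thread) shows that a plain iterator class, not only a generator, can attach a value to StopIteration, and that "yield from" picks that value up as the value of the expression:

```python
# Hypothetical example: a hand-written iterator that attaches a value
# (its item count) to the StopIteration it raises on exhaustion.

class CountingIterator:
    """Yields items from an iterable, then reports the count via StopIteration."""

    def __init__(self, items):
        self._it = iter(items)
        self._count = 0

    def __iter__(self):
        return self

    def __next__(self):
        try:
            item = next(self._it)
        except StopIteration:
            # The value rides along on the exception, just as a
            # generator's return value does.
            raise StopIteration(self._count) from None
        self._count += 1
        return item

def consume():
    # "yield from" delegates to the iterator and captures the value
    # attached to its StopIteration.
    count = yield from CountingIterator(['a', 'b', 'c'])
    yield count

print(list(consume()))  # ['a', 'b', 'c', 3]
```

Note that StopIteration is raised here inside an ordinary `__next__` method, not inside a generator body, so this remains valid even after PEP 479 tightened the rules for generators.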

>>> This feature was new in Python 3.3 which was released a week ago
>> It's been in alpha/beta/candidate for a long time, and PEP 380 was
>> first discussed in 2009.
>>> so it is not widely used but it has uses that are not anything to do with
>>> coroutines.
>> Yes, as a shortcut for "for x in <iterator>: yield x". Note that the
>> for-loop ignores the value in the StopIteration -- would you want to
>> change that too?
> Not really. I thought about how it could be changed. Once APIs are
> available that use this feature to communicate important information,
> use cases will arise for using the same APIs outside of a coroutine
> context. I'm not really sure how you could get the value from a for
> loop. I guess it would have to be tied to the else clause in some way.

Given the elusive nature of StopIteration (many operations catch and
ignore it, and that's the main intended use) I don't think it should
be used to pass along *important* information except for the specific
case of coroutines, where the normal use case is to use .send()
instead of .__next__() and to catch the StopIteration exception.
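The coroutine pattern described in the preceding paragraph can be sketched as follows (an illustration, not code from the thread): the caller drives the generator with .send() and retrieves the return value from the StopIteration raised on completion:

```python
# Hypothetical example: a running-average coroutine whose final result
# travels back to the caller via StopIteration.value.

def averager():
    total = 0.0
    count = 0
    while True:
        value = yield
        if value is None:    # sentinel: caller is finished sending
            break
        total += value
        count += 1
    return total / count     # delivered via StopIteration

coro = averager()
next(coro)                   # prime the coroutine to the first yield
coro.send(10)
coro.send(20)
try:
    coro.send(None)          # signal completion
except StopIteration as exc:
    result = exc.value

print(result)                # 15.0
```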

>>> As an example of how you could use it, consider parsing a
>>> file that can contain #include statements. When the #include
>>> statement is encountered we need to insert the contents of the
>>> included file. This is easy to do with a recursive generator. The
>>> example uses the return value of the generator to keep track of which
>>> line is being parsed in relation to the flattened output file:
>>> def parse(filename, output_lineno=0):
>>>     with open(filename) as fin:
>>>         for input_lineno, line in enumerate(fin):
>>>             if line.startswith('#include '):
>>>                 subfilename = line.split()[1]
>>>                 output_lineno = yield from parse(subfilename, output_lineno)
>>>             else:
>>>                 try:
>>>                     yield parse_line(line)
>>>                 except ParseLineError:
>>>                     raise ParseError(filename, input_lineno, output_lineno)
>>>                 output_lineno += 1
>>>     return output_lineno
>> Hm. This example looks constructed to prove your point... It would be
>> easier to count the output lines in the caller. Or you could use a
>> class to hold that state. I think it's just a bad habit to start using
>> the return value for this purpose. Please use the same approach as you
>> would before 3.3, using "yield from" just as the shortcut I mentioned
>> above.
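For contrast, here is a sketch of the alternative Guido suggests: keep "yield from" as a plain delegation shortcut and let the caller do the counting with enumerate(). This is not code from the thread; the FILES dict stands in for the filesystem and parse_line is a trivial stub so the example is self-contained:

```python
# Hypothetical example: counting output lines in the caller rather than
# threading the count through generator return values.

FILES = {
    'main.txt': ['#include lib.txt', 'b'],
    'lib.txt': ['a'],
}

def parse_line(line):        # stub for the parse_line in the quoted example
    return line.strip()

def parse(name):
    for line in FILES[name]:
        if line.startswith('#include '):
            yield from parse(line.split()[1])   # plain delegation
        else:
            yield parse_line(line)

# The caller tracks the flattened output line numbers itself:
for output_lineno, item in enumerate(parse('main.txt')):
    print(output_lineno, item)   # 0 a / 1 b
```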
> I'll admit that the example is contrived but it's to think about how
> to use the new feature rather than to prove a point (Otherwise I would
> have contrived a reason for wanting to use filter()). I just wanted to
> demonstrate that people can (and will) use this outside of a coroutine
> context.

Just that they will use it doesn't make it a good idea. I claim it's a
bad idea and I don't think you're close to convincing me otherwise.

> Also I envisage something like this being a common use case. The
> 'yield from' expression can only provide information to its immediate
> caller by returning a value attached to StopIteration or by raising a
> different type of exception. There will be many cases where people
> want to get some information about what was yielded/done by 'yield
> from' at the point where it is used.

Maybe. But I think we should wait a few years before we conclude that
we made a mistake. The story of iterators and generators has evolved
in many small steps, each informed by how the previous step turned
out. It's way too soon to say that the existence of yield-from
requires us to change all the other iterator algebra to preserve the
value from StopIteration.

I'll happily take this discussion up again after we've used it for a
couple of years though!

--Guido van Rossum (python.org/~guido)