On 2019-12-08 12:38 p.m., Chris Angelico wrote:
On Mon, Dec 9, 2019 at 1:57 AM Oscar Benjamin wrote:
On Sun, 8 Dec 2019 at 14:37, Chris Angelico wrote:
PEP 479 (https://www.python.org/dev/peps/pep-0479/) changed the rules around generators: if a StopIteration would otherwise leak out of a generator, a RuntimeError is raised instead. This converts hard-to-debug premature termination into easily spotted exceptions, but it applies only to actual generator functions.
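[Editor's note: a minimal sketch of the behaviour PEP 479 mandates, assuming Python 3.7 or later where it is the default. The generator here is illustrative only.]

```python
# Under PEP 479, a StopIteration raised inside a generator body no longer
# ends the generator silently -- it is re-raised as RuntimeError.

def gen():
    yield 1
    raise StopIteration  # before PEP 479: silently ended the generator
    yield 2              # unreachable either way

try:
    list(gen())
except RuntimeError as exc:
    print("leak surfaced as:", type(exc).__name__)
```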
[snip]
I propose to grant map() PEP 479 semantics - namely, that a StopIteration raised during the call of the mapped function be translated into a RuntimeError. Likewise for filter(), guarding the predicate function, and all similar functions in itertools: accumulate, filterfalse, takewhile/dropwhile, starmap, and any that I didn't notice.
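[Editor's note: a sketch of the current semantics the proposal would change. Today, a StopIteration leaking out of the mapped function silently truncates the map; under the proposal it would become a RuntimeError. The function name is illustrative.]

```python
# Current behaviour: StopIteration escaping the mapped function propagates
# out of map.__next__(), so list() treats it as normal end-of-iteration.

def limited(x):
    if x == 3:
        raise StopIteration  # leaks out of map(), ending it early
    return x * 10

print(list(map(limited, range(10))))  # [0, 10, 20] -- silent truncation
```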
The problem here is that, for generators, there is a single place in the interpreter where this can be fixed, as part of the definition of the language. With iterators, there are many iterator tools, including third-party code outside the stdlib. They won't all be fixed, so bare next() would still have to be discouraged in most uses, and there would still be no drop-in replacement for someone who just wants next() without a default value.
Yes, not everything can be fixed. I don't think that means we shouldn't fix the things that ARE under our control - that is, the standard library.
If someone's using next without a default value, there are three likely possibilities:
1) The intention was specifically to reraise the StopIteration. Applicable only within a __next__ function, and still valid.
2) The intention was to have some other behaviour, and it's wrapped in try/except right at the call site.
3) The programmer never even thought about it, and is assuming the iterator is not empty/exhausted.
I use bare next() without a default value in tests sometimes, personally. I think it works pretty well for that.
Most consumers of iterables should be handling StopIteration right there (either by using a for loop, or with an actual try/except). The first() function that led to this proposal is a perfect example: it would be correct for it to raise ValueError on an empty iterable, so it should try/except the StopIteration and raise a different exception.
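[Editor's note: a sketch of the first() helper described above, catching StopIteration at the call site and re-raising a clearer exception. The function name and error message are illustrative, not a stdlib API.]

```python
def first(iterable):
    """Return the first item of *iterable*; raise ValueError if empty."""
    try:
        return next(iter(iterable))
    except StopIteration:
        # Translate the protocol-level exception into a caller-facing one,
        # so it cannot leak into surrounding iteration machinery.
        raise ValueError("first() called on an empty iterable") from None

print(first([10, 20, 30]))  # 10
```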
The trouble is that the third case is an extremely subtle one. It's all very well to state in the docs that bare next() is discouraged in most uses, but people will do it anyway, and the correct thing is for the mistake to surface as an exception. That's exactly what happens in most situations: in the body of a for loop, or in a list comp, the StopIteration comes right on out as an exception, and in a generator it becomes RuntimeError. The ONLY remaining problem is a __next__ function containing code that can raise or leak StopIteration without catching it.
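[Editor's note: a sketch of that remaining failure mode. The class below is hypothetical; its second bare next() leaks the inner StopIteration out of __next__, silently dropping the odd leftover element instead of reporting a bug.]

```python
class PairUp:
    """Yield consecutive pairs from an iterable; buggy on odd-length input."""

    def __init__(self, iterable):
        self._it = iter(iterable)

    def __iter__(self):
        return self

    def __next__(self):
        a = next(self._it)  # legitimate: end-of-data should propagate here
        b = next(self._it)  # leak: an odd leftover vanishes without a trace
        return (a, b)

print(list(PairUp([1, 2, 3, 4, 5])))  # [(1, 2), (3, 4)] -- the 5 is lost
```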
I think you will also find that much code depends on map() etc. behaving the way they currently do. When learning the iterator protocol, it seemed to me that being able to raise StopIteration from anywhere was a design feature.
From inside a mapped function? I'd like to see some examples here - a function that's designed to be called from inside map(), which deliberately raises or leaks StopIteration, intending to halt the map cleanly. (Or the equivalent with filter() etc.) Be aware that code like this would break if spelled any way other than map():
    def poof(x):
        if x == 5:
            raise StopIteration
        return x * 3

    short_list = list(map(poof, range(10)))   # silently truncated to 5 items
    boom = [poof(x) for x in range(10)]       # raises StopIteration
    boom = list(poof(x) for x in range(10))   # raises RuntimeError (PEP 479)
    for x in range(10):
        print(poof(x))                        # boom
I'd call code like this "fragile".
ChrisA