On 20.11.2014 03:24, Chris Angelico wrote:
On Thu, Nov 20, 2014 at 1:06 PM, Steven D'Aprano email@example.com wrote:
I trust that we all expect to be able to factor out the raise into a helper function or method, yes? It truly would be surprising if this failed:
    class MyIterator:
        def __iter__(self):
            return self
        def __next__(self):
            return something()
    def something():  # Toy helper function.
        if random.random() < 0.5:
            return "Spam!"
        raise StopIteration
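[A quick runnable check of the pattern above -- the helper and the class are as posted; only the driver at the bottom is added here to show that the raise inside the helper terminates iteration cleanly:]

    import random

    def something():  # Toy helper function.
        # Signals exhaustion the __next__ way: by raising StopIteration.
        if random.random() < 0.5:
            return "Spam!"
        raise StopIteration

    class MyIterator:
        def __iter__(self):
            return self
        def __next__(self):
            return something()

    # The StopIteration escaping __next__ ends the for loop normally;
    # PEP 479 does not change this, since no generator is involved.
    items = list(MyIterator())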
Now let's write this as a generator:
    def gen():
        while True:
            yield something()
which is much nicer than:
    def gen():
        while True:
            try:
                yield something()
            except StopIteration:
                return  # converted by Python into raise StopIteration
Sure. There was a suggestion that "return yield from something()" would work, though, which - I can't confirm that this works, but assuming it does - would be a lot tidier. But there's still a difference. Your first helper function was specifically a __next__ helper. It was tied intrinsically to the iterator protocol. If you want to call a __next__ helper (or actually call next(iter) on something) inside a generator, you'll have to - if this change goes through - cope with the fact that generator protocol says "return" where __next__ protocol says "raise StopIteration". If you want a generator helper, it'd look like this:
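[For anyone reading this later: the change under discussion became PEP 479, and it is the default behaviour since Python 3.7. A sketch of exactly the mismatch described above -- a __next__-style helper called from inside a generator, where the escaping StopIteration is now converted to RuntimeError:]

    import random

    def something():  # Toy helper function.
        # __next__-style helper: signals "done" by raising StopIteration.
        if random.random() < 0.5:
            return "Spam!"
        raise StopIteration

    def gen():
        # Naive translation: under PEP 479, the StopIteration escaping
        # the generator body is re-raised as RuntimeError.
        while True:
            yield something()

    try:
        list(gen())
        converted = False
    except RuntimeError:
        converted = True  # the StopIteration was converted, as PEP 479 specifies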
    def something():  # Toy helper function.
        if random.random() < 0.5:
            yield "Spam!"
    def gen():
        yield from something()
Voila! Now it's a generator helper, following generator protocol. Every bit as tidy as the original. Let's write a __getitem__ helper:
Hmm, I'm not convinced by these toy examples, but I did inspect some of my own code for incompatibility with the proposed change. I found that there really is only one recurring pattern I use that I'd have to change, and that is how I've implemented several file parsers. I tend to write them like this:
    def parser(file_object):
        while True:
            title_line = next(file_object)  # will terminate after the last record
            try:
                # read and process the rest of the record here
                ...
            except StopIteration:
                # this record is incomplete
                raise OSError('Invalid file format')
            yield processed_record
So I'm catching the StopIteration raised by the underlying IOWrapper only when it occurs in illegal places (with regard to the file format the parser expects), but not when it indicates the end of a correct file. I always thought of letting the error bubble up as a way to keep the parser transparent. Now, in this case, I think I would have to change this to:
    def parser(io_object):
        while True:
            try:
                title_line = next(io_object)
            except StopIteration:
                return
            ...
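[Not from the thread -- a minimal self-contained sketch of this rewritten parser, using a made-up two-line record format (a title line followed by exactly one data line) purely so it can be run end to end:]

    def parser(io_object):
        # PEP 479-safe version: normal end of input becomes a plain return,
        # while a StopIteration mid-record still signals a malformed file.
        while True:
            try:
                title_line = next(io_object)
            except StopIteration:
                return  # clean end of file
            try:
                # Hypothetical record body: exactly one data line per title.
                data_line = next(io_object)
            except StopIteration:
                # Record is incomplete: title seen, but file ended.
                raise OSError('Invalid file format')
            yield (title_line.strip(), data_line.strip())

    records = list(parser(iter(['>rec1\n', 'AAAA\n', '>rec2\n', 'CCCC\n'])))
    # records == [('>rec1', 'AAAA'), ('>rec2', 'CCCC')]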
which I could certainly do without too much effort, but could this be one of the more widespread sources of incompatibility that Steve imagines?