On Thu, Oct 20, 2016 at 3:38 AM, Random832 wrote:
> On Wed, Oct 19, 2016, at 11:51, Yury Selivanov wrote:
>> I'm -1 on the idea. Here's why:
>>
>> 1. Python is a very dynamic language with GC and that is one of its
>> fundamental properties. This proposal might make GC of iterators more
>> deterministic, but that is only one case.
>
> There is a huge difference between wanting deterministic GC and wanting
> cleanup code to be called deterministically. We're not talking about
> memory usage here.
Currently, iterators get passed around casually - you can build on them,
derive from them, etc, etc, etc. If you change the 'for' loop to
explicitly close an iterator, will you also change 'yield from'? What
about other forms of iteration? Will the iterator be closed when it runs
out normally?

This proposal is to iterators what 'with' is to open files and other
resources. I can build on top of an open file fairly easily:

@contextlib.contextmanager
def file_with_header(fn):
    with open(fn, "w") as f:
        f.write("Header Row")
        yield f

def main():
    with file_with_header("asdf") as f:
        """do stuff"""

I create a context manager based on another context manager, and I have
a guarantee that the end of the main() 'with' block is going to properly
close the file. Now, what happens if I do something similar with an
iterator?

def every_second(it):
    try:
        next(it)
    except StopIteration:
        return
    for value in it:
        yield value
        try:
            next(it)
        except StopIteration:
            break

This will work, because it's built on a 'for' loop. What if it's built
on a 'while' loop instead?

def every_second_broken(it):
    try:
        while True:
            next(it)
            yield next(it)
    except StopIteration:
        pass

Now it *won't* correctly call the end-of-iteration function, because
there's no 'for' loop. This is going to either (a) require that EVERY
consumer of an iterator follow this new protocol, or (b) introduce a ton
of edge cases.

ChrisA
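[Editor's note: for context, the deterministic cleanup being debated can
already be opted into today, without changing 'for', by scoping the
iterator in a 'with' block via contextlib.closing. A minimal sketch -
the generator name and the events list are made up for illustration:]

```python
import contextlib

events = []

def numbers():
    # Hypothetical generator; the finally clause stands in for any
    # cleanup code (closing a file, releasing a lock, ...).
    try:
        yield 1
        yield 2
        yield 3
    finally:
        events.append("cleanup")

# contextlib.closing guarantees close() is called when the block exits,
# even though the for loop below abandons the iterator early. close()
# raises GeneratorExit inside the generator, running the finally clause.
with contextlib.closing(numbers()) as it:
    for value in it:
        if value == 2:
            break

# events is now ["cleanup"]
```

[The trade-off is that the caller must remember the 'with' block; the
proposal under discussion would instead push that burden onto every
consumer of the iterator protocol, which is the objection above.]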