use context managers for new-style "for" statement

An idea occurred to me about another way to achieve "new-style" for statements that finalize generators properly:

1. Modify for statements to accept context managers in addition to iterables. If the for statement gets a context manager, it calls __exit__ when the loop terminates for any reason. Otherwise, it does what it does now.

2. Add an optional __throw__ method to context managers. If the context manager has a __throw__ method, the for statement forwards uncaught exceptions within its body to the context manager.

Thus, you can use:

    for i in closing(gen(x)):

if you want the generator closed automatically. This would also work for files:

    for line in open(filename):

We might also add a new context manager to contextlib to do both the close and the throw. Maybe call it throwing_to?

    for i in throwing_to(gen(x)):

I would think that throwing_to would also do what closing does. This somewhat simplifies the common "with closing"/"for" pattern, and it adds support for the new close/throw generator methods without any new syntax.

Comments?

-bruce frederiksen
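A rough sketch, purely for illustration, of what "for i in closing(gen(x)): <body>" might expand to under this proposal. Nothing here is existing syntax or stdlib behaviour; __throw__ is the hypothetical optional method described above, and gen() is just a placeholder generator so the sketch runs:

    from contextlib import closing

    def gen(x):
        # placeholder generator with cleanup to finalize
        try:
            for i in range(x):
                yield i
        finally:
            print("generator closed")

    mgr = closing(gen(3))
    it = iter(mgr.__enter__())
    try:
        while True:
            try:
                i = next(it)
            except StopIteration:
                break
            try:
                print(i)                      # stands in for the loop body
            except Exception as exc:
                if hasattr(mgr, '__throw__'):
                    mgr.__throw__(exc)        # forward uncaught body exceptions
                else:
                    raise
    finally:
        mgr.__exit__(None, None, None)        # a real implementation would pass
                                              # exception details on error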

for i in closing(gen(x)):
if you want the generator closed automatically.
That doesn't really improve on what we have now:

    with closing(gen(x)) as g:
        for i in g:
            ...

The proposed syntax puts too much on one line and unnecessarily complicates another one of Python's fundamental tools.
This looks somewhat unattractive to my eyes.
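For comparison, a minimal runnable version of the existing idiom being referred to; gen() here is only a stand-in generator for the example:

    from contextlib import closing

    def gen(x):
        try:
            for i in range(x):
                yield i
        finally:
            print("cleanup ran")              # runs even when the loop is abandoned

    with closing(gen(10)) as g:
        for i in g:
            if i == 2:
                break                         # closing() still runs gen's finally clause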
Comments?
I think the "problem" you're solving isn't worth solving.

Raymond

    ## untested recipe
    def closeme(iterable):
        it = iter(iterable)
        try:
            for i in it:
                yield i
        finally:
            it.close()

    # doesn't this do the same thing without any interpreter magic?
    for i in closeme(gen(x)):
        ...

    for i in chain.from_iterable(map(closeme, [it1, it2, it3, it4])):
        ...

Raymond Hettinger wrote:
This relies on CPython's reference-counting collector and the fact that PEP 342 specifies that __del__ calls close on generators. It will not work reliably on Jython, IronPython, or PyPy because none of these have reference-counting collectors.
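A minimal sketch of that failure mode: the generator below is abandoned while suspended inside its try block, so whether the finally clause ever runs depends entirely on the garbage collector calling the generator's __del__ (prompt under CPython's reference counting, unpredictable elsewhere):

    def numbers():
        try:
            yield 1
            yield 2
        finally:
            print("finally clause ran")

    g = numbers()
    next(g)     # start the generator; it is now suspended inside the try block
    del g       # nothing explicitly calls g.close() -- cleanup happens only if
                # the garbage collector invokes the generator's __del__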
The closeme generator really adds nothing here, because it is just another generator that relies on either running off the end of the generator, or its close or throw methods being called, to activate the finally clause. This is identical to the generators that it is being mapped over. *Nothing* in Python is defined to call the close or throw methods on generators, except for the generator __del__ method -- and that is *only* called reliably in CPython, and not in any of the other Python implementations, which may never garbage collect the generator if it's allocated near the end of the program run!

I had generators with try/finally, and these fail on Jython and IronPython. What I ended up doing to get my program working on Jython was to convert all of my generators to return context managers. That way I could not accidentally forget to use a with statement with them. Thus:

    def gen(x):
        return itertools.chain.from_iterable(...)

    for i in gen(x):
        ...

becomes the following hack:

    class chain_context(object):
        def __init__(self, outer_it):
            self.outer_it = outer_iterable(outer_it)
        def __enter__(self):
            return itertools.chain.from_iterable(self.outer_it)
        def __exit__(self, type, value, tb):
            self.outer_it.close()

    class outer_iterable(object):
        def __init__(self, outer_it):
            self.outer_it = iter(outer_it)
            self.inner_it = None
        def __iter__(self):
            return self
        def close(self):
            if hasattr(self.inner_it, '__exit__'):
                self.inner_it.__exit__(None, None, None)
            elif hasattr(self.inner_it, 'close'):
                self.inner_it.close()
            if hasattr(self.outer_it, 'close'):
                self.outer_it.close()
        def next(self):
            ans = self.outer_it.next()
            if hasattr(ans, '__enter__'):
                self.inner_it = ans
                return ans.__enter__()
            ans = iter(ans)
            self.inner_it = ans
            return ans

    def gen(x):
        return chain_context(...)

    with gen(x) as it:
        for i in it:
            ...

Most of my generators used chain. Those that didn't went from:

    def gen(x):
        ...

to:

    def gen(x):
        def gen2(x):
            ...
        return contextlib.closing(gen2(x))

This got the program working on Jython in a way that future maintenance on the program can't screw up, but it sure doesn't feel "pythonic"...

-bruce frederiksen

On Fri, Feb 20, 2009, Raymond Hettinger wrote:
In addition to Bruce's other followup, saving a level of indentation does have some utility. That's not enough by itself, of course.

--
Aahz (aahz@pythoncraft.com)           <*>         http://www.pythoncraft.com/

Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.

Aahz wrote:
Yes, the *close* capability of generators can be exercised with an external "with closing" statement (except for itertools.chain, which has an inner generator that is inaccessible to the caller -- but this proposal doesn't fix that problem either). But the improvement comes when you want to exercise the *throw* capability of generators within a for statement.

Adding an optional __throw__ to context managers and honoring it in with statements doesn't let the generator convert the exception into a new yielded value to be tried ("oh, you didn't like the last value I yielded, try this one instead"), which is one use of generator.throw. And this solves both issues (close and throw) with the same mechanism, so it's conceptually simpler for the user than two separate solutions. It also naturally extends the context manager capability to encapsulate try/except patterns into re-usable context managers, in addition to encapsulating try/finally patterns. Here, I'm referring to the standard use of context managers in with statements, irrespective of the for statement.

Oh, and, yes, it does allow the programmer to elide the use of the with statement in combination with the for statement. In my practice, 90% of the with statements I use are this with/for pattern, so I imagine that this would also be appreciated by the Python community. But this is not the sole benefit, nor even the most important benefit. So while I might agree with "That's not enough by itself", I wish to point out that this last benefit is not "by itself".

-bruce frederiksen
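A small self-contained sketch (the names are invented for illustration) of the generator.throw() behaviour being described, where the generator catches the thrown exception and yields a replacement value for the caller to try:

    def values():
        # illustrative generator: on ValueError, offer a different value
        i = 0
        while True:
            try:
                yield i
                i += 1
            except ValueError:
                # the caller rejected the last value; skip ahead and try another
                i += 10

    g = values()
    print(next(g))              # -> 0
    print(g.throw(ValueError))  # generator converts the exception into a new value: 10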

participants (3)
- Aahz
- Bruce Frederiksen
- Raymond Hettinger