PEP 343 - Abstract Block Redux

I've written up the specs for my "PEP 340 redux" proposal as a separate PEP, PEP 343.

http://python.org/peps/pep-0343.html

Those who have been following the thread "Merging PEP 310 and PEP 340-redux?" will recognize my proposal in that thread, which received mostly positive responses there. Please review and ask for clarifications of anything that's unclear.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)

At 05:13 PM 5/13/2005 -0700, Guido van Rossum wrote:
I've written up the specs for my "PEP 340 redux" proposal as a separate PEP, PEP 343.
http://python.org/peps/pep-0343.html
Those who have been following the thread "Merging PEP 310 and PEP 340-redux?" will recognize my proposal in that thread, which received mostly positive responses there.
Please review and ask for clarifications of anything that's unclear.
May I suggest this alternative translation in the "Specification" section:

    abc = EXPR
    __args = ()  # pseudo-variable, not visible to the user

    try:
        VAR = abc.__enter__()
        try:
            BLOCK
        except:
            __args = sys.exc_info()
    finally:
        abc.__exit__(*__args)

While slightly more complex than the current translation, the current translation seems a bit misleading to me. OTOH, that may simply be because I see the *sys.exc_info() part and immediately wonder what happens when there was no exception, and sys.exc_info() contains some arbitrary previous data...

Also, one question: will the "do protocol" be added to built-in "resource" types? That is, locks, files, sockets, and so on? Or will there instead be "macros" like the "opening" and "locking" templates? I notice that grammatically, "do gerund" works a lot better than "do noun", so all of your examples are words like locking, blocking, opening, redirecting, and so on. This makes it seem awkward for e.g. "do self.__lock", which doesn't make any sense. But the extra call needed to make it "do locking(self.__lock)" seems sort of gratuitous.

It makes me wonder if "with" or "using" or some similar word that works better with nouns might be more appropriate, as then it would let us just add the resource protocol to common objects, and still read well. For example, a Decimal Context object might implement __enter__ by setting itself as the thread-local context, and __exit__ by restoring the previous context. "do aDecimalContext" doesn't make much sense, but "with aDecimalContext" or "using aDecimalContext" reads quite nicely.

[Phillip J. Eby]
May I suggest this alternative translation in the "Specification" section:
    abc = EXPR
    __args = ()  # pseudo-variable, not visible to the user

    try:
        VAR = abc.__enter__()
        try:
            BLOCK
        except:
            __args = sys.exc_info()
    finally:
        abc.__exit__(*__args)
Done (except you forgot to add a "raise" to the except clause).
While slightly more complex than the current translation, the current translation seems a bit misleading to me. OTOH, that may simply be because I see the *sys.exc_info() part and immediately wonder what happens when there was no exception, and sys.exc_info() contains some arbitrary previous data...
Right. Well, anyway, the actual implementation will just get the exception info from the try/finally infrastructure -- it's squirreled away somewhere on the stack even if sys.exc_info() (intentionally) doesn't have access to it.
Also, one question: will the "do protocol" be added to built-in "resource" types? That is, locks, files, sockets, and so on?
One person proposed that and it was shot down by Greg Ewing. I think it's better to require a separate wrapper.
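For illustration, a separate wrapper along those lines might look something like this (a sketch only; the locking name follows the templates used elsewhere in the thread, and nothing here is taken from the PEP itself):

    class locking(object):
        # Supplies the __enter__/__exit__ protocol on behalf of a lock,
        # instead of adding those methods to the lock type itself.
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
            return self.lock
        def __exit__(self, *exc_info):
            self.lock.release()

At the call site this would read "do locking(myLock):" (or "with locking(myLock):"), leaving the built-in lock type untouched.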
Or will there instead be "macros" like the "opening" and "locking" templates? I notice that grammatically, "do gerund" works a lot better than "do noun", so all of your examples are words like locking, blocking, opening, redirecting, and so on. This makes it seem awkward for e.g. "do self.__lock", which doesn't make any sense. But the extra call needed to make it "do locking(self.__lock)" seems sort of gratuitous.
Maybe. There seems to be a surge of proponents for 'do' at the moment.
It makes me wonder if "with" or "using" or some similar word that works better with nouns might be more appropriate, as then it would let us just add the resource protocol to common objects, and still read well. For example, a Decimal Context object might implement __enter__ by setting itself as the thread-local context, and __exit__ by restoring the previous context. "do aDecimalContext" doesn't make much sense, but "with aDecimalContext" or "using aDecimalContext" reads quite nicely.
Maybe. I think we ought to implement the basic mechanism first and then decide how fancy we want to get, so I'd rather not get into this in the PEP. I'll add this to the PEP. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
Also, one question: will the "do protocol" be added to built-in "resource" types? That is, locks, files, sockets, and so on?
One person proposed that and it was shot down by Greg Ewing. I think it's better to require a separate wrapper.
It depends on whether the resource is "reusable". It would be okay for locks since you can lock and unlock the same lock as many times as you want, but files and sockets can only be used once, so there has to be something else around them. Although if we use 'do', we might want to use wrappers anyway for readability, even if they're not semantically necessary.

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz

At 04:57 PM 5/16/2005 +1200, Greg Ewing wrote:
Guido van Rossum wrote:
Also, one question: will the "do protocol" be added to built-in "resource" types? That is, locks, files, sockets, and so on?
One person proposed that and it was shot down by Greg Ewing. I think it's better to require a separate wrapper.
It depends on whether the resource is "reusable".
Why? If "with" is a "scope statement", then it doesn't make any sense to use it with something you intend to reuse later. The statement itself is an assertion that you intend to "release" the resource at the end of the block, for whatever "release" means to that object. Releasing a file is obviously closing it, while releasing a lock is obviously unlocking it.

Phillip J. Eby wrote:
Why? If "with" is a "scope statement", then it doesn't make any sense to use it with something you intend to reuse later. The statement itself is an assertion that you intend to "release" the resource at the end of the block, for whatever "release" means to that object. Releasing a file is obviously closing it, while releasing a lock is obviously unlocking it.
I've stopped arguing against giving with-protocol methods to files, etc., since it was pointed out that you'll get an exception if you try to re-use one. I still think it's conceptually cleaner if the object you use to access the resource is created by the __enter__ method rather than being something pre-existing, but I'm willing to concede that PBP here.

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz

[Greg Ewing]
I've stopped arguing against giving with-protocol methods to files, etc., since it was pointed out that you'll get an exception if you try to re-use one.
I still think it's conceptually cleaner if the object you use to access the resource is created by the __enter__ method rather than being something pre-existing, but I'm willing to concede that PBP here.
PBP? Google finds "Python Browser Poseur" but no definition of IRC slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy Posse? -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[Greg Ewing]
I still think it's conceptually cleaner if the object you use to access the resource is created by the __enter__ method rather than being something pre-existing, but I'm willing to concede that PBP here.
PBP? Google finds "Python Browser Poseur" but no definition of IRC slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy Posse?
Previously Belaboured Point? (Just guessing from context here, but if I'm right, that's one acronym I'm going to have to remember. . .) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

On 5/17/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
Guido van Rossum wrote:
[Greg Ewing]
I still think it's conceptually cleaner if the object you use to access the resource is created by the __enter__ method rather than being something pre-existing, but I'm willing to concede that PBP here.
PBP? Google finds "Python Browser Poseur" but no definition of IRC slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy Posse?
Previously Belaboured Point? (Just guessing from context here, but if I'm right, that's one acronym I'm going to have to remember. . .)
Practicality Beats Purity, surely...? Paul

Paul Moore wrote:
On 5/17/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
Previously Belaboured Point? (Just guessing from context here, but if I'm right, that's one acronym I'm going to have to remember. . .)
Practicality Beats Purity, surely...?
D'oh! *slaps forehead* Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
Paul Moore wrote:
On 5/17/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
Previously Belaboured Point? (Just guessing from context here, but if I'm right, that's one acronym I'm going to have to remember. . .)
Practicality Beats Purity, surely...?
D'oh! *slaps forehead*
Cheers, Nick.
Hmmm... looks like Google needs a "Search Only in Python Terminology" radio button... Greg

Phillip J. Eby wrote:
This makes it seem awkward for e.g. "do self.__lock", which doesn't make any sense. But the extra call needed to make it "do locking(self.__lock)" seems sort of gratuitous.
How about

    do holding(self.__lock):
        ...
It makes me wonder if "with" or "using" or some similar word that works better with nouns might be more appropriate ... For example, a Decimal Context object might implement __enter__ by setting itself as the thread-local context, and __exit__ by restoring the previous context. "do aDecimalContext" doesn't make much sense, but "with aDecimalContext" or "using aDecimalContext" reads quite nicely.
It doesn't work so well when you don't already have an object with one obvious interpretation of what you want to do 'with' it, e.g. you have a pathname and you want to open a file. I've already argued against giving file objects __enter__ and __exit__ methods. And I'm -42 on giving them to strings. :-)

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz

At 04:51 PM 5/16/2005 +1200, Greg Ewing wrote:
Phillip J. Eby wrote:
This makes it seem awkward for e.g. "do self.__lock", which doesn't make any sense. But the extra call needed to make it "do locking(self.__lock)" seems sort of gratuitous.
How about
do holding(self.__lock):
I simply mean that having to have any wrapper at all for common cases seems silly.
It doesn't work so well when you don't already have an object with one obvious interpretation of what you want to do 'with' it, e.g. you have a pathname and you want to open a file.
Um, what's wrong with 'with open("filename") as f'?
I've already argued against giving file objects __enter__ and __exit__ methods. And I'm -42 on giving them to strings. :-)
If strings had them, __enter__ would return self, and __exit__ would do nothing. I fail to see a problem. :)
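Spelled out, the no-op protocol being described here would amount to nothing more than this (a sketch only):

    # Hypothetical "with protocol" methods for an immutable object
    # such as a string: entering yields the object itself, exiting
    # has nothing to release.
    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        pass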

Guido van Rossum wrote:
I've written up the specs for my "PEP 340 redux" proposal as a separate PEP, PEP 343.
http://python.org/peps/pep-0343.html
Those who have been following the thread "Merging PEP 310 and PEP 340-redux?" will recognize my proposal in that thread, which received mostly positive responses there.
Please review and ask for clarifications of anything that's unclear.
+1 here.

The stdout redirection example needs to be corrected to avoid yielding inside a try/finally though:

5. Redirect stdout temporarily:

        @do_template
        def redirecting_stdout(new_stdout):
            save_stdout = sys.stdout
            try:
                sys.stdout = new_stdout
            except:
                sys.stdout = save_stdout
                raise
            else:
                yield None
                sys.stdout = save_stdout

   Used as follows:

        do opening(filename, "w") as f:
            do redirecting_stdout(f):
                print "Hello world"

This could be left as the more elegant original if iterator finalisation (e.g. using a "__finish__()" slot) came in at the same time as user defined statements, allowing the above to be written naturally with try/finally.

Arnold deVos's HTML tagging example would need access to the exception information and could be rewritten as a class:

        class tag(object):
            def __init__(self, name):
                self.name = cgi.escape(name)
            def __enter__(self):
                print '<%s>' % self.name
                return self.name
            def __exit__(self, *exc_info):
                if not exc_info or exc_info[0] is None:
                    print '</%s>' % self.name

   Used as follows:

        do tag('html'):
            do tag('head'):
                do tag('title'):
                    print 'A web page'
            do tag('body'):
                for par in pars:
                    do tag('p'):
                        print par

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
http://boredomandlaziness.blogspot.com

[Nick Coghlan]
The stdout redirection example needs to be corrected to avoid yielding inside a try/finally though:
Thanks -- fixed now.
This could be left as the more elegant original if iterator finalisation (e.g. using a "__finish__()" slot) came in at the same time as user defined statements, allowing the above to be written naturally with try/finally.
Let's not try to tie this to other features. I tried that with PEP 340 and you know the mess it became. :-) -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
I've written up the specs for my "PEP 340 redux" proposal as a separate PEP, PEP 343.
http://python.org/peps/pep-0343.html
Those who have been following the thread "Merging PEP 310 and PEP 340-redux?" will recognize my proposal in that thread, which received mostly positive responses there.
Please review and ask for clarifications of anything that's unclear.
On the keyword front, the two keyword choices affect the naming conventions of templates differently, and I think need to be considered in that light.

The naming convention for 'do' is shown in the current PEP 343. The issue I've noticed with it is that *functions* read well, but methods don't because things get out of sequence. That is, "do locking(the_lock)" reads well, but "do the_lock.locking()" does not.

Whereas, using 'with', it can be written either way, and still read reasonably well ("with locked(the_lock)", "with the_lock.locked()").

The 'with' keyword also reads better if objects natively support use in 'with' blocks ("with the_lock", "with the_file"). Guido's concern regarding file objects being reused inappropriately can be dealt with in the file __enter__ method:

        def __enter__(self):
            if self.closed:
                raise RuntimeError, "Cannot reopen closed file handle"

Template generators have the exact same problem with reusability - the solution used there is raising a RuntimeError when __enter__() is called inappropriately. This would make sense as a standard idiom - if a statement template can't be reused, attempting to do so should trigger a RuntimeError the second time __enter__() is invoked.

For files, it may then become the common practice to keep pathnames around, rather than open file handles. When you actually needed access to the file, the existing "open" builtin would suffice:

        with open(filename, "rb") as f:
            for line in f:
                print line

I've written out the PEP 343 examples below, assuming types acquire native with statement support (including Decimal contexts - I also give PEP 343 style code for Decimal contexts).

PEP 343 examples: 'with' keyword, native support in objects

1. A template for ensuring that a lock, acquired at the start of a block, is released when the block is left:

        # New methods on lock objects
        def __enter__(self):
            self.acquire()

        def __exit__(self, *args):
            self.release()

   Used as follows:

        with myLock:
            # Code here executes with myLock held.  The lock is
            # guaranteed to be released when the block is left (even
            # if via return or by an uncaught exception).

2. A template for opening a file that ensures the file is closed when the block is left:

        # New methods on file objects
        def __enter__(self):
            if self.closed:
                raise RuntimeError, "Cannot reopen closed file handle"

        def __exit__(self, *args):
            self.close()

   Used as follows:

        with open("/etc/passwd") as f:
            for line in f:
                print line.rstrip()

3. A template for committing or rolling back a database transaction; this is written as a class rather than as a decorator since it requires access to the exception information:

        class transaction:
            def __init__(self, db):
                self.db = db
            def __enter__(self):
                self.db.begin()
            def __exit__(self, *args):
                if args and args[0] is not None:
                    self.db.rollback()
                else:
                    self.db.commit()

   Used as follows:

        with transaction(db):
            # Exceptions in this code cause a rollback

5. Redirect stdout temporarily:

        @with_template
        def redirected_stdout(new_stdout):
            save_stdout = sys.stdout
            sys.stdout = new_stdout
            yield None
            sys.stdout = save_stdout

   Used as follows:

        with open(filename, "w") as f:
            with redirected_stdout(f):
                print "Hello world"

   This isn't thread-safe, of course, but neither is doing this same dance manually. In a single-threaded program (e.g., a script) it is a totally fine way of doing things.

6. A variant on opening() that also returns an error condition:

        @with_template
        def open_w_error(filename, mode="r"):
            try:
                f = open(filename, mode)
            except IOError, err:
                yield None, err
            else:
                yield f, None
                f.close()

   Used as follows:

        with open_w_error("/etc/passwd", "a") as f, err:
            if err:
                print "IOError:", err
            else:
                f.write("guido::0:0::/:/bin/sh\n")

7. Another useful example would be an operation that blocks signals. The use could be like this:

        from signal import blocked_signals

        with blocked_signals():
            # code executed without worrying about signals

   An optional argument might be a list of signals to be blocked; by default all signals are blocked. The implementation is left as an exercise to the reader.

8. Another use for this feature is the Decimal context.

        # New methods on decimal Context objects
        def __enter__(self):
            self._old_context = getcontext()
            setcontext(self)

        def __exit__(self, *args):
            setcontext(self._old_context)

   Used as follows:

        with decimal.Context(precision=28):
            # Code here executes with the given context
            # The context always reverts after this statement

   For comparison, the equivalent PEP 343 code is:

        @do_template
        def with_context(context):
            old_context = getcontext()
            setcontext(context)
            yield None
            setcontext(old_context)

   Used as:

        do decimal.with_context(decimal.Context(precision=28)):
            # Code here executes with the given context
            # The context always reverts after this statement

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
The naming convention for 'do' is shown in the current PEP 343. The issue I've noticed with it is that *functions* read well, but methods don't because things get out of sequence. That is, "do locking(the_lock)" reads well, but "do the_lock.locking()" does not.
Whereas, using 'with', it can be written either way, and still read reasonably well ("with locked(the_lock)", "with the_lock.locked()").
The 'with' keyword also reads better if objects natively support use in 'with' blocks ("with the_lock", "with the_file").
Guido's concern regarding file objects being reused inappropriately can be dealt with in the file __enter__ method:
    def __enter__(self):
        if self.closed:
            raise RuntimeError, "Cannot reopen closed file handle"
For files, it may then become the common practice to keep pathnames around, rather than open file handles. When you actually needed access to the file, the existing "open" builtin would suffice:
    with open(filename, "rb") as f:
        for line in f:
            print line
I think I'm starting to agree. Currently about +0.6 in favour of 'with' now, especially if this is to be almost exclusively a resource-acquisition statement, as all our use cases seem to be.

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz

Guido van Rossum wrote:
I've written up the specs for my "PEP 340 redux" proposal as a separate PEP, PEP 343.
http://python.org/peps/pep-0343.html
Those who have been following the thread "Merging PEP 310 and PEP 340-redux?" will recognize my proposal in that thread, which received mostly positive responses there.
Please review and ask for clarifications of anything that's unclear.
intuitively, I'm -1 on this proposal.

unlike the original design, all you get from this is the ability to add try/finally blocks to your code without ever writing a try/finally-clause (neither in your code or in the block controller). that doesn't strike me as especially pythonic.

(neither does the argument that just because you can use a mechanism to write inscrutable code, such a mechanism must not be made available feel right; Python's design has always been about careful tradeoffs between policy and mechanism, but this is too much policy for my taste. the original PEP 340 might have been too clever, but this reduced version feels pretty pointless).

</F>

Fredrik Lundh wrote:
unlike the original design, all you get from this is the ability to add try/finally blocks to your code without ever writing a try/finally-clause (neither in your code or in the block controller). that doesn't strike me as especially pythonic.
I think the key benefit relates to the fact that correctly written resource management code currently has to be split into two pieces - the first piece before the try block (e.g. 'lock.acquire()', 'f = open()'), and the latter in the finally clause (e.g. 'lock.release()', 'f.close()').

PEP 343 (like PEP 310 before it) makes it possible to define the correct resource management *once*, and then invoke it via a 'with' (or 'do') statement. Instead of having to check for "is this file closed properly?", as soon as you write or see "with open(filename) as f:", you *know* that that file is going to be closed correctly.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
http://boredomandlaziness.blogspot.com
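As a concrete illustration of the split being described (a sketch only; the second form uses the proposed statement, so it does not run on current Python):

    # Today: the two halves of the cleanup are separated by the body
    f = open(filename)
    try:
        for line in f:
            print line
    finally:
        f.close()

    # Under PEP 343/310, the pairing is defined once (in __enter__/__exit__
    # or a template) and the call site collapses to:
    with open(filename) as f:
        for line in f:
            print line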

Nick Coghlan wrote:
I think the key benefit relates to the fact that correctly written resource management code currently has to be split into two pieces - the first piece before the try block (e.g. 'lock.acquire()', 'f = open()'), and the latter in the finally clause (e.g. 'lock.release()', 'f.close()').
PEP 343 (like PEP 310 before it) makes it possible to define the correct resource management *once*, and then invoke it via a 'with' (or 'do') statement.
sure, but even if you look at both the application code *and* the resource management, there are no clues that the "with" statement is really just a masked "try/finally" statement. just look at the generator example:

    acquire
    yield
    release

what in this snippet tells you that the "release" part will run even if the external block raises an exception? you could at least change that to

    acquire
    try:
        yield
    finally:
        release

which would make it a lot more obvious what's going on here.

also, come to think of it, adding a new statement just to hide try/finally statements is a waste of statement space. why not just enhance the existing try statement? let

    try with opening(file) as f:
        body
    except IOError:
        deal with the error (you have to do this anyway)

behave like

    try:
        f = opening(file)
        try:
            try:
                body
            except:
                exc = sys.exc_info()
            else:
                exc = None
        finally:
            f.__cleanup__(exc)
    except IOError:
        deal with the error

and you're done. (or you could use __enter__ and __exit__ instead, which would give you a variation of PEP-343-as-I-understand-it)

compared to a separate statement, the worst that can happen here, usage-wise, is that you'll end up adding an "except: raise" line here and there to propagate real exceptions rather than dealing with them in place. on the other hand, for the cases where you actually want to deal with the exceptions, you'll save a line of code. I think that's a net win.

but I still think that something closer to the original PEP 340 is a lot more useful.

</F>

On 5/14/05, Fredrik Lundh <fredrik@pythonware.com> wrote:
Nick Coghlan wrote:
PEP 343 (like PEP 310 before it) makes it possible to define the correct resource management *once*, and then invoke it via a 'with' (or 'do') statement.
This is probably the main point for me - encapsulate the try...finally dance in such a way that the two parts are not separated by an (arbitrarily long) chunk of code. I hated the equivalent dances in C (usually malloc/free stuff in that case) and it feels awkward that this is the one real ugliness of C that Python hasn't fixed for me :-)
sure, but even if you look at both the application code *and* the resource management, there are no clues that the "with" statement is really just a masked "try/finally" statement. just look at the generator example:
    acquire
    yield
    release
what in this snippet tells you that the "release" part will run even if the external block raises an exception?
I agree with this point, though. What I liked about the original PEP 340 was the fact that the generator was a template, with "yield" acting as a "put the block here" placeholder.
but I still think that something closer to the original PEP 340 is a lot more useful.
Overall, I'd agree. I'm still fond of the original PEP 340 in all its glory - the looping issue was a wart, but PEP 343 seems so stripped down as to be something entirely different, not just a fix to the looping issue.

My view - PEP 343 gets a +1 in preference to PEP 310. I'd like to see PEP 342 - that gets a +1 from me. Covering both these areas at once, PEP 340 would still probably be my preference, though. (I'm not convinced there's much chance of resurrecting it, though). Even in its limited form, I prefer PEP 343 to the status quo, though.

Oh, and I'm leaning towards "with" as a keyword again, as a result of the "works better with member functions" argument.

Paul.

Paul Moore wrote:
On 5/14/05, Fredrik Lundh <fredrik@pythonware.com> wrote:
Nick Coghlan wrote:
PEP 343 (like PEP 310 before it) makes it possible to define the correct resource management *once*, and then invoke it via a 'with' (or 'do') statement.
This is probably the main point for me - encapsulate the try...finally dance in such a way that the two parts are not separated by an (arbitrarily long) chunk of code. I hated the equivalent dances in C (usually malloc/free stuff in that case) and it feels awkward that this is the one real ugliness of C that Python hasn't fixed for me :-)
sure, but even if you look at both the application code *and* the resource management, there are no clues that the "with" statement is really just a masked "try/finally" statement. just look at the generator example:
    acquire
    yield
    release
what in this snippet tells you that the "release" part will run even if the external block raises an exception?
I agree with this point, though. What I liked about the original PEP 340 was the fact that the generator was a template, with "yield" acting as a "put the block here" placeholder.
I also think the "generator as statement template" works much better if the __exit__ method is able to inject the exception into the generator frame, rather than always calling next(). Maybe PEP 343 should drop any suggestion of using generators to define these things, and focus on the PEP 310 style templates. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

[Nick Coghlan]
Maybe PEP 343 should drop any suggestion of using generators to define these things, and focus on the PEP 310 style templates.
But then the reason for separating VAR from EXPR becomes unclear. Several people have mentioned that they thought this was "a good idea on its own", but without giving additional use cases. Without the ability to write the acquire/release template as a generator, the big question is, "why not just PEP 310" ? -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
But then the reason for separating VAR from EXPR becomes unclear. Several people have mentioned that they thought this was "a good idea on its own", but without giving additional use cases. Without the ability to write the acquire/release template as a generator, the big question is, "why not just PEP 310" ?
Here's another use case: In PyGUI, in order to abstract the various ways that different platforms deal with drawing contexts, my widgets currently have a method

    widget.with_canvas(func)

where you define func as

    def func(canvas):
        # do drawing operations on the canvas

The canvas is a separate object from the widget so that it's harder to make the mistake of trying to draw to the widget outside of the appropriate context. The with-statement form of this would be

    with widget.canvas() as c:
        # do drawing operations on c

Keeping the VAR and EXPR separate in this case better reflects the semantics of the original with_canvas() function. The canvas is strictly a local object produced as a result of executing the __enter__ method, which helps ensure that the correct protocol is followed -- if you haven't called __enter__, you don't have a canvas, so you can't do any drawing.

On the other hand, this leads to some awkwardness in the naming conventions. The canvas() method, despite its name, doesn't actually return a canvas, but another object which, when used in the right way, produces a canvas. In general, methods for use in a with-statement will need to be named according to the object which is bound to the VAR, rather than what they actually return.

I'm not sure whether this is a problem or not. There's already a similar situation with generators, which are more usefully named according to what they yield, rather than what they return when considered as a function. I just feel a bit uncomfortable giving my widgets a function called canvas() that doesn't return a canvas.

The alternative is that the canvas() method *does* return a canvas, with __enter__ and __exit__ methods, and the rule that you have to use the appropriate protocol before you can use it. This would avoid most of the aforementioned problems. So I think I may have just talked myself out of what I was originally intending to argue!

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz
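A rough sketch of the first variant described above - a canvas() method returning a small helper object that implements the protocol. The _begin_drawing/_end_drawing calls are placeholders for whatever platform-specific setup PyGUI actually performs, not its real API:

    class _CanvasContext(object):
        # The canvas only exists between __enter__ and __exit__.
        def __init__(self, widget):
            self.widget = widget
            self.canvas = None
        def __enter__(self):
            self.canvas = self.widget._begin_drawing()   # placeholder
            return self.canvas
        def __exit__(self, *exc_info):
            self.widget._end_drawing(self.canvas)        # placeholder
            self.canvas = None

    class Widget(object):
        def canvas(self):
            return _CanvasContext(self)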

At 01:55 PM 5/14/2005 +0200, Fredrik Lundh wrote:
also, come to think of it, adding a new statement just to hide try/finally statements is a waste of statement space. why not just enhance the existing try statement? let
    try with opening(file) as f:
        body
    except IOError:
        deal with the error (you have to do this anyway)
behave like
    try:
        f = opening(file)
        try:
            try:
                body
            except:
                exc = sys.exc_info()
            else:
                exc = None
        finally:
            f.__cleanup__(exc)
    except IOError:
        deal with the error
and you're done. (or you could use __enter__ and __exit__ instead, which would give you a variation of PEP-343-as-I-understand-it)
I like this, if you take out the "with" part, change the method names to __try__ and __finally__, and allow "try" to work as a block on its own if you've specified an expression. i.e.:

    try opening(filename) as f:
        # do stuff

    try locking(self.__lock):
        # do stuff

    try redirecting_stdout(f):
        # something

    try decimal.Context(precision=23):
        # okay, this one's a little weird

    try self.__lock:
        # and so's this; nouns read better with a gerund wrapper

and I'd make the translation be:

    try:
        __exc = ()
        VAR = EXPR.__try__()
        try:
            try:
                BODY
            except:
                __exc = sys.exc_info()
                raise
        finally:
            EXPR.__finally__()
    # except/else/finally clauses here, if there were any in the original try
but I still think that something closer to the original PEP 340 is a lot more useful.
I agree, but mainly because I'd like to be able to allow try/finally around "yield" in generators, be able to pass exceptions into generators, and tell generators to release their resources. :) I do think that the PEP 340 template concept is much more elegant than the various PEP 310-derived approaches, though.

Fredrik Lundh wrote:
    try with opening(file) as f:
        body
    except IOError:
        deal with the error (you have to do this anyway)
You don't usually want to do it right *there*, though. More likely you'll have something further up that deals with a variety of possible errors.

--
Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand
greg.ewing@canterbury.ac.nz

[Fredrik Lundh]
intuitively, I'm -1 on this proposal.
So we need to do better. Do you just prefer all of PEP 340? What about the objections against it? The mostly unnecessary loopiness in particular?
unlike the original design, all you get from this is the ability to add try/finally blocks to your code without ever writing a try/finally-clause (neither in your code or in the block controller). that doesn't strike me as especially pythonic.
Would it be better if we pulled back in the generator exit handling from PEP 340? That's a pretty self-contained thing, and would let you write try/finally around the yield.
(neither does the argument that just because you can use a mechanism to write inscrutable code, such a mechanism must not be made available feel right; Python's design has always been about careful tradeoffs between policy and mechanism, but this is too much policy for my taste. the original PEP 340 might have been too clever, but this reduced version feels pretty pointless).
Maybe. It still solves the majority of use cases for PEP 340, most of which are try/finally abstractions. Maybe I'm overreacting to Raymond Chen's rant about flow-control macros -- but having had to maintain code once that was riddled with these, it rang very true. PEP 340 is still my favorite, but it seems there's too much opposition to it, so I'm trying to explore alternatives; at the same time I *really* dislike the complexities of some of the non-looping counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that make every keyword associated with 'try' a method). -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[Fredrik Lundh]
intuitively, I'm -1 on this proposal.
Just to toss in my opinion, I prefer PEP 340 over 343 as well, but not so much as to give 343 a -1 from me. [SNIP - question of how to handle argument against 340 being a loop which I never totally got since you know it ahead of time so you just deal with it] [SNIP]
Maybe I'm overreacting to Raymond Chen's rant about flow-control macros -- but having had to maintain code once that was riddled with these, it rang very true.
You might be overreacting. I read the essay and I understand his arguments having been taught how to program in Scheme. But I don't think this is as serious. People will have the semantics of the block statement explained to them so how that works will be clear. And at that point they just have to track down the generator or iterator that the block statement is using. If you think about it, how is it different than the implicit iter() call on a 'for' loop along with the implicit next() call each time through the loop? Just because there is an implicit closing call back into the block generator at the end of a block statement? Doesn't seem so bad to me or that much of a stretch from the magic of a 'for' loop to be that huge of a thing. I think Raymond was reeling against arbitrary macro creation that hides flow control. We don't have that here. What we have is a clearly defined statement that does some very handy syntactic sugar for us. It doesn't feel as arbitrary as what Lisp and Scheme allow you to do.
PEP 340 is still my favorite, but it seems there's too much opposition to it, so I'm trying to explore alternatives; at the same time I *really* dislike the complexities of some of the non-looping counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that make every keyword associated with 'try' a method).
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd. And as for the overly complex examples, that I believed stemmed from people realizing what might be possible if the statement was extended and tweaked this way or that so as to be able to do just one more thing. But that happens with every proposal; seemed like standard feature creep. The reason we have a BDFL is to tell us that yes, we could get the jumbo sized candy bar for $2 more but there is no way you will be able to finish that much chocolate before it melts all over your hands and get it all over your nice PyCon t-shirt. But then again I don't know if you got private emails asking to see if PEP 340 weighed as much as wood so it could be burned at the stake for being a witch. -Brett

Brett C. wrote:
Guido van Rossum wrote:
PEP 340 is still my favorite, but it seems there's too much opposition to it, so I'm trying to explore alternatives; at the same time I *really* dislike the complexities of some of the non-looping counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that make every keyword associated with 'try' a method).
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
PEP 340 is very nice, but it became less appealing to me when I saw what it would do to "break" and "continue" statements.

    text = 'diamond'
    for fn in filenames:
        opening(fn) as f:
            if text in f.read():
                print 'I found the text in %s' % fn
                break

I think it would be pretty surprising if the break didn't stop the loop.

Here's a new suggestion for PEP 340: use one keyword to start a block you don't want to loop, and a different keyword to start a block that can loop. If you specify the non-looping keyword but the block template produces more than one result, a RuntimeError results.

Here is example A, a non-looping block statement using "try":

    text = 'diamond'
    for fn in filenames:
        try opening(fn) as f:
            if text in f.read():
                print 'I found the text in %s' % fn
                break

In example A, the break statement breaks the "for" loop. If the opening() iterator returns more than one result, a RuntimeError will be generated by the Python interpreter.

Here is example B, a looping block statement using "in", adapted from PEP 340:

    in auto_retry(3, IOError) as attempt:
        f = urllib.urlopen("http://python.org/peps/pep-0340.html")
        print f.read()

Note that I introduced no new keywords except "as", and the syntax in both cases is currently illegal.

Shane

Shane Hathaway wrote:
Here is example A, a non-looping block statement using "try":
    text = 'diamond'
    for fn in filenames:
        try opening(fn) as f:
            if text in f.read():
                print 'I found the text in %s' % fn
                break
That's a pretty way to write it! Would it be possible to extend the 'try' syntax in this way? It would certainly stress the fact that this construct includes exception handling. --eric

On 5/14/05, Brett C. <bac@ocf.berkeley.edu> wrote:
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
Agreed. That's certainly how I felt originally. There were a *lot* of nice features with PEP 340. The initial discussion had a lot of people enthusiastic about all the neat things they could do with it. That's disappeared now, in a long series of attempts to "fix" the looping issue. No-one is looking at PEP 343, or Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with that!". This makes me feel that we've thrown out the baby with the bathwater. (Yes, I know PEP 342 is integral to many of the neat features, but I get the impression that PEP 342 is being lost - later iterations of the other two PEPs are going out of their way to avoid assuming PEP 342 is implemented...)

Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour. But I'll live with that to get back the enthusiasm for a new feature that started all of this. Much better than the current "yes, I guess that's good enough" tone to the discussion.

Paul.

PS Guido - next time you get a neat idea like PEP 340, just code it and check it in. Then we can just badger you to fix the code, rather than using up all your time on discussion before there's an implementation :-)

On 5/15/05, Paul Moore <p.f.moore@gmail.com> wrote:
There were a *lot* of nice features with PEP 340. The initial discussion had a lot of people enthusiastic about all the neat things they could do with it. That's disappeared now, in a long series of attempts to "fix" the looping issue.
Having done the python-dev summary on this topic, I think the initial "enthusiasm" you were seeing included a lot of "what if we did it this way?" or "what if we extended this further in another way?" kind of stuff. When PEP 340 finally came out (around the end of the month), the more extreme ideas were discarded. So in some sense, PEP 340 was the reason for the lack of "enthusiasm"; with the semantics laid out, people were forced to deal with a specific implementation instead of a variety of wild suggestions.
No-one is looking at PEP 343, or Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with that!". This makes me feel that we've thrown out the baby with the bathwater.
I'd be surprised if you can find many examples that PEP 340 can do that PEP 3XX can't. The only real looping example we had was auto_retry, and there's a reasonably simple solution to that in PEP 3XX. You're not going to see anyone saying "hey that's neat - I can do XXX with that!" because PEP 3XX doesn't add anything. But for 95% of the cases, it doesn't take anything away either.
(Yes, I know PEP 342 is integral to many of the neat features, but I get the impression that PEP 342 is being lost - later iterations of the other two PEPs are going out of their way to avoid assuming PEP 342 is implemented...)
Not very far out of their way. I split off PEP 342 from the original PEP 340, and it was ridiculously easy for two reasons:

* the concepts are very orthogonal; the only thing really used from it in any of PEP 340 was the "yield" means "yield None" thing
* there weren't *any* examples of using the "continue EXPR" syntax. PEP 342 is still lacking in this spot

If you want to get people enthused about PEP 342 again (which is the right way to make sure it gets accepted), what would really help is a bunch of good examples of how it could be used.
Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour.
But I'll live with that to get back the enthusiasm for a new feature that started all of this. Much better than the current "yes, I guess that's good enough" tone to the discussion.
I'm convinced that such a tone is inevitable after 30 days and over 700 messages on *any* topic. ;-) Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-) STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

On 5/15/05, Steven Bethard <steven.bethard@gmail.com> wrote:
Having done the python-dev summary on this topic,
You have my deepest sympathy :-)
So in some sense, PEP 340 was the reason for the lack of "enthusiasm"; with the semantics laid out, people were forced to deal with a specific implementation instead of a variety of wild suggestions.
I'm not sure I agree with that - to me, PEP 340 felt like the consolidation of the previous discussion. My feeling was "cool - we've had the discussion, now we've formalised the results, maybe a few details to tidy up and then we can see the implementation being checked in". Then Nick's proposal *failed* to feel like the tidying up of the details, and PEP 343 felt like giving up on the powerful (but hard) bits. It's all people's impressions, though, so maybe I'm just bitter and cynical :-) Interestingly, some new ideas have started appearing again (the GUI example someone raised yesterday, for instance). But with the current "multiple PEPs" situation, I can't evaluate such ideas, as I've no clue which of the various proposals would support them.
No-one is looking at PEP 343, or Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with that!". This makes me feel that we've thrown out the baby with the bathwater.
I'd be surprised if you can find many examples that PEP 340 can do that PEP 3XX can't.
In which case, Nick is "marketing" it really badly - I hadn't got that impression at all. And if Nick's proposal really *is* PEP 340 with the issues people had resolved, how come Guido isn't supporting it? (By the way, I agree with Phillip Eby - Nick's proposal really needs to be issued as a proper PEP - although if it's that close to just being a fix for PEP 340, it should probably just be the new version of PEP 340).
(Yes, I know PEP 342 is integral to many of the neat features, but I get the impression that PEP 342 is being lost - later iterations of the other two PEPs are going out of their way to avoid assuming PEP 342 is implemented...)
Not very far out of their way.
Well, PEP 343 uses

    def template():
        before
        yield
        after

rather than

    def template():
        before
        try:
            yield
        finally:
            after

which I would argue is better - but it needs PEP 342 functionality. OTOH, Guido argues there are other reasons for the PEP 343 style.

Also, the discussion has moved to resource objects with special methods rather than generators as templates - which I see as a direct consequence of PEP 342 being excluded. One of the things I really liked about PEP 340 was the "generator template" style of code, with yield as the "block goes here" placeholder.
If you want to get people enthused about PEP 342 again (which is the right way to make sure it gets excepted), what would really help is a bunch of good examples of how it could be used.
In my view, *every* PEP 340/343/3XX example when written in generator form counts as a good example (see above). Neat coroutine tricks and the like are the "additional benefits" - maybe bordering on abuses in some cases, so I don't want to focus on them in case we get into arguments about whether a feature is bad simply because it *can* be abused... But the key use case, for me, *is* the generator-as-template feature.
I'm convinced that such a tone is inevitable after 30 days and over 700 messages on *any* topic. ;-)
Which is why I regret that Guido didn't just go ahead and implement it, consensus be damned :-) I vote for a dictatorship :-)
Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-)
Best of luck - and in case you need a motivation boost, can I just say that this sort of thread is why I have the greatest respect and appreciation for the job the summarisers do. Paul.

Paul Moore wrote:
I'm not sure I agree with that - to me, PEP 340 felt like the consolidation of the previous discussion. My feeling was "cool - we've had the discussion, now we've formalised the results, maybe a few details to tidy up and then we can see the implementation being checked in". Then Nick's proposal *failed* to feel like the tidying up of the details, and PEP 343 felt like giving up on the powerful (but hard) bits. It's all people's impressions, though, so maybe I'm just bitter and cynical :-)
Actually, I agree that the early versions of my proposal failed to tidy things up (I was honestly trying - I just didn't succeed). I'm pretty happy with the version I just submitted to the PEP editors, though:

http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

It gives generators inherent ``with`` statement support in order to handle finalisation (similar to the way it suggests files support the ``with`` statement in addition to ``for`` loops). An appropriately defined decorator then provides the ability to write ``with`` statement templates using the PEP 340 generator-based style.
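A sketch of what that inherent finalisation support might look like at the point of use, based on the description above (the syntax is the draft's proposal, so none of this runs on current Python; names is assumed to be a list of filenames):

    def all_lines(filenames):
        for name in filenames:
            with open(name) as f:        # files as native 'with' objects
                for line in f:
                    yield line

    # Under the draft, the generator itself can be used in a 'with'
    # statement, guaranteeing its pending 'with open(...)' block is
    # finalised even if iteration is abandoned early:
    with all_lines(names) as lines:
        for line in lines:
            if "diamond" in line:
                break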
I'd be surprised if you can find many examples that PEP 340 can do that PEP 3XX can't.
In which case, Nick is "marketing" it really badly - I hadn't got that impression at all. And if Nick's proposal really *is* PEP 340 with the issues people had resolved, how come Guido isn't supporting it?
I think this is because I've been changing it too fast. This is also the main reason I hadn't submitted before now - *I* wasn't happy with it, so I wanted to keep it easy to update for a while longer. I hope that the version that appears on python.org will garner a bit more support. I submitted it because I finally felt I'd achieved what I set out to do (cleanly integrate PEP 310 and PEP 340), whereas none of my previous drafts felt that way.
Also, the discussion has moved to resource objects with special methods rather than generators as templates - which I see as a direct consequence of PEP 342 being excluded. One of the things I really liked about PEP 340 was the "generator template" style of code, with yield as the "block goes here" placeholder.
My PEP suggests simple cases are best handled by __enter__/__exit__ methods directly on the classes (for efficiency, ease of use, and to avoid cluttering the builtin namespace), while more complex cases (like acquiring two locks, or using a particular lock to protect access to a certain file) are handled using PEP 340 style generator templates.
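For instance, the "acquiring two locks" case might be written as a generator template roughly like this (a sketch; the statement_template decorator name is taken from the draft PEP, and the try/finally around yield relies on the generator finalisation these proposals would add):

    @statement_template
    def dual_locked(outer, inner):
        # Acquire two locks in a fixed order, release in reverse order.
        outer.acquire()
        try:
            inner.acquire()
            try:
                yield None
            finally:
                inner.release()
        finally:
            outer.release()

Used as "with dual_locked(lock_a, lock_b):" at the call site.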
In my view, *every* PEP 340/343/3XX example when written in generator form counts as a good example (see above). Neat coroutine tricks and the like are the "additional benefits" - maybe bordering on abuses in some cases, so I don't want to focus on them in case we get into arguments about whether a feature is bad simply because it *can* be abused... But the key use case, for me, *is* the generator-as-template feature.
PEP 342 isn't about the ability to inject exceptions into generators - it's about yield expressions and enhanced continue statements. We haven't actually identified any use cases for those yet (I'm wondering if there is some way to use the idea to implement accumulators, but getting the final result out is a real challenge).

PEP 340 and my PEP are the two PEPs which talk about injecting exceptions in order to write nice statement templates. PEP 288 is the one which would make that mechanism available to Python code.
Which is why I regret that Guido didn't just go ahead and implement it, consensus be damned :-) I vote for a dictatorship :-)
Nah, we'll get something good at the end of it anyway - and this way we'll *know* it's good :)
Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-)
Best of luck - and in case you need a motivation boost, can I just say that this sort of thread is why I have the greatest respect and appreciation for the job the summarisers do.
Oyah. I had a hard enough time just identifying all the ideas I wanted to discuss in the "Rejected Options" section of my PEP. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Paul Moore wrote:
On 5/14/05, Brett C. <bac@ocf.berkeley.edu> wrote:
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
Agreed. That's certainly how I felt originally.
Oh good. So I am not nuts. =)
There were a *lot* of nice features with PEP 340. The initial discussion had a lot of people enthusiastic about all the neat things they could do with it. That's disappeared now, in a long series of attempts to "fix" the looping issue. No-one is looking at PEP 343, or Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with that!". This makes me feel that we've thrown out the baby with the bathwater. (Yes, I know PEP 342 is integral to many of the neat features, but I get the impression that PEP 342 is being lost - later iterations of the other two PEPs are going out of their way to avoid assuming PEP 342 is implemented...)
My feelings exactly. I was really happy and excited when it seemed like everyone really liked PEP 340 sans a few disagreements on looping and other things. Having a huge chunk of people get excited and liking a proposal was a nice contrast to the whole decorator debate.
Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour.
Which I think is actually fine if they do just use a try/finally if it fits the situation better.
But I'll live with that to get back the enthusiasm for a new feature that started all of this. Much better than the current "yes, I guess that's good enough" tone to the discussion.
Ditto. -Brett

Paul Moore wrote:
Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour.
I agree PEP 343 throws away too much that was good about PEP 340 - that's why I'm still updating PEP 3XX as the discussion continues. But is there anything PEP 340 does that PEP 3XX doesn't, other than letting you suppress exceptions? The only example the latest version of PEP 3XX drops since the original PEP 340 is auto_retry - and that suffers from the hidden control flow problem, so I doubt Guido would permit it, even *if* the new statement was once again a loop. And if the control flow in response to an exception can't be affected, it becomes even *harder* to explain how the new statement differs from a standard for loop.
But I'll live with that to get back the enthusiasm for a new feature that started all of this. Much better than the current "yes, I guess that's good enough" tone to the discussion.
I think the current tone is more due to the focus on addressing problems with all of the suggestions - that's always going to dampen enthusiasm.

Every PEP 340 use case that doesn't involve suppressing exceptions (i.e. all of them except auto_retry) can be written under the current PEP 3XX using essentially *identical* generator code (the only difference is the statement_template decorator at the top of the generator definition).

Pros of PEP 3XX 1.6 vs PEP 340:
  - control flow is easier to understand
  - can use inside loops without affecting break/continue
  - easy to write enter/exit methods directly on classes
  - template generators can be reused safely
  - iterator generator resource management is dealt with

Cons of PEP 3XX 1.6 vs PEP 340:
  - no suppression of exceptions (see first pro, though)

Of course, I may be biased :)

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
http://boredomandlaziness.blogspot.com

At 08:18 AM 5/16/2005 +1000, Nick Coghlan wrote:
Paul Moore wrote:
Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour.
I agree PEP 343 throws away too much that was good about PEP 340 - that's why I'm still updating PEP 3XX as the discussion continues.
Could you please stop calling it PEP 3XX and go ahead and submit it as a real PEP? Either that, or post its URL *every* time you mention it, because at this point I don't know where to go to read it, and the same applies for each new person to enter the discussion. Thanks.

Phillip J. Eby wrote:
At 08:18 AM 5/16/2005 +1000, Nick Coghlan wrote:
Paul Moore wrote:
Looping is definitely a wart. Looping may even be a real problem in some cases. There may be cases where an explicit try...finally remains better, simply to avoid an unwanted looping behaviour.
I agree PEP 343 throws away too much that was good about PEP 340 - that's why I'm still updating PEP 3XX as the discussion continues.
Could you please stop calling it PEP 3XX and go ahead and submit it as a real PEP? Either that, or post its URL *every* time you mention it, because at this point I don't know where to go to read it, and the same applies for each new person to enter the discussion. Thanks.
Sorry about that - I've been including the URL most of the time, but forgot on this occasion: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html Anyway, I think it's stable enough now that I can submit it to be put up on www.python.org (I'll notify the PEP editors directly once I fix a couple of errors in the current version - like the missing 'raise' in the statement semantics. . .). Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Brett C. wrote:
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
My problem with looping was that, with it, the semantics of a block statement would be almost, but not quite, exactly like those of a for-loop, which seems to be flying in the face of TOOWTDI. And if it weren't for the can't-finalise-generators-in-a-for-loop backward compatibility problem, the difference would be even smaller. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

Greg Ewing wrote:
Brett C. wrote:
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
My problem with looping was that, with it, the semantics of a block statement would be almost, but not quite, exactly like those of a for-loop, which seems to be flying in the face of TOOWTDI. And if it weren't for the can't-finalise-generators-in-a-for-loop backward compatibility problem, the difference would be even smaller.
I wonder if we should reconsider PEP 340, with one change: the block iterator is required to iterate exactly once. If it iterates more than once or not at all, the interpreter raises a RuntimeError, indicating the iterator cannot be used as a block template. With that change, 'break' and 'continue' will obviously affect 'for' and 'while' loops rather than block statements.
Advantages of PEP 340, with this change, over PEP 343:
- we reuse a protocol rather than invent a new protocol.
- decorators aren't necessary.
- it's a step toward more general flow control macros.
At first I wasn't sure people would like the idea of requiring iterators to iterate exactly once, but I just realized the other PEPs have the same requirement. Shane
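A minimal sketch of the exactly-once rule Shane describes, written as a plain helper function rather than new syntax. The name run_block_template, the error messages and the simplified error handling are invented for illustration, not taken from any PEP.
    def run_block_template(it, block):
        # 'it' is the block iterator (typically a generator); 'block' is a
        # callable standing in for the body of the proposed statement.
        try:
            value = next(it)          # the single permitted iteration
        except StopIteration:
            raise RuntimeError("block iterator did not iterate at all")
        try:
            block(value)
        except:
            # A real implementation would pass the exception into the
            # iterator; here we just close it (if possible) and re-raise.
            if hasattr(it, "close"):
                it.close()
            raise
        try:
            next(it)
        except StopIteration:
            return
        raise RuntimeError("block iterator yielded more than once")

    def opening(path):
        # A typical template in the style of the PEPs under discussion.
        f = open(path)
        try:
            yield f
        finally:
            f.close()

    # Example (assuming 'data.txt' exists):
    #   run_block_template(opening("data.txt"), lambda f: print(f.readline()))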

On Mon, May 16, 2005 at 06:24:59PM +1200, Greg Ewing wrote:
Brett C. wrote:
Nick's was obviously directly against looping, but, with no offense to Nick, how many other people were against it looping? It never felt like it was a screaming mass with pitchforks but more of a "I don't love it, but I can deal" crowd.
My problem with looping was that, with it, the semantics of a block statement would be almost, but not quite, exactly like those of a for-loop, which seems to be flying in the face of TOOWTDI. And if it weren't for the can't-finalise-generators-in-a-for-loop backward compatibility problem, the difference would be even smaller.
Nodders, the looping construct seemed to work out fine as code people could use to get their heads around the idea. It was eye-gougingly bad as a final solution. Forcing people to write an iterator for something that will almost never loop is as awkward as forcing everyone to write "if" statements as:
    for dummy in range(1):
        if (somevar):
            do_true_stuff()
            break
    else:
        do_false_stuff()
I still haven't gotten used to Guido's heart-attack inducing early enthusiasm for strange things followed later by a simple proclamation I like. Some day I'll learn that the sound of fingernails on the chalkboard is frequently followed by candy for the whole class. For now the initial stages still give me the shivers. -jackdied

At 08:21 PM 5/16/2005 -0400, Jack Diederich wrote:
I still haven't gotten used to Guido's heart-attack inducing early enthusiasm for strange things followed later by a simple proclamation I like. Some day I'll learn that the sound of fingernails on the chalkboard is frequently followed by candy for the whole class.
Heh. +1 for QOTW. :)

Phillip J. Eby wrote:
At 08:21 PM 5/16/2005 -0400, Jack Diederich wrote:
I still haven't gotten used to Guido's heart-attack inducing early enthusiasm for strange things followed later by a simple proclamation I like. Some day I'll learn that the sound of fingernails on the chalkboard is frequently followed by candy for the whole class.
Heh. +1 for QOTW. :)
Indeed. Particularly since it sounds like son-of-PEP-343 will be the union of PEP 310 and PEP 340 that I've been trying to figure out :) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Guido van Rossum wrote:
[Fredrik Lundh]
unlike the original design, all you get from this is the ability to add try/finally blocks to your code without ever writing a try/finally-clause (neither in your code or in the block controller). that doesn't strike me as especially pythonic.
Would it be better if we pulled back in the generator exit handling from PEP 340? That's a pretty self-contained thing, and would let you write try/finally around the yield.
That would be good, in my opinion. I updated PEP 3XX to use this idea: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting exceptions that occur into the template generator's internal frame instead of invoking next(). The rest of the PEP is then about dealing with the implications of allowing yield inside try/finally statements. The Rejected Options section tries to look at all the alternatives brought up in the various PEP 310, 340 and 343 discussions, and explain why PEP 3XX chooses the way it does. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
That would be good, in my opinion. I updated PEP 3XX to use this idea: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting exceptions that occur into the template generator's internal frame instead of invoking next().
The rest of the PEP is then about dealing with the implications of allowing yield inside try/finally statements.
You might add to the PEP the following example, which could really improve the process of building GUIs in Python:
    class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')
Indentation improves the readability of code that creates a hierarchy. Shane

Shane Hathaway <shane@hathawaymix.org> wrote:
Nick Coghlan wrote:
That would be good, in my opinion. I updated PEP 3XX to use this idea: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting exceptions that occur into the template generator's internal frame instead of invoking next().
The rest of the PEP is then about dealing with the implications of allowing yield inside try/finally statements.
You might add to the PEP the following example, which could really improve the process of building GUIs in Python:
class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')
Indentation improves the readability of code that creates a hierarchy.
I've generally been fairly ambiguous about the entire PEP 310/340/343 issue. While resource allocation and release is quite useful, not a whole heck of a lot of my code has to deal with it. But after seeing this sketch of an example for its use in GUI construction, I must have it! +1 for:
- 'do' keyword (though 'try' being reused scratches the 'no new keyword' itch for me, and is explicit about "I won't loop")
- PEP 343
- try/finally + @do_template with generators
- Josiah

On Sun, 15 May 2005, Shane Hathaway wrote:
You might add to the PEP the following example, which could really improve the process of building GUIs in Python:
class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')
I don't understand how this would be implemented. Would a widget function like 'TextEntry' set the parent of the widget according to some global 'parent' variable? If so, how would 'Panel' know that its parent is supposed to be the 'MyFrame' object? -- ?!ng

Ka-Ping Yee <python-dev@zesty.ca> wrote:
On Sun, 15 May 2005, Shane Hathaway wrote:
You might add to the PEP the following example, which could really improve the process of building GUIs in Python:
class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')
I don't understand how this would be implemented. Would a widget function like 'TextEntry' set the parent of the widget according to some global 'parent' variable? If so, how would 'Panel' know that its parent is supposed to be the 'MyFrame' object?
It would actually take a bit more to make this work properly. If those objects were aware of the resource allocation mechanism, they could add and remove themselves from a context stack as necessary. In the case of things like VerticalBoxSizer, save the current self dictionary on entrance, then check for changes on exit, performing an Add with all the new objects. Or, so that it doesn't change the way wxPython works with other versions of Python, everything could be wrapped, perhaps using something like...
    class MyFrame(Frame):
        def __init__(self):
            with new_context(self):
                with parented(Panel, self) as panel:
                    with unparented(VerticalBoxSizer, 'Add') as panel.sizer:
                        self.text = TextEntry(panel)
                        self.ok = Button(panel, 'Ok')
There would be a little more work involved to generate a reasonable API for this, but it is all possible. - Josiah

Ka-Ping Yee wrote:
On Sun, 15 May 2005, Shane Hathaway wrote:
You might add to the PEP the following example, which could really improve the process of building GUIs in Python:
class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')
I don't understand how this would be implemented. Would a widget function like 'TextEntry' set the parent of the widget according to some global 'parent' variable? If so, how would 'Panel' know that its parent is supposed to be the 'MyFrame' object?
Try this version, which I sent to Nick earlier:
    class MyFrame(Frame):
        def __init__(self):
            with Panel(self):
                with VerticalBoxSizer(self):
                    self.text = TextEntry(self)
                    self.ok = Button(self, 'Ok')
The 'self' parameter tells the component to add itself to the current parent inside the Frame. The current parent is a temporary variable set by 'with' statements. Outside any 'with' statement, the current parent is the frame. There is only a little magic. Maybe someone can find an even less magical pattern, but it's a lot easier to read and write than the status quo:
    class MyFrame(Frame):
        def __init__(self):
            p = Panel()
            self.add(p)
            sizer = VerticalBoxSizer(p)
            p.add(sizer)
            self.text = TextEntry()
            sizer.add(self.text)
            self.ok = Button('Ok')
            sizer.add(self.ok)
Shane
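A rough sketch of the 'current parent' bookkeeping Shane describes, written against the __enter__/__exit__ protocol under discussion. The Widget/Frame classes, the _parent_stack attribute and the add() method are invented for illustration; no real GUI toolkit is assumed.
    class Widget:
        # Toy widget: records its children and tracks a stack of current parents.
        def __init__(self, frame=None):
            self.children = []
            self.frame = frame
            if frame is not None:
                # Add ourselves to whatever is currently the innermost parent.
                frame._parent_stack[-1].add(self)
        def add(self, child):
            self.children.append(child)
        # The protocol methods from the PEP 343/3XX discussion:
        def __enter__(self):
            self.frame._parent_stack.append(self)
            return self
        def __exit__(self, *exc_info):
            self.frame._parent_stack.pop()

    class Frame(Widget):
        def __init__(self):
            Widget.__init__(self)
            self.frame = self
            self._parent_stack = [self]   # outside any 'with', the frame is the parent

    class Panel(Widget): pass
    class VerticalBoxSizer(Widget): pass
    class TextEntry(Widget): pass
    class Button(Widget):
        def __init__(self, frame, label):
            Widget.__init__(self, frame)
            self.label = label

    class MyFrame(Frame):
        def __init__(self):
            Frame.__init__(self)
            with Panel(self):
                with VerticalBoxSizer(self):
                    self.text = TextEntry(self)
                    self.ok = Button(self, 'Ok')
Building MyFrame() then produces the same nesting as Shane's hand-written version, with the with statements doing the push and pop of the current parent.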

On 5/15/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
From there I see the semantics:
VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
    exc = (None, None, None)
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)
Don't you want a "raise" after the "exc = sys.exc_info()"? STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

Steven Bethard wrote:
On 5/15/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
From there I see the semantics:
VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
    exc = (None, None, None)
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)
Don't you want a "raise" after the "exc = sys.exc_info()"?
Oops. . . yeah, that must have gotten lost somewhere along the way. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com
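For reference, here is that translation with the missing raise restored (BLOCK1, stmt_enter and stmt_exit are the placeholders from the PEP text; nothing else is changed):
    VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
    exc = (None, None, None)
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
            raise   # re-raise after recording the exception info
    finally:
        stmt_exit(*exc)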

[Nick Coghlan]
[Steven Bethard]
From there I see the semantics:
VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
    exc = (None, None, None)
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)
Don't you want a "raise" after the "exc = sys.exc_info()"?
I have the same question for Nick. Interestingly, assuming Nick meant that "raise" to be there, PEP 3XX and PEP 343 now have the same translation. In rev 1.10 I moved the __enter__ call out of the try-block again. Having it inside was insane: when __enter__ fails, it should do its own cleanup rather than expecting __exit__ to clean up after a partial __enter__. But some of the claims from PEP 3XX seem to be incorrect now: Nick claims that a with-statement can abstract an except clause, but that's not the case; an except clause causes the control flow to go forward (continue after the whole try statement) but the with-statement (with the "raise" added) always acts like a finally-clause, which implicitly re-raises the exception. So, in particular, in this example:
    with EXPR1:
        1/0
    print "Boo"
the print statement is unreachable and there's nothing clever you can put in an __exit__ method to make it reachable. Just like in this case:
    try:
        1/0
    finally:
        BLOCK1
    print "Boo"
there's nothing that BLOCK1 can do to cause the print statement to be reached. This claim in PEP 3XX may be a remnant from a previous version; or it may be that Nick misunderstands how 'finally' works. Anyway, I think we may be really close at this point, if we can agree on an API for passing exceptions into generators and finalizing them, so that the generator can be written using a try/finally around the yield statement. Of course, it's also possible that Nick did *not* mean for the missing "raise" to be there. But in that case other claims from his PEP become false, so I'm assuming with Steven here. Nick? -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[...] But some of the claims from PEP 3XX seem to be incorrect now: Nick claims that a with-statement can abstract an except clause, but that's not the case; [...]
Sorry for being a lurker, but can I try and expand this point. The options:
- If we don't allow the except clause in the generator, the exception can't be examined there.
- If we do allow the except clause, we must (IMO) also allow the generator to suppress the exception. It would be surprising behaviour if a caught exception was re-raised without an explicit raise statement.
An argument: Despite the control-flow-macros-are-harmful discussion, I see nothing wrong with a block controller swallowing its block's exceptions because:
- In most proposals it can raise its own exception in place of the block's exception anyway.
- In the following example there is nothing surprising if controller() swallows block()'s exception:
    def block():
        # do stuff
        raise E
    controller(block)
Perhaps we don't want the block controller statement to have as much power over its block as controller() has over block() above. But handling an exception is not so radical, is it? - Arnold.
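A runnable restatement of Arnold's example, showing the controller deciding whether the block's exception propagates. The exception class E, the message and the suppression policy are all illustrative.
    class E(Exception):
        pass

    def block():
        # do stuff
        raise E("something went wrong in the block")

    def controller(body):
        try:
            body()
        except E:
            # The controller chooses to swallow the block's exception; it
            # could just as easily re-raise it or substitute its own.
            print("controller handled the block's exception")

    controller(block)   # prints the message; E does not propagate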

On Sun, 15 May 2005, Guido van Rossum wrote:
In rev 1.10 I moved the __enter__ call out of the try-block again. Having it inside was insane: when __enter__ fails, it should do its own cleanup rather than expecting __exit__ to clean up after a partial __enter__.
No, it wasn't insane. You had a good reason for putting it there. The question is what style of implementation you want to encourage. If you put __enter__ inside, then you encourage idempotent __exit__, which makes resource objects easier to reuse. If you put __enter__ outside, that allows the trivial case to be written a little more simply, but also makes it hard to reuse. -- ?!ng

[Guido]
In rev 1.10 I moved the __enter__ call out of the try-block again. Having it inside was insane: when __enter__ fails, it should do its own cleanup rather than expecting __exit__ to clean up after a partial __enter__.
[Ka-Ping Yee]
No, it wasn't insane. You had a good reason for putting it there.
I did some introspection, and it was definitely a temporary moment of insanity: (1) I somehow forgot the obvious solution (that __enter__ should clean up its own mess if it doesn't make it to the finish line); (2) once a generator raises an exception it cannot be resumed, so the generator-based example I gave can't work. (I was too lazy to write down the class-based example, which *can* be made to work of course.)
The question is what style of implementation you want to encourage.
If you put __enter__ inside, then you encourage idempotent __exit__, which makes resource objects easier to reuse.
But consider threading.RLock (a lock with the semantics of Java's monitors). Its release() is *not* idempotent, so we couldn't use the shortcut of making __enter__ and __exit__ methods of the lock itself. (I think this shortcut may be important for locks because it reduces the overhead of frequent locking -- no extra objects need to be allocated.)
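A rough sketch of the shortcut Guido mentions - the lock object itself (here a thin wrapper, so nothing is assumed about the stdlib class) supplies __enter__/__exit__ that simply call acquire()/release(), so no extra object is allocated per use. The name LockAsResource is invented for illustration, and the usage shown is the try/finally expansion rather than any new syntax.
    import threading

    class LockAsResource:
        def __init__(self):
            self._lock = threading.RLock()
        def __enter__(self):
            self._lock.acquire()
            return self
        def __exit__(self, exc_type, exc_value, traceback):
            # Not idempotent: a second release() would be an error, which is
            # exactly the point about RLock.release() above.
            self._lock.release()

    lock = LockAsResource()
    lock.__enter__()
    try:
        pass  # critical section
    finally:
        lock.__exit__(None, None, None)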
If you put __enter__ outside, that allows the trivial case to be written a little more simply, but also makes it hard to reuse.
I skimmed your longer post about that but didn't find the conclusive evidence that this is so; all I saw was a lot of facts ("you can implement scenario X in these three ways") but no conclusion. The real reason I put it inside the try was different: there's a race condition in the VM where it can raise a KeyboardInterrupt after the __enter__() call completes but before the try-suite is entered, and then __exit__() is never called. Similarly, if __enter__() is written in Python and wraps some other operation, it may complete that other operation but get a KeyboardInterrupt (or other asynchronous exception) before reaching the (explicit or implicit) return statement. Ditto for generators and yield. But I think this can be solved differently in the actual translation (as opposed to the "translate-to-valid-pre-2.5-Python"); the call to __exit__ can be implicit in a new opcode, and then we can at least guarantee that the interpreter doesn't check for interrupts between a successful __exit__ call and setting up the finally block. This doesn't handle an __enter__ written in Python not reaching its return; but in the presence of interrupts I don't think it's possible to write such code reliably anyway; it should be written using a "with signal.blocking()" around the critical code including the return. I don't want to manipulate signals directly in the VM; it's platform-specific, expensive, rarely needed, and you never know whether you aren't invoking some Python code that might do I/O, making the entire thread uninterruptible for a long time. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[Nick Coghlan]
I have the same question for Nick. Interestingly, assuming Nick meant that "raise" to be there, PEP 3XX and PEP 343 now have the same translation. In rev 1.10 I moved the __enter__ call out of the try-block again. Having it inside was insane: when __enter__ fails, it should do its own cleanup rather than expecting __exit__ to clean up after a partial __enter__.
Are you sure? The copy I see on python.org still has it inside the try/finally. But yes, the differences between PEP 343 and PEP 3XX [1] are not huge, particularly if __enter__ is called outside the try/finally block. The key difference is whether or not exceptions are injected into the generator's internal frame so that templates can be written using the style from PEP 340.
But some of the claims from PEP 3XX seem to be incorrect now: Nick claims that a with-statement can abstract an except clause, but that's not the case; an except clause causes the control flow to go forward (continue after the whole try statement) but the with-statement (with the "raise" added) always acts like a finally-clause, which implicitly re-raises the exception.
Steven's correct - there's a raise statement missing. The point I'm trying to make in the PEP is that, even without the ability to suppress exceptions, the __exit__() method can still react to them. Then the only code that needs to be repeated at the calling site is the actual suppression of the exception. Whether doing such a thing makes sense is going to be application dependent, of course.
Anyway, I think we may be really close at this point, if we can agree on an API for passing exceptions into generators and finalizing them, so that the generator can be written using a try/finally around the yield statement.
My PEP punts on providing a general API for passing exceptions into generators by making it an internal operation. The version I submitted to the PEP editors uses __enter__() and __exit__() to handle finalisation, though. Cheers, Nick. [1] I finally submitted it to the PEP editors, so it'll be up on python.org as soon as they find the time to check it in. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

At 09:53 PM 5/16/2005 +1000, Nick Coghlan wrote:
My PEP punts on providing a general API for passing exceptions into generators by making it an internal operation.
Actually, the proposals you made almost subsume PEPs 288 and 325. All you'd need to do is:
1. move the '__del__' code to a 'close()' method and make __del__ call close()
2. make '_inject_exception()' a public API that returns the next yielded value if the exception doesn't propagate
And just like that you've cleaned up the open issues from both 288 and 325, IIRC those open issues correctly. I personally think that StopIteration, TerminateIteration, KeyboardInterrupt and perhaps certain other exceptions should derive from some base class other than Exception (e.g. Raisable or some such) to help with the bare-except/except Exception problem. But that's probably best addressed by a separate PEP. :)
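A rough sketch of the shape Phillip describes. The wrapper class, the GeneratorClosed exception and the method names are assumptions for illustration; generator.throw() (from PEP 342) stands in for the internal exception-injection operation.
    class GeneratorClosed(Exception):
        # Hypothetical stand-in for the finalisation exception being debated.
        pass

    class FinalizableGenerator:
        # Wraps a generator and expresses close() in terms of exception injection.
        def __init__(self, gen):
            self._gen = gen
        def __iter__(self):
            return self._gen
        def _inject_exception(self, exc):
            # Re-raises 'exc' at the point of the last yield and returns the
            # next yielded value if the generator swallows it.
            return self._gen.throw(exc)
        def close(self):
            try:
                self._inject_exception(GeneratorClosed())
            except (GeneratorClosed, StopIteration):
                return  # generator finished cleanly
            raise RuntimeError("generator ignored the finalisation request")
        def __del__(self):
            self.close()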

In article <5.1.1.6.0.20050516132944.01ee2c98@mail.telecommunity.com>, "Phillip J. Eby" <pje@telecommunity.com> wrote:
... I personally think that StopIteration, TerminateIteration, KeyboardInterrupt and perhaps certain other exceptions should derive from some base class other than Exception (e.g. Raisable or some such) to help with the bare-except/except Exception problem. But that's probably best addressed by a separate PEP. :)
Yes, please!!! I am so sick of writing:
    except (SystemExit, KeyboardInterrupt):
        raise
    except Exception, e:
        # handle a "real" error
but back to lurking on this discussion... -- Russell

It's been interesting watching all the loops this discussion has gone through. I'm not sure the following is compatible with the current proposals, but maybe it will spur some ideas or help rule out something. There have been several examples of problems with opening several resources inside an enter method and how to resolve them in the case of an incomplete entry. So it seems to me, each resource needs to be handled separately, but that doesn't mean you can't use them as a group. So I was wondering if something like the following is feasible?
    # open-close pairing
    def opening(filename, mode):
        def openfile():
            f = open(filename, mode)
            try:
                yield f
            finally:
                f.close()
        return openfile
    with opening(file1,m), opening(file2,m), opening(file3,m) as f1,f2,f3:
        # do stuff with files
The 'with' (or whatever) statement would need a little more under the hood, but it might simplify handling multiple resources. This also reduces nesting in cases such as locking and opening. Both must succeed before the block executes. And if something goes wrong, the "with" statement knows and can handle each resource. The point is, each resource needs to be a whole unit; opening multiple files in one resource handler is probably not a good idea anyway. Regards, _Ron Adam

An additional comment (or two) on my previous message before I go back to lurk mode. If the recommended use of each resource template is kept to a single resource, then each enter and exit can be considered a whole block of code that will either pass or fail. You can then simplify the previous template to just:
    # open-close pairing
    def opening(filename, mode):
        def openfile():
            f = open(filename, mode)
            yield f
            f.close()
        return openfile
    with opening(file1,m), opening(file2,m), opening(file3,m) as f1,f2,f3:
        # do stuff with files
The with statement will need to catch any opening errors at the start in case one of the opening resources fails, close any opened resources, then re-raise the first exception. The with statement will also need to catch any uncaught exceptions in the block, close any opened resources, then re-raise the exception. And again when closing, it will need to catch any exceptions that occur until it has tried to close all open resources, then re-raise the first exception. Although it's possible to have more than one exception occur, it should always raise the first one, as any secondary exceptions may just be a side effect of the first one. The programmer has the option to surround the 'with' block with a try/except if he wants to catch any exceptions raised. He should also be able to put try/excepts before the yield, after the yield, or in the block. (But not surrounding the yield, I think.) Of course he may cause himself more problems than not, but that should be his choice, and maybe he thought of some use. This might be an acceptable use case of try-except in the enter section:
    def openfile(firstfile, altfile, mode):
        try:
            f = open(firstfile, mode)
        except:
            f = open(altfile, mode)
        yield f
        f.close()
    return openfile
This is still a single open resource (if it succeeds) so should work ok. Alternate closing could be possible, maybe retrying the close several times before raising it and letting the 'with' handle it. Cheers, _Ron

Ron Adam wrote:
He should also be able to put try excepts before the yield, and after the yield, or in the block. (But not surrounding the yield, I think.)
I was given to understand that yield is currently allowed in try-except, just not try-finally. So this would require a non-backwards-compatible change. Greg

[Ron Adam]
So I was wondering if something like the following is feasible?
[...]
with opening(file1,m), opening(file2,m), opening(file3,m) as f1,f2,f3:
        # do stuff with files
The 'with' (or whatever) statement would need a little more under the hood, but it might simplify handling multiple resources.
This also reduces nesting in cases such as locking and opening. Both must succeed before the block executes. And if something goes wrong, the "with" statement knows and can handle each resource. The point is, each resource needs to be a whole unit, opening multiple files in one resource handler is probably not a good idea anyway.
I'm -0 on this, if only because it complicates things a fair bit for a very minor improvement in functionality. There are also some semantic/syntactic questions: should this work only if there are explicit commas in the with-statement (so the compiler can generate code equivalent to nested with-statements) or should it allow a single expression to return a tuple of resource managers dynamically (so the run-time must check for a tuple each time)? It's always something we can add in the future, since it's guaranteed that this syntax (or a tuple value) is invalid in the current proposal. So I'd rather punt on this. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[Ron Adam]
with opening(file1,m), opening(file2,m), opening(file3,m) as f1,f2,f3:
        # do stuff with files
I'm -0 on this, if only because it complicates things a fair bit for a very minor improvement in functionality. [...] It's always something we can add in the future, since it's guaranteed that this syntax (or a tuple value) is invalid in the current proposal. So I'd rather punt on this.
Also, you can already get 90% of this with the combining() wrapper I posted earlier:
    with combining(opening(file1,m), opening(file2,m), opening(file3,m)) as f1,f2,f3:
        # do stuff with files
Shane
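Shane's combining() wrapper isn't reproduced in the thread; the following is a guess at its shape as a generator template. The enter-all/exit-in-reverse policy and the first-exception rule are assumptions, not taken from his code.
    def combining(*managers):
        # Enter each manager in order, yield the tuple of results, and make
        # sure every successfully entered manager is exited in reverse order.
        entered = []
        try:
            for mgr in managers:
                entered.append(mgr.__enter__())
            yield tuple(entered)
        finally:
            first_error = None
            for mgr in reversed(managers[:len(entered)]):
                try:
                    mgr.__exit__(None, None, None)
                except Exception as e:
                    if first_error is None:
                        first_error = e   # remember the first exit failure
            if first_error is not None:
                raise first_error
With the template decorator from any of the PEP drafts (or today's contextlib.contextmanager), this would be usable as 'with combining(opening(file1,m), opening(file2,m), opening(file3,m)) as (f1, f2, f3):'.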

On 5/15/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
In reading over PEP 3XX again, it struck me that I'd been having a really hard time grasping exactly when I needed to use the "needs_finish" decorator. Am I right in saying that I should use the "needs_finish" decorator every time I have a "yield" inside a with-statement or a try/finally? Are there other situations where I might need the "needs_finish" decorator? If it's true that I need the "needs_finish" decorator every time I have a "yield" inside a with-statement or a try/finally, I'd be inclined to do this automatically. That is, since a yield inside a with-statement or try/finally can be determined lexically (heck, we do it now to disallow it), generators that have such code should be automatically wrapped with the "needs_finish" decorator, i.e. they should automatically acquire a __finish__ method. If I've misunderstood, and there are other situations when "needs_finish" is required, it'd be nice to see some more examples. STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

Steven Bethard wrote:
If I've misunderstood, and there are other situations when "needs_finish" is required, it'd be nice to see some more examples.
The problem is try/except/else blocks - those are currently legal, so the programmer has to make the call about whether finalisation is needed or not. I'll put this in the Open Issues section of the PEP - doing it lexically seems a little too magical for my taste (since it suddenly becomes more difficult to do partial iteration on the generator), but the decorator is a definite wart. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
Steven Bethard wrote:
If I've misunderstood, and there are other situations when "needs_finish" is required, it'd be nice to see some more examples.
The problem is try/except/else blocks - those are currently legal, so the programmer has to make the call about whether finalisation is needed or not.
I'll put this in the Open Issues section of the PEP - doing it lexically seems a little too magical for my taste (since it suddenly becomes more difficult to do partial iteration on the generator), but the decorator is a definite wart.
I had a better idea - with a decorator being used to create statement templates out of generators, that means the __enter__() and __exit__() methods of generators themselves are available to handle finalisation. So, just as my PEP suggests that files be usable like:
    with open(filename) as f:
        for line in f:
            print line
it now suggests generators be usable like:
    with all_lines(filenames) as lines:
        for line in lines:
            print line
Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com
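Generators don't grow __enter__()/__exit__() on their own today; a minimal wrapper in the spirit of Nick's suggestion might look like the following. The finalizing class and this version of all_lines are illustrative, not taken from the PEP.
    class finalizing:
        # Entering returns the generator; exiting closes it, which runs any
        # pending finally clauses inside the generator.
        def __init__(self, gen):
            self._gen = gen
        def __enter__(self):
            return self._gen
        def __exit__(self, exc_type, exc_value, traceback):
            self._gen.close()

    def all_lines(filenames):
        # Yields every line of every file, closing each file even if
        # iteration is abandoned early.
        for name in filenames:
            f = open(name)
            try:
                for line in f:
                    yield line
            finally:
                f.close()

    # Usage (today's syntax):
    #   with finalizing(all_lines(['a.txt', 'b.txt'])) as lines:
    #       for line in lines:
    #           print(line)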

[Guido (responding to Fredrik Lundh's "intuitive -1" on PEP 343)]
Would it be better if we pulled back in the generator exit handling from PEP 340? That's a pretty self-contained thing, and would let you write try/finally around the yield.
[Nick Coghlan]
That would be good, in my opinion. I updated PEP 3XX to use this idea: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting exceptions that occur into the template generator's internal frame instead of invoking next().
I'm in favor of the general idea, but would like to separate the error injection and finalization API for generators into a separate PEP, which would then compete with PEP 288 and PEP 325. I think the API *should* be made public if it is available internally; I don't see any implementation reasons why it would simplify the implementation if it was only available internally. Here are some issues to resolve in such a PEP.
- What does _inject_exception() return? It seems that it should raise the exception that was passed into it, but I can't find this written out explicitly.
- What should happen if a generator, in a finally or except clause reached from _inject_exception(), executes another yield? I'm tempted to make this a legitimate outcome (from the generator's perspective) and reserve it for some future statement that implements the looping semantics of PEP 340; the *_template decorator's wrapper class however should consider it an error, just like other protocol mismatches like not yielding the first time or yielding more than once in response to next(). Nick's code in fact does all this right, I just think it should be made explicit.
- TerminateIteration is a lousy name, since "terminate" means about the same as "stop", so there could be legitimate confusion with StopIteration. In PEP 340 I used StopIteration for this purpose, but someone explained this was a poor choice since existing generators may contain code that traps StopIteration for other purposes. Perhaps we could use SystemExit for this purpose? Pretty much everybody is supposed to let this one pass through since its purpose is only to allow cleanup upon program (or thread) exit.
- I really don't like reusing __del__ as the method name for any kind of destructor; __del__ has all sorts of special semantics (the GC treats objects with a __del__ method specially).
- The all_lines() example jars me. Somehow it bugs me that its caller has to remember to care about finalizing it. Maybe it's just not a good example; I don't see what this offers over just writing the obviously correct:
    for fn in filenames:
        with opening(fn) as f:
            for line in f:
                update_config(line)
even if using the template saves a line. I doubt the use case comes up frequently enough to warrant abstracting it out. If I have to import the template to save a line here, I just made my program a bit less readable (since the first-time reader has to look up the definition of the imported all_lines template) and I didn't even save a line unless the template is used at least twice. (Note that Nick's PEP contains two typos here -- it says "print lines" where it should say "print line" and a bit later "print f" where again it should say "print line".) -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
[Nick Coghlan]
That would be good, in my opinion. I updated PEP 3XX to use this idea: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting exceptions that occur into the template generator's internal frame instead of invoking next().
I'm in favor of the general idea, but would like to separate the error injection and finalization API for generators into a separate PEP, which would then compete with PEP 288 and PEP 325.
Without that it pretty much devolves into the current version of PEP 343, though (as far as I can tell, the two PEPs now agree on the semantics of with statements).
I think the API *should* be made public if it is available internally; I don't see any implementation reasons why it would simplify the implementation if it was only available internally.
If it's internal, we don't need to name it immediately - working out the public API can then be decoupled from the ability to use generators to write resource managers.
Here are some issues to resolve in such a PEP.
- What does _inject_exception() return? It seems that it should raise the exception that was passed into it, but I can't find this written out explicitly.
That's a good point. The intent was for it to be equivalent to the exception being reraised at the point of the last yield. At that point, control flows like it would for a call to next() with code inside the generator that looked like:
    yield
    exc_type, value, tb = _passed_in_exception()
    raise exc_type, value, tb
- What should happen if a generator, in a finally or except clause reached from _inject_exception(), executes another yield? I'm tempted to make this a legitimate outcome (from the generator's perspective) and reserve it for some future statement that implements the looping semantics of PEP 340; the *_template decorator's wrapper class however should consider it an error, just like other protocol mismatches like not yielding the first time or yielding more than once in response to next(). Nick's code in fact does all this right, I just think it should be made explicit.
Yep, that was the intent - you're correct that describing it in the text as well would make it clear that this is deliberate.
- TerminateIteration is a lousy name, since "terminate" means about the same as "stop", so there could be legitimate confusion with StopIteration. In PEP 340 I used StopIteration for this purpose, but someone explained this was a poor choice since existing generators may contain code that traps StopIteration for other purposes. Perhaps we could use SystemExit for this purpose? Pretty much everybody is supposed to let this one pass through since its purpose is only to allow cleanup upon program (or thread) exit.
Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it occurs while a generator is being finalised? Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then well-behaved code wouldn't trap it accidentally, and the finalisation code wouldn't?
- I really don't like reusing __del__ as the method name for any kind of destructor; __del__ has all sorts of special semantics (the GC treats objects with a __del__ method specially).
I guess if file objects can live without a __del__ method to automatically close, generators that require finalisation can survive without it.
- The all_lines() example jars me. Somehow it bugs me that its caller has to remember to care about finalizing it.
The alternative is to have some form of automatic finalisation in for loops, or else follow up on PJE's idea of making with statements allow pretty much *anything* to be used as VAR1. Automatic finalisation (like that now described in the Rejected Options section of my PEP) makes iterators/generators that manage resources 'just work' (nice) but complicates the semantics of generators (bad) and for loops (bad). The current version of my PEP means you need to know that a particular generator/iterator needs finalisation, and rearrange code to cope with that (bad), but keeps for loops simple (nice). PJE's idea about a permissive with statement may actually give the best of both worlds - you can *always* use the with statement, and if the supplied object doesn't need cleaning up, there's no more overhead than checking for the existence of methods in a couple of slots. (See my answer to Phillip on that topic for the gory details)
Maybe it's just not a good example;
I was having trouble thinking of a case where it made sense for the generator to manage its own resources. I agree all_lines isn't a great example, but it's a safe bet that things like it will be written once the restriction on yielding inside try/finally is lifted. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it occurs while a generator is being finalised?
Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then well-behaved code wouldn't trap it accidentally, and the finalisation code wouldn't?
... inadvertently suppress a real SystemExit. Cheers, Nick. Must have been distracted halfway through that sentence :) -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

[Guido van Rossum]
I'm in favor of the general idea, but would like to separate the error injection and finalization API for generators into a separate PEP, which would then compete with PEP 288 and PEP 325.
[Nick Coghlan]
Without that it pretty much devolves into the current version of PEP 343, though (as far as I can tell, the two PEP's now agree on the semantics of with statements)
But that's okay, right? We can just accept both PEPs and we're done. (And PEP 342 at the same time. :-) Discussing one PEP at a time is sometimes easier, even if they are meant to be used together.
I think the API *should* be made public if it is available internally; I don't see any implementation reasons why it would simplify the implementation if it was only available internally.
If it's internal, we don't need to name it immediately - working out the public API can then be decoupled from the ability to use generators to write resource managers.
That's a minor point -- it's not like generators have a namespace for the user that we don't want to pollute, so we can pick any name we like.
Here are some issues to resolve in such a PEP.
- What does _inject_exception() return? It seems that it should raise the exception that was passed into it, but I can't find this written out explicitly.
That's a good point. The intent was for it to be equivalent to the exception being reraised at the point of the last yield. At that point, control flows like it would for a call to next() with code inside the generator that looked like:
yield
    exc_type, value, tb = _passed_in_exception()
    raise exc_type, value, tb
So, also, _inject_exception() will appear to raise the exception that was passed to it, unless the generator catches it.
- What should happen if a generator, in a finally or except clause reached from _inject_exception(), executes another yield? I'm tempted to make this a legitimate outcome (from the generator's perspective) and reserve it for some future statement that implements the looping semantics of PEP 340; the *_template decorator's wrapper class however should consider it an error, just like other protocol mismatches like not yielding the first time or yielding more than once in response to next(). Nick's code in fact does all this right, I just think it should be made explicit.
Yep, that was the intent - you're correct that describing it in the text as well would make it clear that this is deliberate.
OK. Please update the PEP then!
- TerminateIteration is a lousy name, since "terminate" means about the same as "stop", so there could be legitimate confusion with StopIteration. In PEP 340 I used StopIteration for this purpose, but someone explained this was a poor choice since existing generators may contain code that traps StopIteration for other purposes. Perhaps we could use SystemExit for this purpose? Pretty much everybody is supposed to let this one pass through since its purpose is only to allow cleanup upon program (or thread) exit.
Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it occurs while a generator is being finalised?
D'oh. Yes.
Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then well-behaved code wouldn't trap [SystemExit] accidentally, and the finalisation code wouldn't?
Nah, I think it needs to be a brand spanking new exception. It just can't be called TerminateIteration. How about GeneratorFinalization to be utterly clear?
- I really don't like reusing __del__ as the method name for any kind of destructor; __del__ has all sorts of special semantics (the GC treats objects with a __del__ method specially).
I guess if file objects can live without a __del__ method to automatically close, generators that require finalisation can survive without it.
Right. At the C level there is of course finalization (the tp_dealloc slot in the PyTypeObject struct) but it doesn't have a Python entry point to call it. And that's intentional, since the typical code executed by that slot *really* destroys the object and must only be called when the VM is absolutely sure that the object can't be reached in any other way. If it were callable from Python, that guarantee would be void.
- The all_lines() example jars me. Somehow it bugs me that its caller has to remember to care about finalizing it.
The alternative is to have some form of automatic finalisation in for loops, or else follow up on PJE's idea of making with statements allow pretty much *anything* to be used as VAR1.
Automatic finalisation (like that now described in the Rejected Options section of my PEP) makes iterators/generators that manage resources 'just work' (nice) but complicates the semantics of generators (bad) and for loops (bad).
The current version of my PEP means you need to know that a particular generator/iterator needs finalisation, and rearrange code to cope with that (bad), but keeps for loops simple (nice).
I agree with that so far.
PJE's idea about a permissive with statement may actually give the best of both worlds - you can *always* use the with statement, and if the supplied object doesn't need cleaning up, there's no more overhead than checking for the existence of methods in a couple of slots. (See my answer to Phillip on that topic for the gory details)
But I don't like that solution -- and it's up to you to explain why (see previous exchange between Phillip & me :-).
Maybe it's just not a good example;
I was having trouble thinking of a case where it made sense for the generator to manage its own resources. I agree all_lines isn't a great example, but it's a safe bet that things like it will be written once the restriction on yielding inside try/finally is lifted.
I'd rather drop the example than make one up that's questionable. It's better to look for potential use cases in existing code than just look at the proposed construct and ponder "how could I use this..." -- existing code showing a particular idiom/pattern being used repeatedly shows that there's an actual need. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
PEP 340 is still my favorite, but it seems there's too much opposition to it,
I'm not opposed to PEP 340 in principle, but the ramifications seemed to be getting extraordinarily complicated, and it seems to be hamstrung by various backwards-compatibility constraints. E.g. it seems we can't make for-loops work the way they should in the face of generator finalisation or we break old code. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

Greg Ewing <greg.ewing@canterbury.ac.nz> writes:
Guido van Rossum wrote:
PEP 340 is still my favorite, but it seems there's too much opposition to it,
I'm not opposed to PEP 340 in principle, but the ramifications seemed to be getting extraordinarily complicated, and it seems to be hamstrung by various backwards-compatibility constraints. E.g. it seems we can't make for-loops work the way they should in the face of generator finalisation or we break old code.
I think I zoned this part of the discussion out, but I've written code like this:
    lineiter = iter(aFile)
    for line in lineiter:
        if sectionmarker in line:
            break
        parseSection1Line(line)
    for line in lineiter:
        if sectionmarker in line:
            break
        parseSection2Line(line)
(though, not *quite* that regular...) This is, to me, neat and clear. I don't find the idea that iterators are tied to exactly 1 for loop an improvement (even though they usually will be). Cheers, mwh -- <thirmite> what's a web widget?? <glyph> thirmite: internet on a stick, on fire <Acapnotic> with web sauce! -- from Twisted.Quotes

Michael Hudson wrote:
This is, to me, neat and clear. I don't find the idea that iterators are tied to exactly 1 for loop an improvement (even though they usually will be).
To fix this in a fully backward-compatible way, we need some way of distinguishing generators that expect to be finalized. Suppose we leave the 'yield' statement alone, and introduce a new statement 'suspend', which alone has the new capabilities of (1) allowing injection of exceptions, (2) the ability to return a value, and (3) permissibility in a try-finally. Doing throw() on a generator that is stopped at a yield statement would simply raise the exception without changing the state of the generator. So the for-loop could be made to finalize by default, and existing generators would be unaffected. A with-statement generator would then look like:
    @with_template
    def foo():
        initialize()
        try:
            suspend
        finally:
            finalize()
which I think looks quite nice, because 'suspend' seems more suggestive of what is happening when you're not yielding a value. The same thing applies to coroutine-type applications. For partial iteration of new-style generators, there could be a new statement:
    for var from expr:
        ...
or maybe just a wrapper function:
    for var in partial(expr):
        ...
-- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

On 5/19/05, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Michael Hudson wrote:
This is, to me, neat and clear. I don't find the idea that iterators are tied to exactly 1 for loop an improvement (even though they usually will be).
To fix this in a fully backward-compatible way, we need some way of distinguishing generators that expect to be finalized.
I don't see anything that needs to be "fixed" here. Sure, generators that expect to be finalised will not be finalised simply by the fact that a for loop exits, but that's fine - it's not part of the spec of a for loop that it does finalise the generator. Adding that guarantee to a for loop is a change in spec, not a fix. Paul.

At 04:44 PM 5/19/2005 +1200, Greg Ewing wrote:
Michael Hudson wrote:
This is, to me, neat and clear. I don't find the idea that iterators are tied to exactly 1 for loop an improvement (even though they usually will be).
To fix this in a fully backward-compatible way, we need some way of distinguishing generators that expect to be finalized.
No, we don't; Guido's existing proposal is quite sufficient for using yield. We don't need to create another old/new distinction here.
participants (17)
- Arnold deVos
- Brett C.
- Eric Nieuwland
- Fredrik Lundh
- Greg Ewing
- Guido van Rossum
- Jack Diederich
- Josiah Carlson
- Ka-Ping Yee
- Michael Hudson
- Nick Coghlan
- Paul Moore
- Phillip J. Eby
- Ron Adam
- Russell E. Owen
- Shane Hathaway
- Steven Bethard