Merging PEP 310 and PEP 340-redux?

Apologies if this has been discovered and rejected already; I've had to skip most of the discussions but this thought won't leave my head... So PEP 310 proposes this:

    with VAR = EXPR:
        BLOCK

translated to

    VAR = EXPR
    if hasattr(VAR, "__enter__"):
        VAR.__enter__()
    try:
        BLOCK
    finally:
        VAR.__exit__()

This equates VAR with the value of EXPR. It has a problem: what if inside BLOCK an assignment to VAR is made -- does this affect the finally clause or not? I think that the finally clause should use an internal variable that isn't affected by assignments to VAR. But what if we changed the translation slightly so that VAR gets assigned the value of the __enter__() call:

    abc = EXPR
    VAR = abc.__enter__()  # I don't see why it should be optional
    try:
        BLOCK
    finally:
        abc.__exit__()

Now it would make more sense to change the syntax to

    with EXPR as VAR:
        BLOCK

and we have Phillip Eby's proposal. The advantage of this is that you can write a relatively straightforward decorator, call it @with_template, that endows a generator with the __enter__ and __exit__ methods, so you can write all the examples (except auto_retry(), which was there mostly to make a point) from PEP 340 like this:

    @with_template
    def opening(filename, mode="r"):
        f = open(filename, mode)
        yield f
        f.close()

and so on. (Note the absence of a try/finally block in the generator -- the try/finally is guaranteed by the with-statement but not by the generator framework.)

I used to dislike this, but the opposition and the proliferation of alternative proposals have made me realize that I'd rather have this (plus "continue EXPR", which will be moved to a separate PEP once I have some extra time) than most of the other proposals.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum <gvanrossum@gmail.com> writes:
Apologies if this has been discovered and rejected already; I've had to skip most of the discussions but this thought won't leave my head...
So PEP 310 proposes this:
with VAR = EXPR: BLOCK
translated to
    VAR = EXPR
    if hasattr(VAR, "__enter__"):
        VAR.__enter__()
    try:
        BLOCK
    finally:
        VAR.__exit__()
This equates VAR with the value of EXPR. It has a problem: what if inside BLOCK an assignment to VAR is made -- does this affect the finally clause or not? I think that the finally clause should use an internal variable that isn't affected by assignments to VAR.
Uh, if that's not clear from the PEP (and I haven't looked) it's an oversight. VAR is optional in PEP 310, after all. Cheers, mwh -- There's an aura of unholy black magic about CLISP. It works, but I have no idea how it does it. I suspect there's a goat involved somewhere. -- Johann Hibschman, comp.lang.scheme

Guido van Rossum wrote:
Apologies if this has been discovered and rejected already; I've had to skip most of the discussions but this thought won't leave my head...
So PEP 310 proposes this:
with VAR = EXPR: BLOCK
translated to
    VAR = EXPR
    if hasattr(VAR, "__enter__"):
        VAR.__enter__()
    try:
        BLOCK
    finally:
        VAR.__exit__()
I used to dislike this, but the opposition and the proliferation of alternative proposals have made me realize that I'd rather have this (plus "continue EXPR" which will be moved to a separate PEP once I have some extra time) than most of the other proposals.
The User Defined Statement section of my PEP redraft suggests something very similar to this: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html It suggests more complex semantics, so that the statement template has the chance to intercept exceptions raised in the block, and can tell the difference between normal termination and exiting the block via break, continue or return statements. This is needed to support some of the use cases (like the transaction() template). All of the PEP 340 examples are written up at the end of the PEP redraft, along with some of the motivating cases for a non-looping construct. (Ignore the part in the redraft about for loops for the moment - Greg Ewing has convinced me that what I currently have gets the default behaviour backwards. And, in relation to that, the next version will require a decorator to enable __enter__() and __exit__() methods on a given generator). Regards, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

[Nick]
The User Defined Statement section of my PEP redraft suggests something very similar to this: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
It suggests more complex semantics, so that the statement template has the chance to intercept exceptions raised in the block, and can tell the difference between normal termination and exiting the block via break, continue or return statements. This is needed to support some of the use cases (like the transaction() template). All of the PEP 340 examples are written up at the end of the PEP redraft, along with some of the motivating cases for a non-looping construct.
Is that use case strong enough to require the added complexity? For a transactional wrapper, I can see that __exit__ needs to know about exceptions (though I'm unsure how much detail it needs), but what's the point of being able to tell an exception from a non-local goto (which break, continue and return really are)? I could see the following, minimal translation:

    oke = False
    abc = EXPR
    var = abc.__enter__()
    try:
        BLOCK
        oke = True
    finally:
        abc.__exit__(oke)

What's your use case for giving __enter__ an opportunity to skip the block altogether?

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
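A rough sketch of how a transactional wrapper might use that single flag under this minimal translation (illustration only; the db object and its begin()/commit()/rollback() methods are assumed, not taken from any PEP):

    class transactional(object):
        def __init__(self, db):
            self.db = db
        def __enter__(self):
            self.db.begin()          # assumed API on the db object
            return self.db
        def __exit__(self, oke):
            if oke:
                self.db.commit()     # BLOCK ran to completion
            else:
                self.db.rollback()   # BLOCK raised or was left early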

Guido van Rossum wrote:
[Nick] Is that use case strong enough to require the added complexity? For a transactional wrapper, I can see that __exit__ needs to know about exceptions (though I'm unsure how much detail it needs), but what's the point of being able to tell an exception from a non-local goto (which break, continue and return really are)?
The only real reason the statement template can tell the difference is because those non-local goto's all result in TerminateBlock being passed in as the exception (that's why the __exit__ method can't really tell the difference between those statements and the user code raising TerminateBlock, and also why TerminateBlock can't be suppressed by the statement template). As far as use cases go, any case where we want the statement template to be able to manipulate the exception handling requires that this information be passed in to the __exit__() method. The current examples given in the PEP are transaction() and auto_retry(), but others have been suggested over the course of the discussion. One suggestion was for automatically logging exceptions, which requires access to the full state of the current exception. I go into the motivation behind this a bit more in the updated draft I just posted (version 1.3). The basic idea is to allow factoring out of arbitrary try/except/else/finally code into a statement template, and use a 'do' statement to provide the contents of the 'try' clause. If the exception information isn't passed in, then we can really only factor out try/finally statements, which is far less interesting.
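A minimal sketch of such an exception-logging template, assuming the __exit__() method receives the sys.exc_info() triple when the block raises (the class name and details are illustrative, not from the PEP draft):

    import traceback

    class logged(object):
        def __enter__(self):
            return self
        def __exit__(self, *exc_info):
            if exc_info:
                # the full (type, value, traceback) state is available here
                traceback.print_exception(exc_info[0], exc_info[1], exc_info[2])
                raise exc_info[0], exc_info[1], exc_info[2]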
What's your use case for giving __enter__ an opportunity to skip the block altogether?
I realised that I don't have one - so the idea doesn't appear in the updated draft. Regards, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

On May 9, 2005, at 21:58, Guido van Rossum wrote:
Apologies if this has been discovered and rejected already; I've had to skip most of the discussions but this thought won't leave my head...
Skimming rather than skipping all of the discussion burned most of my py-dev time, and it was just skimming, but I don't remember such rejections.
But what if we changed the translation slightly so that VAR gets assigned the value of the __enter__() call:
    abc = EXPR
    VAR = abc.__enter__()  # I don't see why it should be optional
    try:
        BLOCK
    finally:
        abc.__exit__()
Now it would make more sense to change the syntax to
with EXPR as VAR: BLOCK
and we have Phillip Eby's proposal. The advantage of this is that you
I like this. The only aspect of other proposals that I would sorely miss here, would be the inability for abc.__exit__ to deal with exceptions raised in BLOCK (or, even better, a separate specialmethod on abc called in lieu of __exit__ upon exceptions). Or am I missing something, and would this give a way within abc.__exit__ to examine and possibly ``unraise'' such an exception...?
can write a relatively straightforward decorator, call it @with_template, that endows a generator with the __enter__ and __exit__ methods, so you can write all the examples (except auto_retry(), which was there mostly to make a point) from PEP 340 like this:
    @with_template
    def opening(filename, mode="r"):
        f = open(filename, mode)
        yield f
        f.close()
and so on. (Note the absence of a try/finally block in the generator -- the try/finally is guaranteed by the with-statement but not by the generator framework.)
I must be thick this morning, because this relatively straightforward decorator isn't immediately obvious to me -- care to show me how with_template gets coded? Alex

[Alex]
I like this. The only aspect of other proposals that I would sorely miss here, would be the inability for abc.__exit__ to deal with exceptions raised in BLOCK (or, even better, a separate specialmethod on abc called in lieu of __exit__ upon exceptions). Or am I missing something, and would this give a way within abc.__exit__ to examine and possibly ``unraise'' such an exception...?
See my followup to Nick. I'm not worried about unraising exceptions. The only way to mess with the exception from code in a finally-suite is to raise another exception, and we can't really prevent that. However (I forgot this in the response to Nick) unless/until we augment generators in some way the generator can't easily see the exception flag. [me]
can write a relatively straightforward decorator, call it @with_template, that endows a generator with the __enter__ and __exit__ methods, so you can write all the examples (except auto_retry(), which was there mostly to make a point) from PEP 340 like this:
    @with_template
    def opening(filename, mode="r"):
        f = open(filename, mode)
        yield f
        f.close()
and so on. (Note the absence of a try/finally block in the generator -- the try/finally is guaranteed by the with-statement but not by the generator framework.)
[Alex]
I must be thick this morning, because this relatively straightforward decorator isn't immediately obvious to me -- care to show me how with_template gets coded?
Here's a sketch:

    class Wrapper(object):

        def __init__(self, gen):
            self.gen = gen
            self.state = "initial"

        def __enter__(self):
            assert self.state == "initial"
            self.state = "entered"
            try:
                return self.gen.next()
            except StopIteration:
                self.state = "error"
                raise RuntimeError("template generator didn't yield")

        def __exit__(self):
            assert self.state == "entered"
            self.state = "exited"
            try:
                self.gen.next()
            except StopIteration:
                return
            else:
                self.state = "error"
                raise RuntimeError("template generator didn't stop")

    def with_template(func):
        def helper(*args, **kwds):
            return Wrapper(func(*args, **kwds))
        return helper

    @with_template
    def opening(filename, mode="r"):
        f = open(filename, mode)  # Note that IOError here is untouched by Wrapper
        yield f
        f.close()  # Ditto for errors here (however unlikely)

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
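Just to exercise the sketch by hand (since the with/do statement doesn't exist yet), the calls the proposed statement would make can be written out explicitly; any readable file name will do here:

    mgr = opening("/etc/passwd")
    f = mgr.__enter__()
    try:
        data = f.read()   # the body of the would-be with/do block
    finally:
        mgr.__exit__()    # runs the rest of the generator, closing the file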

At 07:57 AM 5/10/2005 -0700, Alex Martelli wrote:
On May 9, 2005, at 21:58, Guido van Rossum wrote:
But what if we changed the translation slightly so that VAR gets assigned the value of the __enter__() call:
    abc = EXPR
    VAR = abc.__enter__()  # I don't see why it should be optional
    try:
        BLOCK
    finally:
        abc.__exit__()
Now it would make more sense to change the syntax to
with EXPR as VAR: BLOCK
and we have Phillip Eby's proposal. The advantage of this is that you
I like this. The only aspect of other proposals that I would sorely miss here, would be the inability for abc.__exit__ to deal with exceptions raised in BLOCK (or, even better, a separate specialmethod on abc called in lieu of __exit__ upon exceptions). Or am I missing something, and would this give a way within abc.__exit__ to examine and possibly ``unraise'' such an exception...?
Yeah, I'd ideally like to see __try__, __except__, __else__, and __finally__ methods, matching the respective semantics of those clauses in a try/except/finally block.
can write a relatively straightforward decorator, call it @with_template, that endows a generator with the __enter__ and __exit__ methods, so you can write all the examples (except auto_retry(), which was there mostly to make a point) from PEP 340 like this:
    @with_template
    def opening(filename, mode="r"):
        f = open(filename, mode)
        yield f
        f.close()
and so on. (Note the absence of a try/finally block in the generator -- the try/finally is guaranteed by the with-statement but not by the generator framework.)
I must be thick this morning, because this relatively straightforward decorator isn't immediately obvious to me -- care to show me how with_template gets coded?
Something like this, I guess:

    def with_template(f):
        class controller(object):
            def __init__(self, *args, **kw):
                self.iter = f(*args, **kw)
            def __enter__(self):
                return self.iter.next()
            def __exit__(self):
                self.iter.next()
        return controller

But I'd rather see it with __try__/__except__ and passing exceptions into the generator so that the generator can use try/except/finally blocks to act on the control flow.

[Phillip J. Eby]
Yeah, I'd ideally like to see __try__, __except__, __else__, and __finally__ methods, matching the respective semantics of those clauses in a try/except/finally block.
What's your use case for adding this complexity? I'm going for simple here unless there's a really strong use case. Anyway, a wrapped generator wrapper can't do with all those distinctions unless we augment the generator somehow ("continue EXPR" would suffice). (Your decorator is equivalent to mine, but I don't like creating a new class each time.) -- --Guido van Rossum (home page: http://www.python.org/~guido/)

At 08:51 AM 5/10/2005 -0700, Guido van Rossum wrote:
[Phillip J. Eby]
Yeah, I'd ideally like to see __try__, __except__, __else__, and __finally__ methods, matching the respective semantics of those clauses in a try/except/finally block.
What's your use case for adding this complexity?
It makes it much easier to mentally translate a given try/except/else/finally usage to a "resource" or "block controller" or whatever it is we're calling these things now. You should almost be able to just stick 'def __' and '__(self):' and then munge the whole thing into a class. Of course, it's not *really* that simple, because __try__ doesn't exactly correspond to 'try:', and it has to return something, but it sure is simpler than the mental gymnastics I'd go through to convert except/else/finally into "if" statements inside an __exit__.

Granted, if we end up with __enter__ and __exit__, I'll just write a resource mixin class whose __exit__ calls a stubbed-out __except__, __else__, and __finally__. Then I won't have to figure out how to write __exit__ methods all the time. Which is great for me, but I was thinking that this interface would reduce complexity for people trying to learn how to write these things.

I wasn't advocating this before because PEP 340's use of generators allowed you to directly use try/except/else/finally. But, the new approach seems targeted at a wider set of use cases that don't include generators. IOW, it's likely you'll be adding resource-finalization methods to actual resource classes, and grafting generators into them to implement __enter__/__exit__ seems more complex at first glance than just letting people add the methods directly; e.g.:

    def __enter__(self):
        self.handler = self._resource_control()
        return self.handler.__enter__()

    def __exit__(self):
        self.handler.__exit__()

    @with_template
    def _resource_control(self):
        f = self.open("blah")
        try:
            yield f
        finally:
            f.close()

versus this rather more "obvious way" to do it:

    def __try__(self):
        self.f = self.open("blah")
        return self.f

    def __finally__(self):
        self.f.close()

But I suppose you could encapsulate either pattern as a mixin class, so I suppose this could be treated as a matter for examples in documentation rather than as an implementation aspect. It's just that if __exit__ has to probe exception state or other wizardry, it's going to be harder for non-wizards to use, and that's what I was reacting to here. Anyway, I see now that documentation and simple mixins could address it, so if you think it's best handled that way, so be it.
I'm going for simple here unless there's a really strong use case. Anyway, a wrapped generator wrapper can't do with all those distinctions unless we augment the generator somehow ("continue EXPR" would suffice).
You'd use only __try__, __except__, and __else__ to wrap a generator. For some other use cases you'd only use __try__ and __finally__, or __try__ and __except__, or __try__ and __else__. I don't know of any use cases where you'd want to use all four simultaneously on the same controller.
(Your decorator is equivalent to mine, but I don't like creating a new class each time.)
Mine was just a sketch to show the idea anyway; I'd be surprised if it doesn't have at least one bug.

Phillip J. Eby wrote:
Of course, it's not *really* that simple, because __try__ doesn't exactly correspond to 'try:', and it has to return something, but it sure is simpler than the mental gymnastics I'd go through to convert except/else/finally into "if" statements inside an __exit__.
You don't need to make that translation, though. Instead, you can just reraise the passed in exception inside the __exit__() method:

    def __exit__(self, *exc_info):
        try:
            try:
                if exc_info:
                    raise exc_info[0], exc_info[1], exc_info[2]
            except:
                pass
            else:
                pass
        finally:
            pass

However, the __exit__() method does allow you to switch to using if statements if that makes more sense (or would be more efficient). For instance, these are possible __exit__ methods for a locking() statement template and a transaction() statement template:

    # locking's exit method
    def __exit__(self, *exc_info):
        self.lock.release()
        if exc_info:
            raise exc_info[0], exc_info[1], exc_info[2]

    # transaction's exit method
    def __exit__(self, *exc_info):
        if exc_info:
            self.db.rollback()
            raise exc_info[0], exc_info[1], exc_info[2]
        else:
            self.db.commit()

I've posted draft 1.4 of my PEP 310/PEP 340 merger PEP (PEP 650, maybe?): http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

This version cleans up the semantics a lot, so that the examples actually work as intended, and there is One Obvious Way to do things like suppressing exceptions (i.e. don't reraise them in the __exit__() method). It also specifically addresses the question of using two methods in the protocol versus four, and shows how an arbitrary try statement can be converted to a statement template.

Cheers,
Nick.

--
Nick Coghlan   |   ncoghlan@gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

On 5/11/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
I've posted draft 1.4 of my PEP 310/PEP 340 merger PEP (PEP 650, maybe?): http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
I've been skipping the discussion, but this is starting to look pretty good. I'll give it a proper read soon. However, one thing immediately struck me: if __exit__ gets an exception and does not re-raise it, it is silently ignored. This seems like a bad idea - the usual "errors should not pass silently" applies. I can very easily imagine statement templates accidentally eating KeyboardInterrupt or SystemExit exceptions. At the very least, there should be a section in "rejected alternatives" explaining why it is not appropriate to force reraising of exceptions unless explicit action is taken. There could be good reasons (as I say, I haven't followed the discussion) but they should be recorded. And if there aren't any good reasons, this behaviour should probably be changed. Paul. PS Apologies if I missed the discussion of this in the PEP - as I say, I've only skimmed it so far.

Paul Moore wrote:
At the very least, there should be a section in "rejected alternatives" explaining why it is not appropriate to force reraising of exceptions unless explicit action is taken. There could be good reasons (as I say, I haven't followed the discussion) but they should be recorded. And if there aren't any good reasons, this behaviour should probably be changed.
This is a good point - it's one of the things I changed in the simplification of the semantics between 1.3 and 1.4 (previously the behaviour was much as you describe). The gist is that the alternative is to require an __exit__() method to raise TerminateBlock in order to suppress an exception. Then the call to __exit__() in the except clause needs to be wrapped in a try/except TerminateBlock/else, with the else reraising the exception, and the except clause suppressing it. The try/except around BLOCK1 has to have a clause added to reraise TerminateBlock so that it isn't inadvertently suppressed when it is raised by user code (although that may be a feature, not a bug! - I'll have to think about that one) Basically, it's possible to set it up that way, but it was a little ugly and hard to explain. "It's suppressed if you don't reraise it" is very simple, but makes it easier (too easy?) to inadvertently suppress exceptions. If people are comfortable with a little extra ugliness in the semantic details of 'do' statements in order to make it easier to write correct __exit__() methods, then I'm happy to change it back (I think I just talked myself into doing that, actually). Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

On 5/11/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
The gist is that the alternative is to require an __exit__() method to raise TerminateBlock in order to suppress an exception.
So I didn't see any examples that really needed TerminateBlock to suppress an exception. If the semantics of a do-statement was simply:

    stmt = EXPR1
    try:
        stmt_enter = stmt.__enter__
        stmt_exit = stmt.__exit__
    except AttributeError:
        raise TypeError("User defined statement template required")

    VAR1 = stmt_enter()  # Omit 'VAR1 =' if no 'as' clause
    exc = ()
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)

would this make any of the examples impossible to write? All you have to do to suppress an exception is to not reraise it in __exit__. These semantics would make a normally completed BLOCK1 look like a BLOCK1 exited through return, break or continue, but do we have any use cases that actually need this distinction? I couldn't see any, but I've been reading *way* too many PEP 310/340 posts so probably my eyes are just tired. ;-)

If you want the default to be that the exception gets re-raised (instead of being suppressed as it is above), I think you could just change the finally block to something like:

    finally:
        if stmt_exit(*exc):
            raise exc[0], exc[1], exc[2]

That would mean that if any nonzero object was returned from __exit__, the exception would be reraised. But, like I say, I've been reading this thread way too much, so I'm probably just missing the TerminateBlock use cases. =)

STeVe

--
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy
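Under those semantics, a template that deliberately suppresses a particular exception type would simply not reraise it; a rough sketch (my illustration, with made-up names):

    class suppressing(object):
        def __init__(self, exc_type):
            self.exc_type = exc_type
        def __enter__(self):
            return self
        def __exit__(self, *exc):
            # exc is () on a clean exit, or the sys.exc_info() triple if BLOCK1 raised
            if exc and not issubclass(exc[0], self.exc_type):
                raise exc[0], exc[1], exc[2]   # not ours: let it propagate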

On 5/11/05, Steven Bethard <steven.bethard@gmail.com> wrote:
If you want the default to be that the exception gets re-raised (instead of being suppressed as it is above), I think you could just change the finally block to something like:
    finally:
        if stmt_exit(*exc):
            raise exc[0], exc[1], exc[2]
That would mean that if any nonzero object was returned from __exit__, the exception would be reraised.
Oops. This should have been:

    finally:
        if not stmt_exit(*exc):
            raise exc[0], exc[1], exc[2]

This would mean that if a function returned None (or any other "False" object) the exception would be reraised. Suppressing the reraising of the exception would require returning a nonzero object.

STeVe

--
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

[Steven Bethard]
So I didn't see any examples that really needed TerminateBlock to suppress an exception. If the semantics of a do-statement was simply:
    stmt = EXPR1
    try:
        stmt_enter = stmt.__enter__
        stmt_exit = stmt.__exit__
    except AttributeError:
        raise TypeError("User defined statement template required")
    VAR1 = stmt_enter()  # Omit 'VAR1 =' if no 'as' clause
    exc = ()
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)
would this make any of the examples impossible to write? All you have to do to suppress an exception is to not reraise it in __exit__.
But this use case would contain a trap for the unwary user who is writing an __exit__ method -- they have to remember to re-raise an exception if it was passed in, but that's easy to forget (and slightly tricky since you have to check the arg count or whether the first argument is not None).

Going for all-out simplicity, I would like to be able to write these examples:

    class locking:
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
        def __exit__(self, *args):
            self.lock.release()

    class opening:
        def __init__(self, filename):
            self.filename = filename
        def __enter__(self):
            self.f = open(self.filename)
            return self.f
        def __exit__(self, *args):
            self.f.close()

And

    do EXPR as VAR:
        BLOCK

would mentally be translated into

    itr = EXPR
    VAR = itr.__enter__()
    try:
        BLOCK
    finally:
        itr.__exit__(*sys.exc_info())  # Except sys.exc_info() isn't defined by finally

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
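Expanded by hand, the locking example would read roughly as below under that mental translation (illustration only; threading.Lock stands in for whatever lock object is actually used, and __exit__ is called without arguments since the class above accepts *args):

    import threading

    mylock = threading.Lock()

    # "do locking(mylock): BLOCK", expanded by hand
    itr = locking(mylock)
    itr.__enter__()      # locking.__enter__ returns None, so no 'as VAR' is needed
    try:
        pass             # BLOCK: code that needs the lock held
    finally:
        itr.__exit__()   # the lock is released however BLOCK is left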

I think this is my first post to python-dev, so a mini-intro: I've been lurking here for about 5 years, "professional" user a bit longer, now working at Zope Corp. Guido van Rossum wrote:
Going for all-out simplicity, I would like to be able to write these examples:
    class locking:
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
        def __exit__(self, *args):
            self.lock.release()
    class opening:
        def __init__(self, filename):
            self.filename = filename
        def __enter__(self):
            self.f = open(self.filename)
            return self.f
        def __exit__(self, *args):
            self.f.close()
I've read the entire discussion, but may have missed this point, so correct me if I'm wrong. Wouldn't these semantics let "normal" objects be used in a do statement? For example, if the file object had these methods:

    def __enter__(self):
        return self
    def __exit__(self, *args):
        self.close()

you could write

    do file('whatever') as f:
        lines = f.readlines()

Or a lock:

    def __enter__(self):
        self.acquire()
        return self
    def __exit__(self, *args):
        self.release()

    do my_lock:
        a()
        b()
        c()

--
Benji York
Sr. Software Engineer
Zope Corporation

On 5/12/05, Benji York <benji@zope.com> wrote:
if the file object had these methods:
    def __enter__(self):
        return self
    def __exit__(self, *args):
        self.close()
you could write
    do file('whatever') as f:
        lines = f.readlines()
Or a lock:
    def __enter__(self):
        self.acquire()
        return self
    def __exit__(self, *args):
        self.release()
    do my_lock:
        a()
        b()
        c()
Ah, finally a proposal that I can understand! But maybe the keyword should be "let":

    let lock:
        do_something

    let open("myfile") as f:
        for line in f:
            do_something(line)

or even, without need of "as":

    let f = file("myfile"):
        for line in f:
            do_something(line)

which I actually like more

Michele Simionato

Michele Simionato wrote:
    let lock:
        do_something
let open("myfile") as f: for line in f: do_something(line)
This is getting even further into the realm of gibberish to my ear.
let f=file("myfile") : for line in f: do_something(line)
To anyone with a Lisp or functional background, that looks like nothing more than a local variable binding. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

I just read Raymond Chen's rant against control flow macros: http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx

I think this pretty much kills PEP 340, as well as Nick Coghlan's alternative: both proposals let you write a "template" that can be used to hide exception-catching code, which is a form of control flow (and a pretty important one if you read Chen's rant against exceptions referenced by the former, even if you don't agree with everything he says in the latter).

Which leaves us, IMO, with the choice between PEP 310 and my own "PEP-340-redux" proposal; these *only* introduce a finally-clause, which does not affect the control flow. I'm not counting exceptions that might happen in the finally-clause; exceptions can happen anywhere anyway. But I am counting the *catching* of an exception as control flow, since that means that code past BLOCK (in the same function) is reachable even if BLOCK was not executed to completion; and this is the argument against PEP 340 and against Nick's alternative.

Let's compare and contrast the two remaining competitors:

PEP 310
=======

Syntax:
    with EXPR [= VAR]:
        BLOCK

Translation:
    [VAR =] abc = EXPR
    if hasattr(abc, "__enter__"):
        abc.__enter__()
    try:
        BLOCK
    finally:
        abc.__exit__()

Pros:
- dead simple

Cons:
- can't use a decorated generator for EXPR

PEP 340 redux
=============

Syntax:
    do EXPR [as VAR]:
        BLOCK

Translation:
    abc = EXPR
    [VAR =] abc.__enter__()
    try:
        BLOCK
    finally:
        abc.__exit__(*"sys.exc_info()")  # Not exactly

Pros:
- can use a decorated generator as EXPR
- separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)

Cons:
- slightly less simple (__enter__ must return something for VAR; __exit__ takes optional args)

Everything else is equal or can be made equal. We can make them more equal by treating the arguments passed to __exit__() as a separate decision, and waffling about whether __enter__() should be optional (I think it's a bad idea even for PEP 310; it *could* be made optional for PEP 340 redux). Let's also not quibble about the keyword used; again, that can be a separate decision. Note that only PEP 310 can use the "VAR = EXPR" syntax; PEP 340 redux *must* use "EXPR as VAR" since it doesn't assign the value of EXPR to VAR; PEP 310 can be rewritten using this syntax as well.

So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity? The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct. From Greg Ewing's response to the proposal to endow file objects with __enter__ and __exit__ methods, I believe he thinks so too.

Straight up-or-down votes in the full senate are appreciated at this point.

On to the secondary questions:

- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB

- I have a more elaborate proposal for __exit__'s arguments.
Let the translation be as follows:

    abc = EXPR
    [VAR =] abc.__enter__()
    oke = False  # Pronounced "okay"
    exc = ()
    try:
        try:
            BLOCK
            oke = True
        except:
            exc = sys.exc_info()
            raise
    finally:
        abc.__exit__(oke, *exc)

This means that __exit__ can be called with the following arguments:

    abc.__exit__(True) - normal completion of BLOCK

    abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)

    abc.__exit__(False, t, v, tb) - BLOCK was left by an exception

(An alternative would be to always call it with 4 arguments, the last three being None in the first two cases.)

If we adopt PEP 340 redux, it's up to the decorator for degenerate generators to decide how to pass this information into the generator; if we adopt PEP 342 ("continue EXPR") at the same time, we can let the yield-expression return a 4-tuple (oke, t, v, tb). Most templates can ignore this information (so they can just use a yield-statement).

PS. I've come up with another interesting use case: block signals for the duration of a block. This could be a function in the signal module, e.g. signal.blocking([list of signals to block]). The list would default to all signals. Similarly signal.ignoring().

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
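For concreteness, a rough sketch of what the suggested signal.ignoring() could look like against this __exit__(oke, *exc) signature (my illustration, not Guido's; main thread only, since that's what signal.signal() requires, and true blocking would need sigprocmask(), which the signal module doesn't expose):

    import signal

    class ignoring(object):
        def __init__(self, *signums):
            self.signums = signums
            self.saved = {}
        def __enter__(self):
            for signum in self.signums:
                # install SIG_IGN, remembering the old handler
                self.saved[signum] = signal.signal(signum, signal.SIG_IGN)
        def __exit__(self, oke, *exc):
            # restore handlers however the block was left; 'oke' is unused here
            for signum, handler in self.saved.items():
                if handler is not None:   # None means "not installed from Python"
                    signal.signal(signum, handler)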

Guido van Rossum <gvanrossum@gmail.com> writes:
I just read Raymond Chen's rant against control flow macros: http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx
I think this pretty much kills PEP 340, as well as Nick Coghlan's alternative: both proposals let you write a "template" that can be used to hide exception-catching code, which is a form of control flow (and a pretty important one if you read Chen's rant against exceptions referenced by the former, even if you don't agree with everything he says in the latter).
Well, I'm not sure what the content of the latter article is, other than "getting things right can be hard". BTW, the "else:" on try statements is so very handy for getting this sort of thing (more) correct.
Which leaves us, IMO, with the choice between PEP 310 and my own "PEP-340-redux" proposal; these *only* introduce a finally-clause, which does not affect the control flow. I'm not counting exceptions that might happen in the finally-clause; exceptions can happen anywhere anyway. But I am counting the *catching* of an exception as control flow, since that means that code past BLOCK (in the same function) is reachable even if BLOCK was not executed to completion; and this is the argument against PEP 340 and against Nick's alternative.
Let's compare and contrast the two remaining competitors:
PEP 310 =======
Syntax: with EXPR [= VAR]: BLOCK
Translation:

    [VAR =] abc = EXPR
    if hasattr(abc, "__enter__"):
        abc.__enter__()
    try:
        BLOCK
    finally:
        abc.__exit__()
Pros: - dead simple
Cons: - can't use a decorated generator for EXPR
Sorry, why not? [note: I work this out, below, but I still think the code is worth posting]

    import sys

    class BlockTemplate(object):
        def __init__(self, g, args, kw):
            self.g = g
            self.args = args
            self.kw = kw
        def __enter__(self):
            self.giter = self.g(*self.args)
            self.giter.next()
        def __exit__(self):
            try:
                self.giter.next()
            except StopIteration:
                pass
            else:
                raise RuntimeError, "generator not exhausted"

    def template(g):
        def _(*args, **kw):
            return BlockTemplate(g, args, kw)
        return _

    @template
    def redirected_stdout(out):
        print 'hi'
        save_stdout = sys.stdout
        sys.stdout = out
        yield None
        sys.stdout = save_stdout
        print 'ho'

    ## with redirected_stdout(fileobj):
    ##     print 1

    output = open("foo", "w")

    abc = redirected_stdout(output)
    abc.__enter__()
    try:
        print 1
    finally:
        abc.__exit__()
    output.close()

    print repr(open("foo").read())

(this was a bit harder to get right than I expected, mind).

Oh, I guess the point is that with a decorated generator you can yield a value to be used as VAR, rather than just discarding the value as here. Hmm.
PEP 340 redux =============
Syntax: do EXPR [as VAR]: BLOCK
Translation:

    abc = EXPR
    [VAR =] abc.__enter__()
    try:
        BLOCK
    finally:
        abc.__exit__(*"sys.exc_info()")  # Not exactly
These two expansions look very similar to me. What am I missing?
Pros: - can use a decorated generator as EXPR - separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)
Oh! Hmm. This is a bit subtle. I guess I should think about some examples.
Cons: - slightly less simple (__enter__ must return something for VAR; __exit__ takes optional args)
If things were fiddled such that sys.exc_info() return non-Nones when a finally clause is being executed because of an exception, we don't really need this wart, do we?
Everything else is equal or can be made equal. We can make them more equal by treating the arguments passed to __exit__() as a separate decision, and waffling about whether __enter__() should be optional (I think it's a bad idea even for PEP 310; it *could* be made optional for PEP 340 redux).
I don't really recall why it's optional in PEP 310.
Let's also not quibble about the keyword used; again, that can be a separate decision. Note that only PEP 310 can use the "VAR = EXPR" syntax; PEP 340 redux *must* use "EXPR as VAR" since it doesn't assign the value of EXPR to VAR; PEP 310 can be rewritten using this syntax as well.
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity?
Looking at my above code, no (even though I think I've rendered the point moot...). Compare and contrast:

    @template
    def redirected_stdout(out):
        save_stdout = sys.stdout
        sys.stdout = out
        yield None
        sys.stdout = save_stdout

    class redirected_stdout(object):
        def __init__(self, output):
            self.output = output
        def __enter__(self):
            self.save_stdout = sys.stdout
            sys.stdout = self.output
        def __exit__(self):
            sys.stdout = self.save_stdout

The former is shorter and contains less (well, no) 'self.'s, but I think I find the latter somewhat clearer.
The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct.
Nevertheless, I think I actually like this argument!
From Greg Ewing's response to the proposal to endow file objects with __enter__ and __exit__ methods, I believe he thinks so too.
Straight up-or-down votes in the full senate are appreciated at this point.
+1 for the PEP 340 variant.
On to the secondary questions:
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
No opinion.
- I have a more elaborate proposal for __exit__'s arguments. Let the translation be as follows:
    abc = EXPR
    [VAR =] abc.__enter__()
    oke = False  # Pronounced "okay"
    exc = ()
    try:
        try:
            BLOCK
            oke = True
        except:
            exc = sys.exc_info()
            raise
    finally:
        abc.__exit__(oke, *exc)
-"a bit"
PS. I've come up with another interesting use case: block signals for the duration of a block. This could be a function in the signal module, e.g. signal.blocking([ist of signals to block]). The list would default to all signals. Similar signal.ignoring().
First you need to hit the authors of various libcs with big sticks. Cheers, mwh -- <shapr> ucking keyoar -- from Twisted.Quotes

[Michael Hudson, after much thinking aloud]
Oh, I guess the point is that with a decorated generator you can yield a value to be used as VAR, rather than just discarding the value as here. Hmm.
Right. (I thought it was worth quoting this for the benefit of other who went down the same trail but didn't quite make it to this destination.)
If things were fiddled such that sys.exc_info() return non-Nones when a finally clause is being executed because of an exception, we don't really need this wart, do we?
The problem is that sys.exc_info() almost always returns *something* -- it's usually the last exception that was *ever* caught, except in certain circumstances. Phillip wrote on the same issue:
I'm not sure the extra argument is a good idea; doesn't this introduce the same sort of invisible control flow as swallowing exceptions? Also, since the block controller can't actually change the control flow, I'm having a hard time thinking of any actual use cases for this information.
The 'oke' argument is so that the author of transactional() can decide what to do with a non-local goto: commit, rollback or hit the author over the head with a big stick. [Michael again]
Compare and contrast:
    @template
    def redirected_stdout(out):
        save_stdout = sys.stdout
        sys.stdout = out
        yield None
        sys.stdout = save_stdout

    class redirected_stdout(object):
        def __init__(self, output):
            self.output = output
        def __enter__(self):
            self.save_stdout = sys.stdout
            sys.stdout = self.output
        def __exit__(self):
            sys.stdout = self.save_stdout
The former is shorter and contains less (well, no) 'self.'s, but I think I find the latter somewhat clearer.
Tastes differ. I think the generator wins; more so when there's more state to remember. [Michael quoting Guido]
The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct.
Nevertheless, I think I actually like this argument!
(Repeated for the benefit of others.)
Straight up-or-down votes in the full senate are appreciated at this point.
+1 for the PEP 340 variant.
-- --Guido van Rossum (home page: http://www.python.org/~guido/)

On 5/13/05, Guido van Rossum <gvanrossum@gmail.com> wrote:
Tastes differ. I think the generator wins; more so when there's more state to remember. [...]
Straight up-or-down votes in the full senate are appreciated at this point.
+1 for the PEP 340 variant.
I am also +1 for the PEP 340 variant. I can see the value in generators when state management starts to become more complex. No significant opinion on choice of keyword. I don't follow the subtleties for the more elaborate __exit__, so I'll pass on that one as well. Paul.

At 08:41 AM 5/13/2005 -0700, Guido van Rossum wrote:
The 'oke' argument is so that the author of transactional() can decide what to do with a non-local goto: commit, rollback or hit the author over the head with a big stick.
Since this is just a replacement for a try/except/finally block, I'd expect that in a transactional case a non-local goto would work the same as any other non-exception exit.

ISTM that the resource block use cases are:

* Save the current state of something, modify it, and then restore the old state once the block completes (try/finally, used for locking, redirection, signals, decimal context, etc.)

* Automatically roll back partially-done work in case of exception, and/or "roll forward" completed work (try/except/else, used for "transaction" scenarios)

* Release allocated resource(s) after use (try/finally, used to close files and suchlike)

None of these, AFAICT, benefit from differing behavior in the presence of nonlinear (but non-exceptional) control flow. It just seems too magical to me in the new context. When we were talking about a "block" construct or user-defined syntax, it made more sense because you could actually redefine the *meaning* of those constructs to some extent -- and because the *target* of the break and continue statements at least was the block itself, not some containing block.

[Guido]
The 'oke' argument is so that the author of transactional() can decide what to do with a non-local goto: commit, rollback or hit the author over the head with a big stick.
[Phillip J. Eby]
Since this is just a replacement for a try/except/finally block, I'd expect that in a transactional case a non-local goto would work the same as any other non-exception exit.
ISTM that the resource block use cases are:
* Save the current state of something, modify it, and then restore the old state once the block completes (try/finally, used for locking, redirection, signals, decimal context, etc.)
* Automatically roll back partially-done work in case of exception, and/or "roll forward" completed work (try/except/else, used for "transaction" scenarios)
* Release allocated resource(s) after use (try/finally, used to close files and suchlike)
None of these, AFAICT, benefit from differing behavior in the presence of nonlinear (but non-exceptional) control flow. It just seems too magical to me in the new context. When we were talking about a "block" construct or user-defined syntax, it made more sense because you could actually redefine the *meaning* of those constructs to some extent -- and because the *target* of the break and continue statements at least was the block itself, not some containing block.
That works for me; I was just hypothesizing about the needs of others, but personally I'm fine with not knowing. I guess part of my motivation is also that this information is readily available internally when a finally-clause is executed, since when the clause completes a pending non-local goto has to be resumed. But there's no reason to expose *all* internal state information... So the signature of __exit__ is just what sys.exc_info() returns. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Phillip J. Eby wrote:
> At 08:41 AM 5/13/2005 -0700, Guido van Rossum wrote:
>> The 'oke' argument is so that the author of transactional() can decide
>> what to do with a non-local goto: commit, rollback or hit the author
>> over the head with a big stick.
<snip>
> * Automatically roll back partially-done work in case of exception, and/or
> "roll forward" completed work (try/except/else, used for "transaction"
> scenarios)

Doing transactions with try/except/else is not quite correct, since using any of the three non-local goto's actually executes neither the commit nor the rollback (of course, this is where Guido's stick comment comes into play. . .). However, I'm fine with declaring that, from the perspective of a statement template, 'return', 'break' and 'continue' are all 'non-exceptional exits', and so templates like transaction() are expected to treat them as such.

Picking one way and enforcing it by restricting the information seen by __exit__() also seems to be a much better option than allowing the possibility of:

    do bobs.transaction():
        break  # Triggers a rollback!

    do alices.transaction():
        break  # Triggers a commit!

Going the 'non-exceptional exits' route also saves inventing a pseudo-exception to stand in for the 3 non-local goto statements (such a pseudo-exception would recreate the above behavioural hole, anyway). An exceptional exit can be forced if a non-local goto needs to be executed in response to a failure:

    class AbortDo(Exception): pass

    do alices.transaction():
        break  # Triggers a commit (non-exceptional exit)

    try:
        do alices.transaction():
            raise AbortDo  # Triggers a rollback (exceptional exit)
    except AbortDo:
        break

Cheers,
Nick.

--
Nick Coghlan   |   ncoghlan@gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

[Nick Coghlan]
However, I'm fine with declaring that, from the perspective of a statement template, 'return', 'break' and 'continue' are all 'non-exceptional exits', and so templates like transaction() are expected to treat them as such.
Me too. The argument that made me realize this is okay came after reading Raymond Chen's rant about control-flow macros: when you catch an exception, you don't know how much of the try-block was executed successfully, and neither does the author of that block; but when you "catch" a non-local goto, you must assume that the block's author knows what they are doing, so at least *they* know exactly which code was executed (everything up to the break/continue/return) and which wasn't. So the argument about rolling back indeterminate results doesn't hold. If they want the transaction to fail, they should raise an exception. I really need to start writing PEP 343 to capture this particular solution more carefully. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum <gvanrossum@gmail.com> writes:
[Michael Hudson, after much thinking aloud]
Yeah, sorry about that :)
Oh, I guess the point is that with a decorated generator you can yield a value to be used as VAR, rather than just discarding the value as here. Hmm.
Right. (I thought it was worth quoting this for the benefit of other who went down the same trail but didn't quite make it to this destination.)
If things were fiddled such that sys.exc_info() return non-Nones when a finally clause is being executed because of an exception, we don't really need this wart, do we?
The problem is that sys.exc_info() almost always returns *something* -- it's usually the last exception that was *ever* caught, except in certain circumstances.
Yeah, OK. I'll stop claiming to understand sys.exc_info() apart from the simple cases.
[Michael again]
Compare and contrast:
    @template
    def redirected_stdout(out):
        save_stdout = sys.stdout
        sys.stdout = out
        yield None
        sys.stdout = save_stdout

    class redirected_stdout(object):
        def __init__(self, output):
            self.output = output
        def __enter__(self):
            self.save_stdout = sys.stdout
            sys.stdout = self.output
        def __exit__(self):
            sys.stdout = self.save_stdout
The former is shorter and contains less (well, no) 'self.'s, but I think I find the latter somewhat clearer.
Tastes differ. I think the generator wins; more so when there's more state to remember.
Certainly when there's more state to manage, yes. But both will be possible, so, *shrug*. It's not a big deal.
[Michael quoting Guido]
The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct.
Nevertheless, I think I actually like this argument!
(Repeated for the benefit of others.)
I guess this means PEP 310 can be retracted.

Finally, from PEP 343 rev 1.7:

    exc = ()  # Or (None, None, None) ?

The latter, please.

Cheers,
mwh

--
  .  <- the point                                your article -> .
  |------------------------- a long way ------------------------|
                                       -- Christophe Rhodes, ucam.chat

Michael Hudson wrote:
Looking at my above code, no (even though I think I've rendered the point moot...). Compare and contrast:
    @template
    def redirected_stdout(out):
        save_stdout = sys.stdout
        sys.stdout = out
        yield None
        sys.stdout = save_stdout

    class redirected_stdout(object):
        def __init__(self, output):
            self.output = output
        def __enter__(self):
            self.save_stdout = sys.stdout
            sys.stdout = self.output
        def __exit__(self):
            sys.stdout = self.save_stdout
The former is shorter and contains less (well, no) 'self.'s, but I think I find the latter somewhat clearer.
the same argument could be used (and was probably used) against generators: why not just use __getitem__ and instance state? as soon as you write something longer than four lines, using more than one state variable, you'll find that generator-based code is a lot more readable. </F>

At 03:05 AM 5/13/2005 -0700, Guido van Rossum wrote:
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity?
Since the "do protocol" is now distinct from the iterator protocol, I don't believe a decorator is still required. The purpose of the decorator was to help reduce confusion between the block statement and a "for" loop. Since you can no longer swallow exceptions, there is no downside to using an existing generator as the target of a "do" statement. That is, the concern about generators catching StopIteration from a "yield" doesn't matter, as they will simply step through to their next yield statement, and then the original exception will propagate.
The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct. From Greg Ewing's response to the proposal to endow file objects with __enter__ and __exit__ methods, I believe he thinks so too.
Straight up-or-down votes in the full senate are appreciated at this point.
+1.
On to the secondary questions:
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
+1 on "do EXPR as VAR", where VAR may be any valid LHS of an assignment.
This means that __exit__ can be called with the following arguments:
abc.__exit__(True) - normal completion of BLOCK
abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)
abc.__exit__(False, t, v, tb) - BLOCK was left by an exception
(An alternative would be to always call it with 4 arguments, the last three being None in the first two cases.)
I'm not sure the extra argument is a good idea; doesn't this introduce the same sort of invisible control flow as swallowing exceptions? Also, since the block controller can't actually change the control flow, I'm having a hard time thinking of any actual use cases for this information.
If we adopt PEP 340 redux, it's up to the decorator for degenerate generators to decide how to pass this information into the generator; if we adopt PEP 342 ("continue EXPR") at the same time, we can let the yield-expression return a 4-tuple (oke, t, v, tb). Most templates can ignore this information (so they can just use a yield-statement).
I was going to propose having a generator-iterator's __exit__() raise the triple inside the generator, or raise StopIteration inside the generator if there is no triple. I'd ideally also like close() as a synonym for __exit__() with no arguments. Although these are properly the subject of PEPs 288 and 325 respectively, I felt this would elegantly bring them both to closure. However, after thinking it through, I realized that I don't see any obvious way to make __exit__ reusable for PEPs 288 and 325, because for 288 at least, I'd want __exit__ to either return the next yielded value or raise StopIteration. But, this isn't compatible with the "do protocol"'s needs, unless the "do protocol" suppressed StopIteration, and that doesn't seem like such a good idea. It seems to me that passing exceptions into a generator does in fact require a distinct method. But, I still do believe that generator-iterators can have predefined __enter__ and __exit__ methods without the need for a decorator.
PS. I've come up with another interesting use case: block signals for the duration of a block. This could be a function in the signal module, e.g. signal.blocking([ist of signals to block]). The list would default to all signals. Similar signal.ignoring().
Sweet. You could also use it for temporary signal handling, i.e. "set this signal handler for the duration of the block". Sort of the same class of "make sure I restore a global I'm tampering with" use case as redirecting stdout, and Tim Peters' Decimal context use cases.

Here's my vote on things at the moment:

+1 on do EXPR as VAR: ...

+1 on keeping the EXPR and VAR distinct.

+1 on keeping the do and generator protocols distinct.

+1 on not going out of our way to let the controller catch exceptions or alter control flow. Let's keep it as simple as we can.

-0.7 on directly giving generators do-protocol methods. I'm not yet convinced that encouraging people to use generators to implement block controllers is a good idea. If we blur the distinction too much at this stage, we may regret it later if we come up with a better idea. Also I don't see that people will be writing block controllers anywhere near as often as iterators, so writing classes for them isn't going to be a big chore. And people can always use a do-protocol-to-generator adaptor if they really want.

Greg

[Greg Ewing]
-0.7 on directly giving generators do-protocol methods.
I'm -1 on this myself.
I'm not yet convinced that encouraging people to use generators to implement block controllers is a good idea. If we blur the distinction too much at this stage, we may regret it later if we come up with a better idea. Also I don't see that people will be writing block controllers anywhere near as often as iterators, so writing classes for them isn't going to be a big chore. And people can always use a do-protocol-to-generator adaptor if they really want.
Right. I'm +0 on adding a standard module defining a do_template decorator that turns a degenerate generator into a do-statement controller. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

On 5/13/05, Guido van Rossum <gvanrossum@gmail.com> wrote:
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity?
+0. I'm not thoroughly convinced that generators are that much easier to read than a class. But I don't find them hard to read, and I think it would only take a little effort to learn that generators might not always be intended to build iterators.

If we do support generators in do-statements, I'd like their __enter__() and __exit__() methods (if possible) to have semantics like Nick Coghlan suggested[1], so that:

* __enter__() raises an exception if next() has already been called, and
* __exit__() raises an exception if StopIteration is not raised

The first makes sure that the generator is only used once, and the second makes sure that there is only one yield on the given control path through the generator. In all but the most sick and twisted code, raising exceptions like this will be identifying errors in how the generator was written.
Straight up-or-down votes in the full senate are appreciated at this point.
+1 on the PEP 340 redux semantics.
On to the secondary questions:
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
+1 on using 'do'.
- I have a more elaborate proposal for __exit__'s arguments. Let the translation be as follows: [snip] abc.__exit__(True) - normal completion of BLOCK
abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)
abc.__exit__(False, t, v, tb) - BLOCK was left by an exception
-1. This looks like a fair bit of added complexity for not much gain. The only example that even *might* make use of this was the transactional one, and I haven't yet seen a use case where it actually *is*. The simpler semantics give you the difference between a normal exit and an exceptional exit. I'd like to see an example that needs to know the difference between block completion exit and a break/continue/return exit before I'd want to make PEP 340 redux this much more complex. STeVe [1] http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

[Steven Bethard]
+0. I'm not thoroughly convinced that generators are that much easier to read than a class. But I don't find them hard to read, and I think it would only take a little effort to learn that generators might not always be intended to build iterators.
I am proposing (like Phillip Eby in his response to PEP 340) to use a special decorator that turns a generator into a "do-template", so the intention is evident from the generator declaration.
If we do support generators in do-statements, I'd like their __enter__() and __exit__() methods (if possible) to have semantics like Nick Coghlan suggested[1], so that:
* __enter__() raises an exception if next() has already been called, and
* __exit__() raises an exception if StopIteration is not raised
I guess you missed my post where I gave the code for the decorator; it does exactly that.
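(For reference, a decorator of that general shape -- an illustrative sketch rather than the exact code from that post, and the choice of exception type here is arbitrary -- looks roughly like this:)

    class _do_template_wrapper(object):
        """Wrap a one-yield generator in the do-statement protocol."""
        def __init__(self, gen):
            self.gen = gen
            self.started = False
        def __enter__(self):
            if self.started:
                raise RuntimeError("template generator already used")
            self.started = True
            try:
                return self.gen.next()   # run up to the yield
            except StopIteration:
                raise RuntimeError("template generator didn't yield")
        def __exit__(self, *exc_info):
            try:
                self.gen.next()          # finish the generator
            except StopIteration:
                return
            raise RuntimeError("template generator yielded more than once")

    def do_template(func):
        def helper(*args, **kwds):
            return _do_template_wrapper(func(*args, **kwds))
        return helper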
The simpler semantics give you the difference between a normal exit and an exceptional exit. I'd like to see an example that needs to know the difference between block completion exit and a break/continue/return exit before I'd want to make PEP 340 redux this much more complex.
I agreed to that in my response to Phillip Eby. I do want to pass the exception into __exit__ so that it can be logged, for example. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
I just read Raymond Chen's rant against control flow macros: http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx
I think this pretty much kills PEP 340, as well as Nick Coghlan's alternative: both proposals let you write a "template" that can be used to hide exception-catching code, which is a form of control flow (and a pretty important one if you read Chen's rant against exceptions referenced by the former, even if you don't agree with everything he says in the latter).
It seems the effect of Raymond's articles on you was similar to their effect on me :)
Straight up-or-down votes in the full senate are appreciated at this point.
PEP 340 redux for me (as you might have guessed) - I think the transaction() use case is a genuinely useful one. The ability to access the return value of __enter__() is also more useful than simply duplicating what could be achieved by an assignment on the line before the user defined statement.
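(To make that concrete, the sort of template I have in mind is roughly the following untested sketch, written against the more elaborate __exit__ signature quoted below; the db object and its begin/commit/rollback methods are just placeholders:)

    class transaction:
        def __init__(self, db):
            self.db = db
        def __enter__(self):
            self.db.begin()
            return self.db          # this is what an 'as VAR' clause would see
        def __exit__(self, completed, *exc_info):
            if completed:
                self.db.commit()    # the block finished normally
            else:
                # early exit or exception: roll back; any exception
                # still propagates after __exit__ returns
                self.db.rollback()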
On to the secondary questions:
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
I think 'do' can be made to read correctly in more contexts than 'with'. The lack of a corresponding 'while' or 'until' should eliminate any temptation to see it as a loop. The 'with' keyword also means I keep wanting the magic methods to be called "__acquire__" and "__release__" (and those would be harder to type. . .)
- I have a more elaborate proposal for __exit__'s arguments. Let the translation be as follows:
I plan to rewrite my proposal based on this suggestion, just to explore the ramifications. I think it will turn out quite nicely. The ban on yielding inside try/finally will need to be extended to yielding inside user defined statements until such time as an iterator finalisation protocol is chosen, though.
(An alternative would be to always call it with 4 arguments, the last three being None in the first two cases.)
The former is probably tidier. __exit__() method implementations which don't care about the exception details can still use "*exc_info" in the argument signature, while those that want to use the information can just name the three parts without needing to specify the "=None" defaults. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

[Nick Coghlan]
The ban on yielding inside try/finally will need to be extended to yielding inside user defined statements until such time as an iterator finalisation protocol is chosen, though.
Ah! Good point. This breaks PEP 340 example 5. No big deal, but worth noting. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
PEP 340 redux
=============
Syntax: do EXPR [as VAR]: BLOCK
Translation:

    abc = EXPR
    [VAR =] abc.__enter__()
    try:
        BLOCK
    finally:
        abc.__exit__(*"sys.exc_info()")  # Not exactly
Pros:
- can use a decorated generator as EXPR
- separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)
Cons:
- slightly less simple (__enter__ must return something for VAR; __exit__ takes optional args)
what happened to the original "yield the target object" solution? or did I just dream that? </F>

[Guido van Rossum]
Cons: - slightly less simple (__enter__ must return something for VAR; __exit__ takes optional args)
[Fredrik Lundh]
what happened to the original "yield the target object" solution? or did I just dream that?
Don't worry, that works when you use a generator. It just doesn't work when you're using a class. The do-statement proposal is a bit ambiguous: on the one hand it's not strongly tied to generators, since you can easily write a class with __enter__ and __exit__ methods; on the other hand its essential difference from PEP 310 is that you *can* use a generator, given a suitable decorator. BTW, we need a name for such a decorated generator that only yields once. I propose to call it a degenerator. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
BTW, we need a name for such a decorated generator that only yields once. I propose to call it a degenerator.
Cute, but 'template generator' may be clearer (that's what I've been calling them so far, anyway). Then 'iterator generator' can be used to explicitly refer to generators intended for use in for loops. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Guido van Rossum wrote: [SNIP]
Straight up-or-down votes in the full senate are appreciated at this point.
PEP 340 redux gets my +1; I think using generators will become more obviously useful to people when, as Fredrick pointed out, your code grows more than a few lines long.
On to the secondary questions:
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
-0; I can deal with either, but I just like how 'with' reads more.
- I have a more elaborate proposal for __exit__'s arguments. Let the translation be as follows:
    abc = EXPR
    [VAR =] abc.__enter__()
    oke = False   # Pronounced "okay"
    exc = ()
    try:
        try:
            BLOCK
            oke = True
        except:
            exc = sys.exc_info()
            raise
    finally:
        abc.__exit__(oke, *exc)
This means that __exit__ can be called with the following arguments:
abc.__exit__(True) - normal completion of BLOCK
abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)
abc.__exit__(False, t, v, tb) - BLOCK was left by an exception
(An alternative would be to always call it with 4 arguments, the last three being None in the first two cases.)
If we adopt PEP 340 redux, it's up to the decorator for degenerate generators to decide how to pass this information into the generator; if we adopt PEP 342 ("continue EXPR") at the same time, we can let the yield-expression return a 4-tuple (oke, t, v, tb). Most templates can ignore this information (so they can just use a yield-statement).
I think a later email discussed just passing in the values from sys.exc_info(), and I like that more since it will be None if no exception was raised and thus straight-forward to detect without being overly verbose with the oke argument. -Brett

+1 PEP 340 redux (although I marginally prefer the "with" keyword) Guido van Rossum wrote:
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity? The added complexity is caused by the need to separate VAR from EXPR so that a generator can be used. I personally like this separation; I actually like that the "anonymous block controller" is logically separate from the variable bound by the construct. From Greg Ewing's response to the proposal to endow file objects with __enter__ and __exit__ methods, I believe he thinks so too.
+1 to support template generators. I think it allows for more flexibility on the class implementations as well, even if most of them return self.
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
As a former pascal/delphi guy, I wasn't really confused. ;) +1 with +0 do
- I have a more elaborate proposal for __exit__'s arguments. Let the translation be as follows:
+1 -- as long as I can get information about how the block exited, I'm happy.
If we adopt PEP 340 redux, it's up to the decorator for degenerate generators to decide how to pass this information into the generator; if we adopt PEP 342 ("continue EXPR") at the same time, we can let the yield-expression return a 4-tuple (oke, t, v, tb). Most templates can ignore this information (so they can just use a yield-statement).
+1 makes sense to me...

Guido van Rossum wrote:
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity?
I can't see how this has anything to do with whether a generator is used or not. Keeping them separate seems to be a useful thing in its own right. Greg

[Guido van Rossum]
So then the all-important question I want to pose is: do we like the idea of using a (degenerate, decorated) generator as a "template" for the do-statement enough to accept the slightly increased complexity?
[Greg Ewing]
I can't see how this has anything to do with whether a generator is used or not. Keeping them separate seems to be a useful thing in its own right.
Assuming by "them" you mean the value of EXPR and the value assigned to VAR, I don't care how this conclusion is reached, as long as their separation is seen as a useful thing. :-) I came up with the idea of making them separate when I tried to figure out how to decorate a generator to drive a PEP-310-style with-statement, and found I couldn't do it for the opening() example. (Michael Hudson illustrated this nicely in his reply in this thread. :-) But it's fine if the separation is considered generally useful even without thinking of generators. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

On Fri, 13 May 2005, Guido van Rossum wrote:
Straight up-or-down votes in the full senate are appreciated at this point.
I prefer the "PEP 340 redux" version. Both the flexibility for __enter__ to return a separate object and the ability for __exit__ to react to exceptions are useful.
- Today I like the 'do' keyword better; 'with' might confuse folks coming from Pascal or VB
I prefer 'with'. The 'do' keyword sounds "loopy" and doesn't make grammatical sense. -- ?!ng

[Guido]
Going for all-out simplicity, I would like to be able to write these examples:
    class locking:
        def __init__(self, lock): self.lock = lock
        def __enter__(self): self.lock.acquire()
        def __exit__(self, *args): self.lock.release()

    class opening:
        def __init__(self, filename): self.filename = filename
        def __enter__(self): self.f = open(self.filename); return self.f
        def __exit__(self, *args): self.f.close()

And do EXPR as VAR: BLOCK would mentally be translated into

    itr = EXPR
    VAR = itr.__enter__()
    try:
        BLOCK
    finally:
        itr.__exit__(*sys.exc_info())  # Except sys.exc_info() isn't defined by finally
Yeah, that's what I wanted to do too. That should be about what my second suggestion did. Slightly updated, it looks like:

    stmt = EXPR1
    VAR1 = stmt.__enter__()
    exc = ()  # or (None, None, None) if you prefer
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        if stmt.__exit__(*exc) is not None:
            raise exc[0], exc[1], exc[2]

The only difference should be that with the above semantics if you return a (non-None) value from __exit__, the exception will be suppressed (that is, it will not be reraised). Means that if you want to suppress an exception, you have to add a return statement (but if you want exceptions to be reraised, you don't have to do anything.) I suggest this only because there were a few suggested use-cases for suppressing exceptions. OTOH, almost all my uses are just try/finally's so I'm certainly not going to cry if that last finally instead looks like:

    finally:
        stmt.__exit__(*exc)
        raise exc[0], exc[1], exc[2]

=)

STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

Guido van Rossum wrote:
    class locking:
        def __init__(self, lock): self.lock = lock
        def __enter__(self): self.lock.acquire()
        def __exit__(self, *args): self.lock.release()

    class opening:
        def __init__(self, filename): self.filename = filename
        def __enter__(self): self.f = open(self.filename); return self.f
        def __exit__(self, *args): self.f.close()

And do EXPR as VAR: BLOCK would mentally be translated into

    itr = EXPR
    VAR = itr.__enter__()
    try:
        BLOCK
    finally:
        itr.__exit__(*sys.exc_info())  # Except sys.exc_info() isn't defined by finally
In this example locking's __enter__ does not return anything. Would do EXPR: BLOCK also be legal syntax? -eric

On Wed, 11 May 2005, Guido van Rossum wrote:
[Steven Bethard]
    exc = ()
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)
would this make any of the examples impossible to write? All you have to do to suppress an exception is to not reraise it in __exit__.
But this use case would contain a trap for the unwary user who is writing an __exit__ method -- they have to remember to re-raise an exception if it was passed in, but that's easy to forget (and slightly tricky since you have to check the arg count or whether the first argument is not None).
Then wouldn't it be simplest to separate normal exit from exceptional exit? That is, split __exit__ into __except__ and __finally__. If __except__ is defined, then it handles the exception, otherwise the exception is raised normally.
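(Under that split, a template that only needs cleanup -- like the locking example quoted below -- would simply define __finally__, and only templates that actually want to intercept exceptions would add an __except__ method. A rough sketch of what I mean; the method names follow the proposal above and nothing here is settled:)

    class locking:
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
        def __finally__(self):
            # always runs, and never needs to look at exception state
            self.lock.release()

    # an exception-aware template would add something like:
    #     def __except__(self, etype, evalue, etb):
    #         ...handle or re-raise...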
    class locking:
        def __init__(self, lock): self.lock = lock
        def __enter__(self): self.lock.acquire()
        def __exit__(self, *args): self.lock.release()
Having __exit__ take varargs is a signal to me that it mashes together what really are two different methods. -- ?!ng

Guido van Rossum wrote:
Going for all-out simplicity, I would like to be able to write these examples:
    class locking:
        def __init__(self, lock): self.lock = lock
        def __enter__(self): self.lock.acquire()
        def __exit__(self, *args): self.lock.release()

    class opening:
        def __init__(self, filename): self.filename = filename
        def __enter__(self): self.f = open(self.filename); return self.f
        def __exit__(self, *args): self.f.close()

And do EXPR as VAR: BLOCK would mentally be translated into

    itr = EXPR
    VAR = itr.__enter__()
    try:
        BLOCK
    finally:
        itr.__exit__(*sys.exc_info())  # Except sys.exc_info() isn't defined by finally
If it's this simple, it should be possible to write something that combines the acquisition of multiple resources in a single statement. For example:

    with combining(opening(src_fn), opening(dst_fn, 'w')) as src, dst:
        copy(src, dst)

I think the following class would do it.

    class combining:
        def __init__(self, *resources):
            self.resources = resources
            self.entered = 0
        def __enter__(self):
            results = []
            try:
                for r in self.resources:
                    results.append(r.__enter__())
                    self.entered += 1
                return results
            except:
                # exit resources before re-raising the exception
                self.__exit__()
                raise
        def __exit__(self, *args):
            last_exc = None
            # exit only the resources successfully entered
            to_exit = self.resources[:self.entered]
            while to_exit:
                r = to_exit.pop()
                try:
                    r.__exit__(*args)
                except:
                    # re-raise the exception after exiting the others
                    last_exc = sys.exc_info()
            if last_exc is not None:
                raise last_exc[0], last_exc[1], last_exc[2]

Would that work?

Shane

[Shane Hathaway]
If it's this simple, it should be possible to write something that combines the acquisition of multiple resources in a single statement. For example:
with combining(opening(src_fn), opening(dst_fn, 'w')) as src, dst: copy(src, dst)
Yeah (and I don't see anything wrong with your implementation of combining either), but even if that existed I think I'd prefer to just write

    with opening(src_fn) as src:
        with opening(dst_fn) as dst:
            copy(src, dst)

See Ma, no magic! :-) -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Steven Bethard wrote:
On 5/11/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
The gist is that the alternative is to require an __exit__() method to raise TerminateBlock in order to suppress an exception.
So I didn't see any examples that really needed TerminateBlock to suppress an exception.
Yeah, I figured out a tidier way to handle it after reading Phillip's message earlier today. My idea is similar to your second solution, but with an early exit via break, continue or return still indicated to the __exit__() method via TerminateBlock so that examples like transaction() continue to do the right thing:

    the_stmt = EXPR1
    stmt_enter = getattr(the_stmt, "__enter__", None)
    stmt_exit = getattr(the_stmt, "__exit__", None)
    if stmt_enter is None or stmt_exit is None:
        raise TypeError("User defined statement template required")
    terminate = True
    VAR1 = stmt_enter()   # Omit 'VAR1 =' if no 'as' clause
    try:
        try:
            BLOCK1
        except TerminateBlock:
            raise   # Disallow suppression of TerminateBlock
        except:
            terminate = False
            if not stmt_exit(*sys.exc_info()):
                raise
        else:
            terminate = False
            stmt_exit()
    finally:
        if terminate:
            try:
                stmt_exit(TerminateBlock, None, None)
            except TerminateBlock:
                pass

Version 1.5 uses these updated semantics, and the suggested generator __exit__() method semantics are adjusted appropriately. I've also added a paragraph in Open Issues about removing the ability to suppress exceptions as Guido has suggested. However, I'm hoping his objections are based on the assorted horrible mechanisms I used in versions before this one - he is quite right that forcing every __exit__() method to reraise exceptions was a rather ugly wart. The new version also fixes a typo in the auto_retry example that a couple of people pointed out, and adds a non-exception related example from Arnold deVos. The URL is the same as before: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

On 5/12/05, Nick Coghlan <ncoghlan@gmail.com> wrote:
Yeah, I figured out a tidier way to handle it after reading Phillip's message earlier today. My idea is similar to your second solution, but with an early exit via break, continue or return still indicated to the __exit__() method via TerminateBlock so that examples like transaction() continue to do the right thing:
Do they? I don't write enough transactional code, but I would have thought that break, continue or return would have been a normal, expected exit from the do-statement and therefore should do a db.commit(), not a db.rollback(). Do you think you could add an example of how the transaction do-statement would be used in such a way that these would be the desired semantics? Thanks, STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

A quick response to various small issues...

- Benji York proposes that file and lock objects (for example) could have suitable __enter__ and __exit__ methods (__enter__ would have to return self). Right!

- Greg Ewing (I believe) wants 'do' instead of 'with' for the keyword. I think I like 'with' better, especially combining it with Benji's proposal. IMO this reads better with 'with' than with 'do':

    with open("/etc/passwd") as f:
        for line in f:
            ...

- Steve Bethard has this example:

    stmt = EXPR1
    VAR1 = stmt.__enter__()
    exc = ()  # or (None, None, None) if you prefer
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        if stmt.__exit__(*exc) is not None:
            raise exc[0], exc[1], exc[2]

  but he seems to forget that finally *always* re-raises the exception. Anyway, I still don't care for the use case; rather than fixing the coding bug, your time would be better spent arguing why this functionality can't be missed.

- Eric Nieuwland asks if the VAR is still optional. Yes, it is (this is implicit in the entire set of threads).

- Paul Moore wants the generator templates to explicitly contain try/finally (or try/except, depending on the use case). That's much more work though (passing exceptions into a generator is a new feature) and is not necessary to get the "redux" version.

- Ka-ping Yee thinks we need separate entry points for the exceptional and the normal termination case. I disagree; this would end up in unnecessary duplication of code (or boilerplate to equate the two methods) in most cases. The whole *point* is that finally gets to do its clean-up act regardless of whether an exception is being processed or not. The varargs signature to __exit__ was just me being lazy instead of typing

    def __exit__(self, t=None, v=None, tb=None): ...

- Nick is still peddling his much more complicated variant. I recommend that he focuses on arguing use cases rather than semantic subtleties, or else it won't get any traction (at least not with me :-).

--Guido van Rossum (home page: http://www.python.org/~guido/)

On 5/12/05, Guido van Rossum <gvanrossum@gmail.com> wrote:
- Steve Bethard has this example:
    stmt = EXPR1
    VAR1 = stmt.__enter__()
    exc = ()  # or (None, None, None) if you prefer
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        if stmt.__exit__(*exc) is not None:
            raise exc[0], exc[1], exc[2]
but he seems to forget that finally *always* re-raises the exception.
Not if except catches it:

    py> try:
    ...     try:
    ...         raise Exception
    ...     except:
    ...         exc = sys.exc_info()
    ... finally:
    ...     pass
    ...
    py>

As I understand it, finally only re-raises the exception if it wasn't already caught. And I have to use an except block above to catch it so that sys.exc_info() returns something other than (None, None, None).
Anyway, I still don't care for the use case; rather than fixing the coding bug, your time would be better spent arguing why this functionality can't be missed.
Sorry, I'm unclear. Do you not want sys.exc_info() passed to __exit__, or do you not want the with/do-statement to be allowed to suppress exceptions? Or both? My feeling is that the mechanism for suppressing exceptions is confusing, and it would probably be better to always reraise the exception. (I included it mainly to provide a middle ground between Guido's and Nick's proposals.) That is, I prefer something like:

    stmt = EXPR1
    VAR1 = stmt.__enter__()
    exc = ()  # or (None, None, None) if you prefer
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt.__exit__(*exc)
        raise exc[0], exc[1], exc[2]

which should really read the same as Guido's suggestion:

    stmt = EXPR1
    VAR1 = stmt.__enter__()
    try:
        BLOCK1
    finally:
        stmt.__exit__(*sys.exc_info())

except that since sys.exc_info() returns (None, None, None) when there wasn't an except block, this won't actually work. If we don't provide some flag that an exception occurred, the transaction example doesn't work. My feeling is that if we're going to provide any flag, we might as well provide the entire sys.exc_info().

STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy

At 07:50 AM 5/12/2005 -0700, Guido van Rossum wrote:
- Paul Moore wants the generator templates to explicitly contain try/finally (or try/except, depending on the use case). That's much more work though (passing exceptions into a generator is a new feature) and is not necessary to get the "redux" version.
Uh oh. Speaking on behalf of the "coroutine-y people" :), does this mean that we're not going to get the ability to pass exceptions into generators?

[Guido van Rossum]
- Paul Moore wants the generator templates to explicitly contain try/finally (or try/except, depending on the use case). That's much more work though (passing exceptions into a generator is a new feature) and is not necessary to get the "redux" version.
[Phillip J. Eby]
Uh oh. Speaking on behalf of the "coroutine-y people" :), does this mean that we're not going to get the ability to pass exceptions into generators?
That would be a separate proposal (PEP 288 or PEP 325). The do/with statement with its __enter__ and __exit__ APIs is entirely independent from generators. Having shown a few non-generator do/with wrappers I'm not so sure there's a lot of need for generators here; especially since generators will always have that round-peg-square-hole feeling when they're used in a non-looping context. But I still want to leave the door open for using generators as do/with-templates, hence my modification of PEP 310 (otherwise we could just accept PEP 310 as is and be done with it). For coroutines, see PEP 342. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

- Ka-ping Yee thinks we need separate entry points for the exceptional and the normal termination case. I disagree; this would end up in unnecessary duplication of code (or boilerplate to equate the two methods) in most cases. The whole *point* is that finally gets to do its clean-up act regardless of whether an exception is being processed or not.
Okay, let me back up for a second. My suggestion responded to your reply to Steve Bethard's example about exception handling. The point of the suggestion is that *if* we are going to let "with" do exception handling, it should be done in a separate method. I didn't mean to imply that __finally__ should be skipped. This brings us back to the question of whether "with" should be able to handle exceptions. On this, you wrote:
For try/finally we have a large body of use cases that just scream for abstraction. I'm not convinced that we have the same for try/except.
So let's look at some use cases. I've thought of two; the first one is nice and simple, and the second one is messier so i'll discuss it in a separate message.

Example 1: Exception Chaining.

As has been previously discussed, the information from an exception can be lost when the handling of the exception runs into a problem. It is often helpful to preserve the original reason for the problem. Suppose, by convention, that the "reason" attribute on exception objects is designated for this purpose. The assignment of this attribute can be conveniently abstracted using a "with" clause as follows:

    try:
        # ... risky operation ...
    except:
        with reason(sys.exc_info()):
            # ... cleanup ...

The "with reason" construct would be implemented like this:

    class reason:
        def __init__(self, etype, evalue, etb):
            self.reason = etype, evalue, etb
        def __except__(self, etype, evalue, etb):
            evalue.reason = self.reason
            raise etype, evalue, etb

(Other possible names for "reason" might be "cause" or "context".)

-- ?!ng

[Ka-Ping Yee]
Example 1: Exception Chaining.
As has been previously discussed, the information from an exception can be lost when the handling of the exception runs into a problem. It is often helpful to preserve the original reason for the problem. [example deleted]
This problem is universal -- every except clause (in theory) can have this problem. I'd much rather deal with this in a systematic way in the Python VM's exception handling machinery. Modifying every potentially affected except clause to use some additional boilerplate doesn't strike me as a good solution. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

On Thu, 12 May 2005, Guido van Rossum wrote:
[Ka-Ping Yee]
Example 1: Exception Chaining.
As has been previously discussed, the information from an exception can be lost when the handling of the exception runs into a problem. It is often helpful to preserve the original reason for the problem. [example deleted]
This problem is universal -- every except clause (in theory) can have this problem. I'd much rather deal with this in a systematic way in the Python VM's exception handling machinery.
That's reasonable. Unless another use case comes up, i withdraw my suggestion for a separate __except__ method. I hope the examples were interesting, anyhow. -- ?!ng

Here's another use case to think about.

Example 2: Replacing a File.

Suppose we want to reliably replace a file. We require that either: (a) The file is completely replaced with the new contents; or (b) the filesystem is unchanged and a meaningful exception is thrown. We'd like to be able to write this conveniently as:

    with replace(filename) as file:
        ...
        file.write(spam)
        ...
        file.write(eggs)
        ...

To make sure the file is never only partially written, we rely on the filesystem to rename files atomically, so the basic steps (without error handling) look like this:

    tempname = filename + '.tmp'
    file = open(tempname, 'w')
    ...
    file.write(spam)
    ...
    file.close()
    os.rename(tempname, filename)

We would like to make sure the temporary file is cleaned up and no filehandles are left open. Here's my analysis of all the execution paths we need to cover:

    1. +open +write +close +rename
    2. +open +write +close -rename ?remove
    3. +open +write -close ?remove
    4. +open -write +close ?remove
    5. +open -write -close ?remove
    6. -open

(In this list, + means success, - means failure, ? means don't care.) When i add error handling, i get this:

    tempname = filename + '.tmp'
    file = open(tempname, 'w')  # okay to let exceptions escape
    problem = None
    try:
        try:
            ...
            file.write(spam)
            ...
        except:
            problem = sys.exc_info()
            raise problem
    finally:
        try:
            file.close()
        except Exception, exc:
            problem, problem.reason = exc, problem
        if not problem:
            try:
                os.rename(tempname, filename)
            except Exception, exc:
                problem, problem.reason = exc, problem
        if problem:
            try:
                os.remove(tempname)
            except Exception, exc:
                problem, problem.reason = exc, problem
            raise problem

In this case, the implementation of replace() doesn't require a separate __except__ method:

    class replace:
        def __init__(self, filename):
            self.filename = filename
            self.tempname = '%s.%d.%d' % (self.filename, os.getpid(), id(self))
        def __enter__(self):
            self.file = open(self.tempname, 'w')
            return self
        def write(self, data):
            self.file.write(data)
        def __exit__(self, *problem):
            try:
                self.file.close()
            except Exception, exc:
                problem, problem.reason = exc, problem
            if not problem:
                # commit
                try:
                    os.rename(tempname, filename)
                except Exception, exc:
                    problem, problem.reason = exc, problem
            if problem:
                # rollback
                try:
                    os.remove(tempname)
                except Exception, exc:
                    problem, problem.reason = exc, problem
                raise problem

This is all so intricate i'm not sure if i got it right. Somebody let me know if this looks right or not. (In any case, i look forward to the day when i can rely on someone else to get it right, and they only have to write it once!)

-- ?!ng

At 03:00 PM 5/12/2005 -0500, Ka-Ping Yee wrote:
This is all so intricate i'm not sure if i got it right. Somebody let me know if this looks right or not. (In any case, i look forward to the day when i can rely on someone else to get it right, and they only have to write it once!)
It looks fine, but it's not a use case for suppressing exceptions, nor was the exception-chaining example. Really, the only use case for suppressing exceptions is to, well, suppress exceptions that are being logged, shown to the user, sent via email, or just plain ignored. Guido's point, I think, is that these use cases are rare enough (yet simple and similar enough) that they don't deserve support from the cleanup facility, and instead should use a try/except block. After reviewing the cases in my own code where I might've used a 'with logged_exceptions()' or similar blocks, I think I now agree. The difference between: try: BLOCK except: logger.exception(...) and: with log_errors(logger): BLOCK doesn't seem worth the effort, especially since this pattern just doesn't occur that often, compared to resource-using blocks. What *your* use cases seem to illustrate, however, is that it's quite possible that an __exit__ might well need to contain complex error handling of its own, including the need to throw a different exception than the one that was passed in.

Phillip J. Eby wrote:
Really, the only use case for suppressing exceptions is to, well, suppress exceptions that are being logged, shown to the user, sent via email, or just plain ignored. Guido's point, I think, is that these use cases are rare enough (yet simple and similar enough) that they don't deserve support from the cleanup facility, and instead should use a try/except block.
Particularly since the *action* can be factored out into a do statement - it's only the *suppression* that has to be reproduced inline. That is:

    try:
        do standard_reaction():
            pass
    except MyException:
        pass
After reviewing the cases in my own code where I might've used a 'with logged_exceptions()' or similar blocks, I think I now agree.
I think I'm convinced, too. The only actual use case in the PEP is auto_retry, and that can be more obviously written with a try/except block inside the loop:

    for retry in reversed(xrange(num_attempts)):
        try:
            make_attempt()
            break
        except IOError:
            if not retry:
                raise

Not as pretty perhaps, but the control flow is far easier to see. Steven has also convinced me that break, continue and return should look like normal exits rather than exceptions. This should bring my next PEP draft back to something resembling Guido's option 3 - the __exit__() method still gets passed the contents of sys.exc_info(), it just can't do anything about it other than raise a different exception. Cheers, Nick. P.S. The points regarding non-local flow control in Joel Spolsky's latest Joel on Software article (especially the links at the end) may have had something to do with my change of heart. . . -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Guido van Rossum wrote:
- Greg Ewing (I believe) wants 'do' instead of 'with' for the keyword. I think I like 'with' better, especially combining it with Benji's proposal. IMO this reads better with 'with' than with 'do':
with open("/etc/passwd") as f: for line in f: ...
I don't think I like the idea of giving the file object itself __enter__ and __exit__ methods, because it doesn't ensure that the opening and closing are done as a pair. It would permit the following kind of mistake:

    f = open("somefile")
    with f:
        do_something()
    with f:
        do_something_else()

which our proposed construct, if it is any good, should be able to prevent. Also I don't at all agree that "with open(...)" reads better; on the contrary, it seems ungrammatical. Especially when compared with the very beautiful "do opening(...)", which I would be disappointed to give up. I still also have reservations about "with" on the grounds that we're making it mean something very different to what it means in most other languages that have a "with". -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

Paul Moore wrote:
PS Apologies if I missed the discussion of this in the PEP - as I say, I've only skimmed it so far.
With Guido's latest comments, it looks like this is going to be going into the "Open Issues" section - his current inclination is that do statements should only abstract finally clauses, not arbitrary exception handling. I believe he's misinterpreting the cause of the pushback against PEP 340 (I think it was the looping that was found objectionable, not the ability to suppress exceptions), but *shrug* :) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

[Nick Coghlan]
With Guido's latest comments, it looks like this is going to be going into the "Open Issues" section - his current inclination is that do statements should only abstract finally clauses, not arbitrary exception handling. I believe he's misinterpreting the cause of the pushback against PEP 340 (I think it was the looping that was found objectionable, not the ability to suppress exceptions), but *shrug* :)
I realize that the pushback was against looping, but whereas in the PEP 340 proposal general exception handling comes out naturally, it feels as an ugly wart in the modified PEP 310 proposal. Plus I think the use cases are much weaker (even PEP 340 doesn't have many use cases for exception handling -- the only one is the auto-retry example). -- --Guido van Rossum (home page: http://www.python.org/~guido/)

On 5/11/05, Guido van Rossum <gvanrossum@gmail.com> wrote:
I realize that the pushback was against looping, but whereas in the PEP 340 proposal general exception handling comes out naturally, it feels as an ugly wart in the modified PEP 310 proposal.
Plus I think the use cases are much weaker (even PEP 340 doesn't have many use cases for exception handling -- the only one is the auto-retry example).
Accepted, but I still feel that the templates should explicitly include the try...finally, rather than simply having the yield mark the split. The examples seem more readable that way. Explicit is better than implicit, and all that... Paul.

Guido van Rossum wrote:
I used to dislike this, but the opposition and the proliferation of alternative proposals have made me realize that I'd rather have this (plus "continue EXPR" which will be moved to a separate PEP once I have some extra time) than most of the other proposals.
Draft 1.3 of my PEP 310/PEP 340 merger is now up for public consumption: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html This is a major rewrite since version 1.2. Significant changes are:
- reorder and reword things to emphasise the user defined statements, and their ability to factor out arbitrary try/except/else/finally boilerplate.
- use 'do' as the keyword instead of 'stmt' (I have to say, I *really* like the way 'do' looks and reads in all of the examples)
- clean up the semantics of user defined statements so as to make manual statement templates as easy to write as those in PEP 310
- give iterator finalisation its own slot, __finish__() (distinct from the __exit__() of statement templates)
- define sensible automatic finalisation semantics for iterative loops
- fill out the Rejected Options section meaningfully, with justifications for rejecting certain options
- makes the PEP more self-contained, with significantly fewer references to PEP 340.
These changes try to take into account the feedback I got on the previous drafts, as well as fixing a few problems I noticed myself. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Nick Coghlan wrote:
Draft 1.3 of my PEP 310/PEP 340 merger is now up for public consumption: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
This draft was meant to drop the idea of __enter__() raising TerminateBlock preventing execution of the statement body. I dropped it out of the code describing the semantics, but the idea is still mentioned in the text. I'll probably do another draft to fix that, and various ReST problems tomorrow night. I'll also add in a justification for why I chose the single __exit__ method over separate __else__, __except__ and __finally__ methods. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.blogspot.com

Guido van Rossum wrote:
Now it would make more sense to change the syntax to
with EXPR as VAR: BLOCK
and we have Phillip Eby's proposal.
Change the 'with' to 'do' and I'd be +1 on this. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing@canterbury.ac.nz +--------------------------------------+

[Guido]
Now it would make more sense to change the syntax to
with EXPR as VAR: BLOCK
and we have Phillip Eby's proposal.
[Greg]
Change the 'with' to 'do' and I'd be +1 on this.
Great! But the devil is in the details. I want to reduce the complexity, and I'm willing to reduce the functionality somewhat. It would help if you could support this. In particular, while I like the use case transactional() (example 3 in PEP 340) enough to want some indicator of success or failure, I don't see the point of having separate __except__, __else__ and __finally__ entry points. It's unclear how these would be mapped to the generator API, and whether more than one could be called e.g. what if __else__ raises an exception itself -- will __finally__ be called? I'm sure that could be specified rigorously, but I'm not so sure that it is going to be useful and clear. I see several possible levels of information that could be passed to the __exit__ call:
(1) None. This is insufficient for the transactional() use case.
(2) Only a bool indicating success or failure. This is sufficient for the transactional() use case.
(3) Full exception information, with the understanding that when __exit__() returns normally, exception processing will resume as usual (i.e. __exit__() is called from a finally clause). Exceptions raised from __exit__() are considered errors/bugs in __exit__() and should be avoided by robust __exit__() methods.
(4) Like (3), but also distinguish between non-local gotos (break/continue/return), exceptions, and normal completion of BLOCK. (I'm not sure that this is really different from (3).)
What do you think? -- --Guido van Rossum (home page: http://www.python.org/~guido/)

At 10:00 AM 5/11/2005 -0700, Guido van Rossum wrote:
(3) Full exception information, with the understanding that when __exit__() returns normally, exception processing will resume as usual (i.e. __exit__() is called from a finally clause). Exceptions raised from __exit__() are considered errors/bugs in __exit__() and should be avoided by robust __exit__() methods.
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.:

    class Attempt(object):
        def __init__(self, type, last=False):
            self.type = type
            self.last = last
        def __enter__(self):
            pass
        def __exit__(self, *exc):
            if exc and not self.last and issubclass(exc[0], self.type):
                # suppress exception
                return True

    def retry(count, type=Exception):
        attempt = Attempt(type)
        for i in range(count-1):
            yield attempt
        yield Attempt(type, True)

    # usage:
    for attempt in retry(3):
        do attempt:
            somethingThatCouldFail()

and:

    class logging_exceptions(object):
        def __init__(self, logger):
            self.logger = logger
        def __enter__(self):
            pass
        def __exit__(self, *exc):
            if exc:
                # log and suppress error
                self.logger.error("Unexpected error", exc_info=exc)
                return True

    while True:
        do logging_exceptions(my_logger):
            processEvents()

[Phillip J. Eby]
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.: [...]
Yes, but aren't those written clearer using an explicit try/except? IMO anything that actually stops an exception from propagating outward is worth an explicit try/except clause, so the reader knows what is happening. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

At 10:42 AM 5/11/2005 -0700, Guido van Rossum wrote:
[Phillip J. Eby]
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.: [...]
Yes, but aren't those written clearer using an explicit try/except? IMO anything that actually stops an exception from propagating outward is worth an explicit try/except clause, so the reader knows what is happening.
I thought the whole point of PEP 340 was to allow abstraction and reuse of patterns that currently use "try" blocks, including "except" as well as "finally". So, if you're only going to allow try/finally abstraction, wouldn't it make more sense to call it __finally__ instead of __exit__? That would make it clearer that this is purely for try/finally patterns, and not error handling patterns. As for whether they're written more clearly using an explicit try/except, I don't know. Couldn't you say exactly the same thing about explicit try/finally? For that matter, if you used function calls, doesn't it produce the same issue, e.g.:

    def retry(count, exc_type=Exception):
        def attempt(func):
            try:
                func()
            except exc_type:
                pass
        for i in range(count-1):
            yield attempt
        yield lambda f: f()

    for attempt in retry(3):
        attempt(somethingThatMightFail)

Is this bad style too?

[Phillip J. Eby]
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.: [...]
[Guido]
Yes, but aren't those written clearer using an explicit try/except? IMO anything that actually stops an exception from propagating outward is worth an explicit try/except clause, so the reader knows what is happening.
[Phillip]
I thought the whole point of PEP 340 was to allow abstraction and reuse of patterns that currently use "try" blocks, including "except" as well as "finally".
Indeed it was. But I'm getting a lot of pushback on the PEP so I'm exploring a simpler proposal with a more limited use case -- that of PEP 310, basically. Note that generators written for this new proposal do not contain try/finally or try/except (and don't need it); they simply contain some actions before the yield and some actions after it, and the with/do/block/stmt statement takes care of calling it.
So, if you're only going to allow try/finally abstraction, wouldn't it make more sense to call it __finally__ instead of __exit__? That would make it clearer that this is purely for try/finally patterns, and not error handling patterns.
I don't think the name matters that much; __exit__ doesn't particularly mean "error handling" to me. I think PEP 310 proposed __enter__ and __exit__ and I was just following that; I've also thought of __enter__/__leave__ or even the nostalgic __begin__/__end__.
As for whether they're written more clearly using an explicit try/except, I don't know. Couldn't you say exactly the same thing about explicit try/finally?
For try/finally we have a large body of use cases that just scream for abstraction. I'm not convinced that we have the same for try/except. Maybe the key is this: with try/finally, the control flow is unaffected whether the finally clause is present or not, so hiding it from view doesn't matter much for understanding the code; in fact in my mind when I see a try/finally clause I mentally translate it to something that says "that resource is held for the duration of this block" so I can stop thinking about the details of releasing that resource. try/except, on the other hand, generally changes the control flow, and there is much more variety in the except clause. I don't think the need for abstraction is the same.
For that matter, if you used function calls, doesn't it produce the same issue, e.g.:
    def retry(count, exc_type=Exception):
        def attempt(func):
            try:
                func()
            except exc_type:
                pass
        for i in range(count-1):
            yield attempt
        yield lambda f: f()

    for attempt in retry(3):
        attempt(somethingThatMightFail)
Is this bad style too?
Yes (not to mention that the retry def is unreadable and that the 'attempt' callable feels magical). There are many ways of coding retry loops and seeing the bottom two lines (the use) in isolation doesn't give me an idea of what happens when the 3rd attempt fails. Here, EIBTI. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Phillip J. Eby wrote:
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.:
Maybe __exit__ could suppress exceptions using a new idiom:

    def __exit__(self, *exc):
        if exc and not self.last and issubclass(exc[0], self.type):
            # suppress the exception
            raise None

This seems clearer than "return True". Shane

At 12:44 PM 5/11/2005 -0600, Shane Hathaway wrote:
Phillip J. Eby wrote:
FYI, there are still use cases for clearing the exception state in an __exit__ method, that might justify allowing a true return from __exit__ to suppress the error. e.g.:
Maybe __exit__ could suppress exceptions using a new idiom:
    def __exit__(self, *exc):
        if exc and not self.last and issubclass(exc[0], self.type):
            # suppress the exception
            raise None
This seems clearer than "return True".
Nice, although perhaps a little too cute. But it's moot as Guido has vetoed the whole idea.