Revised^4 PEP on yield-from
Fifth draft of the PEP. Re-worded a few things slightly to hopefully make the proposal a bit clearer up front. Anyone have any further suggested changes before I send it to the pepmeister?

PEP: XXX
Title: Syntax for Delegating to a Subgenerator
Version: $Revision$
Last-Modified: $Date$
Author: Gregory Ewing <greg.ewing@canterbury.ac.nz>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 13-Feb-2009
Python-Version: 2.7
Post-History:

Abstract
========

A syntax is proposed for a generator to delegate part of its operations to another generator. This allows a section of code containing 'yield' to be factored out and placed in another generator. Additionally, the subgenerator is allowed to return with a value, and the value is made available to the delegating generator.

The new syntax also opens up some opportunities for optimisation when one generator re-yields values produced by another.

Proposal
========

The following new expression syntax will be allowed in the body of a generator::

    yield from <expr>

where <expr> is an expression evaluating to an iterable, from which an iterator is extracted. The iterator is run to exhaustion, during which time it behaves as though it were communicating directly with the caller of the generator containing the ``yield from`` expression (the "delegating generator").

When the iterator is another generator, the effect is the same as if the body of the subgenerator were inlined at the point of the ``yield from`` expression. The subgenerator is allowed to execute a ``return`` statement with a value, and that value becomes the value of the ``yield from`` expression.

In terms of the iterator protocol:

* Any values that the iterator yields are passed directly to the caller.

* Any values sent to the delegating generator using ``send()`` are sent directly to the iterator. (If the iterator does not have a ``send()`` method, it remains to be decided whether the value sent in is ignored or an exception is raised.)
* Calls to the ``throw()`` method of the delegating generator are forwarded to the iterator. (If the iterator does not have a ``throw()`` method, the thrown-in exception is raised in the delegating generator.)

* If the delegating generator's ``close()`` method is called, the iterator is finalised before finalising the delegating generator.

* The value of the ``yield from`` expression is the first argument to the ``StopIteration`` exception raised by the iterator when it terminates.

* ``return expr`` in a generator is equivalent to ``raise StopIteration(expr)``.

Formal Semantics
----------------

The statement ::

    result = yield from expr

is semantically equivalent to ::

    _i = iter(expr)
    try:
        _u = _i.next()
        while 1:
            try:
                _v = yield _u
            except Exception, _e:
                if hasattr(_i, 'throw'):
                    _i.throw(_e)
                else:
                    raise
            else:
                if hasattr(_i, 'send'):
                    _u = _i.send(_v)
                else:
                    _u = _i.next()
    except StopIteration, _e:
        _a = _e.args
        if len(_a) > 0:
            result = _a[0]
        else:
            result = None
    finally:
        if hasattr(_i, 'close'):
            _i.close()

Rationale
=========

A Python generator is a form of coroutine, but has the limitation that it can only yield to its immediate caller. This means that a piece of code containing a ``yield`` cannot be factored out and put into a separate function in the same way as other code. Performing such a factoring causes the called function to itself become a generator, and it is necessary to explicitly iterate over this second generator and re-yield any values that it produces.

If yielding of values is the only concern, this is not very arduous and can be performed with a loop such as ::

    for v in g:
        yield v

However, if the subgenerator is to interact properly with the caller in the case of calls to ``send()``, ``throw()`` and ``close()``, things become considerably more complicated. As the formal expansion presented above illustrates, the necessary code is very longwinded, and it is tricky to handle all the corner cases correctly.
In this situation, the advantages of a specialised syntax should be clear.

Generators as Threads
---------------------

A motivating use case for generators being able to return values concerns the use of generators to implement lightweight threads. When using generators in that way, it is reasonable to want to spread the computation performed by the lightweight thread over many functions. One would like to be able to call a subgenerator as though it were an ordinary function, passing it parameters and receiving a returned value.

Using the proposed syntax, a statement such as ::

    y = f(x)

where f is an ordinary function, can be transformed into a delegation call ::

    y = yield from g(x)

where g is a generator. One can reason about the behaviour of the resulting code by thinking of g as an ordinary function that can be suspended using a ``yield`` statement.

When using generators as threads in this way, typically one is not interested in the values being passed in or out of the yields. However, there are use cases for this as well, where the thread is seen as a producer or consumer of items. The ``yield from`` expression allows the logic of the thread to be spread over as many functions as desired, with the production or consumption of items occurring in any subfunction, and the items are automatically routed to or from their ultimate source or destination.

Concerning ``throw()`` and ``close()``, it is reasonable to expect that if an exception is thrown into the thread from outside, it should first be raised in the innermost generator where the thread is suspended, and propagate outwards from there; and that if the thread is terminated from outside by calling ``close()``, the chain of active generators should be finalised from the innermost outwards.

Syntax
------

The particular syntax proposed has been chosen as suggestive of its meaning, while not introducing any new keywords and clearly standing out as being different from a plain ``yield``.
Optimisations
-------------

Using a specialised syntax opens up possibilities for optimisation when there is a long chain of generators. Such chains can arise, for instance, when recursively traversing a tree structure. The overhead of passing ``next()`` calls and yielded values down and up the chain can cause what ought to be an O(n) operation to become O(n\*\*2).

A possible strategy is to add a slot to generator objects to hold a generator being delegated to. When a ``next()`` or ``send()`` call is made on the generator, this slot is checked first, and if it is nonempty, the generator that it references is resumed instead. If it raises StopIteration, the slot is cleared and the main generator is resumed.

This would reduce the delegation overhead to a chain of C function calls involving no Python code execution. A possible enhancement would be to traverse the whole chain of generators in a loop and directly resume the one at the end, although the handling of StopIteration is more complicated then.

Use of StopIteration to return values
-------------------------------------

There are a variety of ways that the return value from the generator could be passed back. Some alternatives include storing it as an attribute of the generator-iterator object, or returning it as the value of the ``close()`` call to the subgenerator. However, the proposed mechanism is attractive for a couple of reasons:

* Using the StopIteration exception makes it easy for other kinds of iterators to participate in the protocol without having to grow extra attributes or a close() method.

* It simplifies the implementation, because the point at which the return value from the subgenerator becomes available is the same point at which StopIteration is raised. Delaying until any later time would require storing the return value somewhere.
Criticisms
==========

Under this proposal, the value of a ``yield from`` expression would be derived in a very different way from that of an ordinary ``yield`` expression. This suggests that some other syntax not containing the word ``yield`` might be more appropriate, but no acceptable alternative has so far been proposed.

It has been suggested that some mechanism other than ``return`` in the subgenerator should be used to establish the value returned by the ``yield from`` expression. However, this would interfere with the goal of being able to think of the subgenerator as a suspendable function, since it would not be able to return values in the same way as other functions.

The use of an argument to StopIteration to pass the return value has been criticised as an "abuse of exceptions", without any concrete justification of this claim. In any case, this is only one suggested implementation; another mechanism could be used without losing any essential features of the proposal.

Alternative Proposals
=====================

Proposals along similar lines have been made before, some using the syntax ``yield *`` instead of ``yield from``. While ``yield *`` is more concise, it could be argued that it looks too similar to an ordinary ``yield`` and the difference might be overlooked when reading code.

To the author's knowledge, previous proposals have focused only on yielding values, and thereby suffered from the criticism that the two-line for-loop they replace is not sufficiently tiresome to write to justify a new syntax. By also dealing with calls to ``send()``, ``throw()`` and ``close()``, this proposal provides considerably more benefit.

Copyright
=========

This document has been placed in the public domain.

..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End:
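As a quick sanity check of the semantics specified above, here is a minimal sketch written with the proposed syntax itself (runnable in Python 3, where ``yield from`` is available); it exercises pass-through yields, ``send()`` forwarding, and the return-value rule:

```python
def subgen():
    x = yield 1          # receives values sent into the delegator
    y = yield x + 1
    return (x, y)        # per the proposal: raise StopIteration((x, y))

def delegator():
    result = yield from subgen()   # result is subgen's return value
    yield result

g = delegator()
print(next(g))      # 1 -- yielded by subgen, passed straight to the caller
print(g.send(10))   # 11 -- send() is forwarded directly to subgen
print(g.send(20))   # (10, 20) -- the value of the yield from expression
```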
Greg Ewing <greg.ewing@...> writes:
Using a specialised syntax opens up possibilities for optimisation when there is a long chain of generators. Such chains can arise, for instance, when recursively traversing a tree structure. The overhead of passing ``next()`` calls and yielded values down and up the chain can cause what ought to be an O(n) operation to become O(n\*\*2).
It should be relatively easy to avoid O(n**2) behaviour when traversing a tree, so I find this argument quite artificial.
It has been suggested that some mechanism other than ``return`` in the subgenerator should be used to establish the value returned by the ``yield from`` expression. However, this would interfere with the goal of being able to think of the subgenerator as a suspendable function, since it would not be able to return values in the same way as other functions.
The problem I have with allowing "return" in generators is that it makes things much more confusing (try explaining to a beginner that he is allowed to return a value from a generator, but that the value can't be retrieved through any conventional means: a "for" loop, a builtin function or method consuming the iterator, etc.).

I can imagine it being useful in some Twisted-like situation: the inner generator would first yield a bunch of intermediate Deferreds to wait for the completion of some asynchronous thing, and then return the final Deferred for its caller to retrieve. But I think it wouldn't need the "yield from" construct to function, just the "return" thing.

Regards

Antoine.
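The point about conventional consumers can be made concrete (Python 3 spelling, where a generator may carry a return value, is used here for illustration):

```python
def gen():
    yield 1
    yield 2
    return "hidden"    # attached to the StopIteration, never yielded

# every conventional consumer silently drops the return value:
print(list(gen()))     # [1, 2]

# retrieving it requires catching StopIteration by hand:
g = gen()
items = []
try:
    while True:
        items.append(next(g))
except StopIteration as e:
    result = e.value
print(items, result)   # [1, 2] hidden
```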
Antoine Pitrou wrote:
It should be relatively easy to avoid O(n**2) behaviour when traversing a tree,
How?
The problem I have with allowing "return" in generators is that it makes things much more confusing (try explaining to a beginner that he is allowed to return a value from a generator, but that the value can't be retrieved through any conventional means
I don't think it will be any harder than explaining why they get a syntax error if they try to return something from a generator at present. -- Greg
Greg Ewing <greg.ewing@...> writes:
It should be relatively easy to avoid O(n**2) behaviour when traversing a tree,
How?
By doing the traversal iteratively rather than recursively. Well, I admit the following function took a couple of attempts to get right:

    def traverse_depth_first(tree):
        stack = []
        yield tree.value
        it = iter(tree.children)
        while True:
            try:
                child = it.next()
            except StopIteration:
                if not stack:
                    raise
                it, tree = stack.pop()
            else:
                stack.append((it, tree))
                tree = child
                yield tree.value
                it = iter(tree.children)
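For what it's worth, the function above does produce a depth-first, pre-order walk. Modernised slightly so it runs today (next(it) for it.next(), and a plain return instead of re-raising StopIteration, which current Python disallows inside generators), and with a small stand-in Tree class invented just for the check:

```python
class Tree:
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)

def traverse_depth_first(tree):
    # Antoine's iterative version, in modern spelling
    stack = []
    yield tree.value
    it = iter(tree.children)
    while True:
        try:
            child = next(it)
        except StopIteration:
            if not stack:
                return           # was: raise (re-raise StopIteration)
            it, tree = stack.pop()
        else:
            stack.append((it, tree))
            tree = child
            yield tree.value
            it = iter(tree.children)

t = Tree(1, [Tree(2, [Tree(3), Tree(4)]), Tree(5)])
print(list(traverse_depth_first(t)))   # [1, 2, 3, 4, 5]
```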
Antoine Pitrou wrote:
By doing the traversal iteratively rather than recursively. Well, I admit the following function took a couple of attempts to get right:
It's also a totally unreasonable amount of obfuscation to endure just to be able to traverse the tree with a generator. -- Greg
Greg Ewing <greg.ewing@...> writes:
It's also a totally unreasonable amount of obfuscation to endure just to be able to traverse the tree with a generator.
Greg, I find this qualification ("obfuscation") a bit offensive... It's certainly a matter of taste, and, while it's less straightforward than an explicitly recursive traversal, I don't find that particular chunk of code obfuscated at all. To me, it's not harder to understand than the various examples of "yield from" use you have posted to justify that feature.

(and, actually, I don't understand how "yield from" helps for a depth-first traversal. Could you post an example of it?)

Regards

Antoine.
Antoine Pitrou wrote:
(and, actually, I don't understand how "yield from" helps for a depth-first traversal. Could you post an example of it?)

Antoine, I expect something like:
    def traverse_depth_first(tree):
        yield tree.value
        for child in tree.children:
            yield from traverse_depth_first(child)

to be semantically equivalent and *much* easier to read than your version. If we use the expansion listed in the PEP as the implementation of "yield from", we have the O(n**2) performance mentioned. I *know* we can do better than that, but I don't (yet) know enough about the python internals to tell you how.

I am +1 on the PEP assuming we find a way around the O(n**2) behavior, +0.75 if not :)

Regards

Jacob
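With a throwaway Tree class (invented here just to make the comparison runnable), Jacob's version can be checked directly in Python 3, where ``yield from`` exists:

```python
class Tree:
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)

def traverse_depth_first(tree):
    # the recursive yield-from version from the message above
    yield tree.value
    for child in tree.children:
        yield from traverse_depth_first(child)

t = Tree(1, [Tree(2, [Tree(3)]), Tree(4)])
print(list(traverse_depth_first(t)))   # [1, 2, 3, 4]
```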
Antoine Pitrou wrote:
Greg, I find this qualification ("obfuscation") a bit offensive...
Sorry, I don't mean that personally. The fact is that it does look obfuscated to my eyes, and I'd be surprised if I were the only person who thinks so.
(and, actually, I don't understand how "yield from" helps for a depth-first traversal. Could you post an example of it?)
Traversing a binary tree with a non-generator:

    def traverse(node):
        if node:
            process_node(node)
            traverse(node.left)
            traverse(node.right)

Traversing it with a generator:

    def traverse(node):
        if node:
            yield process_node(node)
            yield from traverse(node.left)
            yield from traverse(node.right)

Do you still think an unrolled version would be equally clear? If so, you have extremely different tastes from me!

-- Greg
Greg Ewing <greg.ewing@...> writes:
Traversing it with a generator:
    def traverse(node):
        if node:
            yield process_node(node)
            yield from traverse(node.left)
            yield from traverse(node.right)
Do you still think an unrolled version would be equally clear? If so, you have extremely different tastes from me!
Of course, I admit the "yield from" version is simpler :) However, if there isn't a specialized optimization in the interpreter, it will also probably be slower (because it switches between frames a lot, which is likely expensive, although I don't know of any timings).

Besides, my point was to show that you didn't /need/ "yield from" to write a linear traversal generator, and the 15 or so lines of that generator are sufficiently generic to be reused from project to project.

(my opinion on your PEP being that it brings the complication inside the interpreter itself, especially if you want to implement the feature in an optimized way. I haven't read the scheduler example yet, though...)

Regards

Antoine.
Antoine Pitrou wrote:
However, if there isn't a specialized optimization in the interpreter, it will also probably be slower (because it switches between frames a lot
That's true, but I'm 99.9% sure that if it's implemented at all then it will be implemented fairly efficiently, because doing so is actually easier than implementing it inefficiently. :-)
(my opinion on your PEP being that it brings the complication inside the interpreter itself, especially if you want to implement the feature in an optimized way.
Bringing it into the interpreter is what makes it possible, and fairly straightforward, to implement it efficiently. Hopefully this will become clearer when I get a reference implementation going. -- Greg
On Thu, Feb 19, 2009 at 4:48 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Antoine Pitrou wrote:
However, if there isn't a specialized optimization in the interpreter, it will also probably be slower (because it switches between frames a lot
That's true, but I'm 99.9% sure that if it's implemented at all then it will be implemented fairly efficiently, because doing so is actually easier than implementing it inefficiently. :-)
Hey Greg, I think your efforts now should be focused on the implementation and not on continued arguing with unbelievers. Code speaks. -- --Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, Feb 19, 2009 at 5:53 PM, Guido van Rossum <guido@python.org> wrote:
Hey Greg, I think your efforts now should be focused on the implementation and not on continued arguing with unbelievers. Code speaks.
Speaking of code...

    # trampoline implementation

    class trampoline(object):
        def __init__(self, func, instance=None):
            self.func = func
            self.instance = instance

        def __get__(self, obj, type=None):
            return trampoline(self.func, obj)

        def __call__(self, *args, **kwargs):
            if self.instance is not None:
                args = (self.instance,) + args
            return trampolineiter(self.func(*args, **kwargs))

    class trampolineiter(object):
        def __init__(self, iterable):
            self.stack = [iterable]

        def __iter__(self):
            return self

        def next(self):
            while True:
                try:
                    x = self.stack[-1].next()
                except StopIteration:
                    self.stack.pop(-1)
                    if not self.stack:
                        raise
                else:
                    if isinstance(x, trampolineiter):
                        assert len(x.stack) == 1
                        self.stack.append(x.stack[0])
                    elif isinstance(x, leaf):
                        return x.data
                    else:
                        raise TypeError("Unexpected type yielded to trampoline")

    class leaf(object):
        def __init__(self, data):
            self.data = data

    # Example usage

    class Tree(object):
        def __init__(self, name, left, right):
            self.name = name
            self.left = left
            self.right = right

        @trampoline
        def __iter__(self):
            yield leaf(self)
            if self.left is not None:
                yield iter(self.left)
            if self.right is not None:
                yield iter(self.right)
    >>> mytree = Tree(0, Tree(1, None, Tree(2, None, None)), Tree(3, None, None))
    >>> for node in mytree:
    ...     print "Found:", node.name
    ...
    Found: 0
    Found: 1
    Found: 2
    Found: 3
-- Adam Olsen, aka Rhamphoryncus
On Thu, Feb 19, 2009 at 5:24 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Traversing a binary tree with a non-generator:
def traverse(node): if node: process_node(node) traverse(node.left) traverse(node.right)
Traversing it with a generator:
def traverse(node): if node: yield process_node(node) yield from traverse(node.left) yield from traverse(node.right)
Do you still think an unrolled version would be equally clear? If so, you have extremely different tastes from me!
This is a pretty good example, IMO. However, I'd like to see what a trampoline would look like to support something like this:

    @trampoline
    def traverse(node):
        if node:
            yield leaf(process_node(node))
            yield traverse(node.left)
            yield traverse(node.right)

If the use case is sufficiently common we can consider putting such a trampoline in the stdlib. If not it should at least go in the cookbook. And FWIW, a C implementation of such a trampoline should be almost identical to what the PEP proposes. It's just substituting a type check for the new syntax.

-- Adam Olsen, aka Rhamphoryncus
On Thu, Feb 19, 2009 at 2:12 AM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Fifth draft of the PEP. Re-worded a few things slightly to hopefully make the proposal a bit clearer up front.
Wow, how I long for the days when we routinely put things like this under revision control so its easy to compare versions. -- --Guido van Rossum (home page: http://www.python.org/~guido/)
[Aside to Guido: Oops, I think I accidentally sent you a contentless reply. Sorry!] As a suggestion, I think this is relevant to everybody who might be writing a PEP, so I'm cross-posting to Python-Dev. Probably no discussion is needed, but Reply-To is set to Python-Ideas. On Python-Ideas, Guido van Rossum writes:
On Thu, Feb 19, 2009 at 2:12 AM, Greg Ewing wrote:
Fifth draft of the PEP. Re-worded a few things slightly to hopefully make the proposal a bit clearer up front.
Wow, how I long for the days when we routinely put things like this under revision control so its easy to compare versions.
FWIW, Google Docs is almost there. Working with Brett et al on early drafts of PEP 0374 was easy and pleasant, and Google Docs gives control of access to the document to the editor, not the Subversion admin. The ability to make comments that are not visible to non-editors was nice.

Now that it's in Subversion it's much less convenient for me (a non-committer). I actually have to *decide* to work on it, rather than simply raising a browser window, hitting "refresh" and fixing a typo or two (then back to "day job" work).

The main problem with Google Docs is that it records a revision automatically every so often (good) but doesn't prune the automatic commits (possibly hard to do efficiently) OR mark user saves specially (easy to do). This lack of marking "important" revisions makes the diff functionality kind of tedious.

I don't know how automatic the conversion to reST was, but the PEP in Subversion is a quite accurate conversion of the Google Doc version.

Overall, I recommend use of Google Docs for "Python-Ideas" level of PEP drafts.
On Thu, Feb 19, 2009 at 7:17 PM, Stephen J. Turnbull <turnbull@sk.tsukuba.ac.jp> wrote:
On Python-Ideas, Guido van Rossum writes:
On Thu, Feb 19, 2009 at 2:12 AM, Greg Ewing wrote:
Fifth draft of the PEP. Re-worded a few things slightly to hopefully make the proposal a bit clearer up front.
Wow, how I long for the days when we routinely put things like this under revision control so its easy to compare versions.
FWIW, Google Docs is almost there. Working with Brett et al on early drafts of PEP 0374 was easy and pleasant, and Google Docs gives control of access to the document to the editor, not the Subversion admin. The ability to make comments that are not visible to non-editors was nice. Now that it's in Subversion it's much less convenient for me (a non-committer). I actually have to *decide* to work on it, rather than simply raising a browser window, hitting "refresh" and fixing a typo or two (then back to "day job" work).
The main problem with Google Docs is that it records a revision automatically every so often (good) but doesn't prune the automatic commits (possibly hard to do efficiently) OR mark user saves specially (easy to do). This lack of marking "important" revisions makes the diff functionality kind of tedious.
I don't know how automatic the conversion to reST was, but the PEP in Subversion is a quite accurate conversion of the Google Doc version.
Overall, I recommend use of Google Docs for "Python-Ideas" level of PEP drafts.
Rietveld would also be a good option: it offers more at-will revision control (rather than "whenever Google Docs decides"), allows you to attach comments to the revisions, and will give you nice diffs between PEP iterations. Collin
On Thu, Feb 19, 2009 at 10:17 PM, Stephen J. Turnbull <turnbull@sk.tsukuba.ac.jp> wrote:
Overall, I recommend use of Google Docs for "Python-Ideas" level of PEP drafts.
+1! I also like Google Sites for collaborative editing. -- Cheers, Leif
Greg Ewing <greg.ewing@...> writes:
Use of StopIteration to return values -------------------------------------
Why not a dedicated exception (e.g. GeneratorReturn) instead? Two advantages to doing so:

* people mistakenly doing a "for" loop over such a generator would be reminded that they are missing something (the return value)

* you could take advantage of existing iterator-consuming features (e.g. "yield from map(str, innergenerator())"), since they would just forward the exception instead of silencing it

Regards

Antoine.
On Thu, Feb 19, 2009 at 7:28 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
Greg Ewing <greg.ewing@...> writes:
Use of StopIteration to return values -------------------------------------
Why not a dedicated exception (e.g. GeneratorReturn) instead? Two advantages to doing so:

* people mistakenly doing a "for" loop over such a generator would be reminded that they are missing something (the return value)

* you could take advantage of existing iterator-consuming features (e.g. "yield from map(str, innergenerator())"), since they would just forward the exception instead of silencing it
Seconded -- but I would make it inherit from StopIteration so that the for-loop (unless modified) would just ignore it. -- --Guido van Rossum (home page: http://www.python.org/~guido/)
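That behaviour is easy to sketch with a plain iterator class (GeneratorReturn here is just the name floated in this thread, not a real builtin; modern Python 3 spelling):

```python
class GeneratorReturn(StopIteration):
    """Hypothetical dedicated exception, subclassing StopIteration."""

class CountToThree:
    def __init__(self):
        self.i = 0
    def __iter__(self):
        return self
    def __next__(self):
        if self.i == 3:
            raise GeneratorReturn("done")  # carries the "return value"
        self.i += 1
        return self.i

# an unmodified for loop (or list()) treats the subclass exactly
# like StopIteration and ignores the attached value:
print(list(CountToThree()))   # [1, 2, 3]

# code that cares can catch it specifically and read the value:
it = CountToThree()
collected = []
try:
    while True:
        collected.append(next(it))
except GeneratorReturn as e:
    print(collected, e.args[0])   # [1, 2, 3] done
```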
Guido van Rossum wrote:
Seconded -- but I would make it inherit from StopIteration so that the for-loop (unless modified) would just ignore it.
But is there really any good reason to use a different exception? Currently, 'return' without a value in an ordinary function is equivalent to 'return None'. If this is done, they wouldn't be equivalent in generators, since they would raise different exceptions. -- Greg
Antoine Pitrou wrote:
Why not a dedicated exception (e.g. GeneratorReturn) instead? Two advantages to doing so: * people mistakingly doing a "for" loop over such a generator would be reminded that they are missing something (the return value)
You don't get a warning that you are "missing something" if you ignore the return value from an ordinary function call, so I don't find this argument very convincing.
* you could take advantage of existing iterator-consuming features (e.g. "yield from map(str, innergenerator())"), since they would just forward the exception instead of silencing it
You also don't get ordinary return values of functions that you call forwarded to your caller, and I don't see how it would be any more useful to do so here.

There is one possible reason it might be useful, and that's if you want to catch a StopIteration that may or may not have a value attached, i.e.

    try:
        # some iteration thing
    except GeneratorReturn, e:
        result = e.args[0]
    except StopIteration:
        result = None

However, that would be better addressed by enhancing StopIteration with a 'value' attribute that is None if it wasn't given an argument.

-- Greg
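As it happens, this is exactly the shape Python 3's StopIteration ended up with: a value attribute holding the return value, defaulting to None when the generator ends without one. A minimal check:

```python
def gen():
    yield 1
    return 42          # attached to the StopIteration

def plain():
    yield 1            # bare end of generator: no return value

g = gen()
next(g)
try:
    next(g)
except StopIteration as e:
    print(e.value)     # 42

g = plain()
next(g)
try:
    next(g)
except StopIteration as e:
    print(e.value)     # None
```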
On Thu, Feb 19, 2009 at 5:12 AM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
* ``return expr`` in a generator is equivalent to ``raise StopIteration(expr)``.
It seems to me equivalence here might not be what you want. This parallel does not exist today between "return" and "raise StopIteration()", where the former can't be intercepted and blocked by a try/except block, but the latter can. I think it would be confusing for a return statement to be swallowed by code intended as an error handler. Greg F
On Thu, Feb 19, 2009 at 8:16 AM, Greg Falcon <veloso@verylowsodium.com> wrote:
On Thu, Feb 19, 2009 at 5:12 AM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
* ``return expr`` in a generator is equivalent to ``raise StopIteration(expr)``.
It seems to me equivalence here might not be what you want.
This parallel does not exist today between "return" and "raise StopIteration()", where the former can't be intercepted and blocked by a try/except block, but the latter can.
Technically, 'return' is treated as an uncatchable exception -- but an exception nevertheless, since you *do* get to intercept it with try/finally.
I think it would be confusing for a return statement to be swallowed by code intended as an error handler.
Only marginally though, since once the generator returns, it *does* raise StopIteration. But all in all I agree it would be better to keep the existing return semantics and only turn it into StopIteration(expr) after all try/except and try/finally blocks have been left -- IOW at the moment the frame is being cleared up. That would minimize surprises IMO. -- --Guido van Rossum (home page: http://www.python.org/~guido/)
Greg Falcon wrote:
It seems to me equivalence here might not be what you want.
This parallel does not exist today between "return" and "raise StopIteration()", where the former can't be intercepted and blocked by a try/except block, but the latter can.
Hmmm, you're right, it's not exactly equivalent. I'll adjust the wording of that part -- it's not my intention to make returns catchable. -- Greg
participants (9)

- Adam Olsen
- Antoine Pitrou
- Collin Winter
- Greg Ewing
- Greg Falcon
- Guido van Rossum
- Jacob Holm
- Leif Walsh
- Stephen J. Turnbull