
Hi,

I'm trying to follow the discussion about PEP 479 (Change StopIteration handling inside generators), but it's hard to read all the messages. I'm concerned about trollius and asyncio, which rely heavily on StopIteration. Trollius currently supports running asyncio coroutines: a trollius coroutine can execute an asyncio coroutine, and an asyncio coroutine can execute a trollius coroutine.

I modified the Return class of Trollius to not inherit from StopIteration. All trollius tests pass on Python 3.3 except one (which makes me happy, the test suite is wide enough to detect bugs ;-)): test_trollius_in_asyncio. This specific test executes an asyncio coroutine which executes a trollius coroutine. https://bitbucket.org/enovance/trollius/src/873d21ac0badec36835ed24d13e2aeda...

The problem is that an asyncio coroutine cannot execute a Trollius coroutine anymore: "yield from coro" raises a Return exception instead of simply "stopping" the generator and returning the result (the value passed to Return). I don't see how an asyncio coroutine calling "yield from trollius_coroutine" can handle the Return exception if it doesn't inherit from StopIteration. Does it mean that I have to drop this feature in Python 3.5 (or later, when PEP 479 becomes effective)?

I'm talking about the current behaviour of Python 3.3; I didn't try PEP 479 (I don't know if an implementation exists).

Victor

On Thu, Nov 27, 2014 at 10:08 AM, Victor Stinner <victor.stinner@gmail.com> wrote:
The issue here is that asyncio only interprets StopIteration as returning from the generator (with a possible value), while a Trollius coroutine must use "raise Return(<value>)" to specify a return value; this works as long as Return is a subclass of StopIteration, but PEP 479 will break this by replacing the StopIteration with RuntimeError.

It's an interesting puzzle. The only way out I can think of is to have asyncio special-case the Return exception -- we could do that by defining a new exception (e.g. AlternateReturn) in asyncio that gets treated the same way as StopIteration, so that Trollius can inherit from AlternateReturn (if it exists). What do you think?

-- 
--Guido van Rossum (python.org/~guido)
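[A minimal sketch of how the AlternateReturn idea could look. The class names come from Guido's suggestion, but the driving loop below is an illustrative stand-in, not asyncio's actual Task._step logic:]

```python
class AlternateReturn(Exception):
    """Hypothetical exception the event loop would treat like StopIteration."""
    def __init__(self, value=None):
        super().__init__(value)
        self.value = value

class Return(AlternateReturn):
    """What Trollius's Return could then inherit from."""
    pass

def drive(gen):
    # Minimal stand-in for a task-stepping loop that special-cases
    # both StopIteration and AlternateReturn as "coroutine finished".
    while True:
        try:
            next(gen)
        except StopIteration as exc:
            return exc.value
        except AlternateReturn as exc:
            return exc.value

def coro():
    if 0:
        yield
    raise Return(42)  # not a StopIteration subclass, so PEP 479 is happy

print(drive(coro()))  # 42
```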

2014-11-27 20:06 GMT+01:00 Guido van Rossum <guido@python.org>:
The issue here is that asyncio only interprets StopIteration as returning from the generator (with a possible value),
I'm not sure that the issue is directly related to asyncio. trollius_coro() raises a StopIteration to return the result to the caller. The caller is "result = yield from coro", not the complex Task._step() method. So it's pure Python, unless I missed something.
I don't see how it would work. Here is a simplified example of my issue. You need to modify all "yield from coro" to write instead "yield from catch_return(coro)", or I missed something important.
---
PEP479 = True

if not PEP479:
    # trollius: no need for catch_return() before the PEP 479
    class Return(StopIteration):
        pass
else:
    # PEP 479: need catch_return()
    class Return(Exception):
        def __init__(self, value):
            self.value = value

def return_value(value):
    if 0:
        yield
    raise Return(value)

def catch_return(gen):
    try:
        value = (yield from gen)
    except Return as exc:
        return exc.value

def add_one(gen):
    value = (yield from gen)
    return value + 1

def consume_generator(gen):
    while True:
        try:
            next(gen)
        except StopIteration as exc:
            return exc.value

gen1 = return_value(3)
if PEP479:
    gen1 = catch_return(gen1)
gen2 = add_one(gen1)
print(consume_generator(gen2))
---
Victor

2014-11-27 22:54 GMT+01:00 Victor Stinner <victor.stinner@gmail.com>:
I don't see how it would work.
If it cannot be fixed, would it make sense to allow trollius to continue to work as it currently does, with something like "from __past__ import generator_dont_stop"?

When I talked with a friend about the transition from Python 2 to Python 3, he asked me why there was no "from __past__ import division". He wants to add this to his code so he doesn't have to worry that a division may fail "somewhere" in his code.

Maybe it would ease upgrades to newer versions of Python if we considered keeping the old behaviour for people who don't have time to port their old code (for no immediate benefit), but need to upgrade because newer OSes only provide newer versions of Python. (What is the cost of keeping the old behaviour: maintaining the code, and runtime overhead?)

Victor

On 28 November 2014 at 08:09, Victor Stinner <victor.stinner@gmail.com> wrote:
I think between contextlib and Trollius, the case is starting to be made for raising an UnhandledStopIteration subclass of RuntimeError, rather than a generic RuntimeError. We have at least two known cases now where code that works with generators-as-coroutines has a valid reason for wanting to distinguish "arbitrary runtime error" from "unhandled StopIteration exception". While catching RuntimeError and looking for StopIteration in __cause__ *works*, it feels messier and harder to explain than just naming the concept by giving it a dedicated exception type.

Trollius would still need an adapter to be called from asyncio, though. Something like:

    def implicit_stop(g):
        try:
            yield from g
        except UnhandledStopIteration as exc:
            return exc.__cause__.value

Then Victor's example would become:

    class Return(StopIteration):
        pass

    def return_value(value):
        if 0:
            yield
        raise Return(value)

    def add_one(gen):
        value = (yield from gen)
        return value + 1

    def consume_generator(gen):
        while True:
            try:
                next(gen)
            except StopIteration as exc:
                return exc.value

    gen1 = return_value(3)
    if PEP479:
        gen1 = implicit_stop(gen1)
    gen2 = add_one(gen1)
    print(consume_generator(gen2))
The main problem with *never* deprecating anything is an ever-increasing cognitive burden in learning the language, as well as losing the ability to read code in isolation without knowing what flags are in effect. Currently, folks that only work in Python 3 don't need to know how division worked in Python 2, or that print was ever a statement, etc. If those old behaviours could be selectively turned back on, then everyone would still need to learn them, and you couldn't review code in isolation any more: there may be a __past__ import at the top of the module making it do something different.

If organisations really want to let their code bitrot (and stay on the treadmill of big expensive high risk updates every decade or so), they can, but they have to do it by running on old versions of Python as well - that gives maintainers a clear understanding that if they want to understand the code, they have to know how Python X.Y worked, rather than being able to assume modern Python.

Regards, Nick.

-- 
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

2014-11-28 3:49 GMT+01:00 Nick Coghlan <ncoghlan@gmail.com>:
I modified Trollius to test this idea:

* Return inherits from Exception (not from StopIteration)
* on Python 3, @trollius.coroutine wraps the coroutine to catch RuntimeError: if exc.__context__ is a StopIteration, return exc.__context__.value

The test suite passes with this additional coroutine wrapper on Python 3.5 patched with pep479.patch (and on unpatched Python 3.3). So yes, it may help to have a new specialized exception, even if "it works" with RuntimeError. The drawback is that a new layer would make trollius even slower.

Victor
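[A sketch of the wrapper Victor describes, under PEP 479 semantics (the default since Python 3.7): a StopIteration escaping a generator frame surfaces as RuntimeError with the original exception chained, and the wrapper recovers the return value. This is an illustrative reconstruction, not Trollius's actual code:]

```python
import functools

def coroutine(func):
    # PEP 479 turns an escaping StopIteration into RuntimeError; this
    # wrapper recovers the legacy "raise Return(value)" return value
    # from the chained exception.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            value = yield from func(*args, **kwargs)
        except RuntimeError as exc:
            if isinstance(exc.__context__, StopIteration):
                return exc.__context__.value
            raise
        return value
    return wrapper

@coroutine
def legacy_coro():
    if 0:
        yield
    raise StopIteration(7)  # legacy Trollius-style return

def run(gen):
    while True:
        try:
            next(gen)
        except StopIteration as exc:
            return exc.value

print(run(legacy_coro()))  # 7
```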

Off-topic: not about asyncio, but related to the PEP and other things being discussed in this thread.

On 11/28/14, Victor Stinner <victor.stinner@gmail.com> wrote:
2014-11-28 3:49 GMT+01:00 Nick Coghlan <ncoghlan@gmail.com>:
[...]
So yes, it may help to have a new specialized exception, even if "it works" with RuntimeError.
This is roughly the situation I tried to explain in another thread about PEP 479 (though I did not use the right words), and it will be a very common situation in practice.
The drawback is that a new layer would make trollius even slower.
e.g. in a (private) library I wrote for a company, which is basically about composition of generators, there is a situation similar to what Victor explained in this thread. I would mostly end up doing one of a couple of things:

{{{#!py
# inside generator function body
try:
    ...
except RuntimeError:
    return
}}}

which over-complicates function definitions and introduces a long chain of (redundant) exception handling code just to end up raising StopIteration once again (i.e. poor performance), or:

{{{#!py
# decorate functions in the public API
# ... may be improved but you get the idea
def myown_stopiter(f):
    def wrapper(*args, **kwargs):
        ...
        try:
            ...
        except RuntimeError as exc:
            if isinstance(exc.args[0], StopIteration):
                raise StopIteration  # exc.args[0] ?
            else:
                raise
        ...
    return wrapper
}}}

which is actually a re-implementation of exception matching itself. Otherwise:

{{{#!py
# in generator definition
# rather than natural syntax for defining sequence logic
raise MyOwnException(...)

# decorate functions in the public API
# ... may be improved but you get the idea
def myown_stopiter(f):
    def wrapper(*args, **kwargs):
        ...
        try:
            ...
        except MyOwnException:
            raise StopIteration
        ...
    return wrapper
}}}

In the last two cases the library ends up having two functions: the one that allows (MyOwnException | RuntimeError) to bubble up (only used for defining compositions), and the one that translates the exception (which *should* not be used for compositions, even if it would work, because of performance penalties), thus leading to further complications at the API level.

Built-in behavior consisting in raising a subclass of RuntimeError is a much better approach, similar to the second case mentioned above. This might definitely help to make the process of rewriting things to cope with incompatibilities caused by PEP 479 less painful, but afaict the performance issues will be there for a while.
-- Regards, Olemis - @olemislc Apache(tm) Bloodhound contributor http://issues.apache.org/bloodhound http://blood-hound.net Blog ES: http://simelo-es.blogspot.com/ Blog EN: http://simelo-en.blogspot.com/ Featured article:

correction ... On 11/28/14, Olemis Lang <olemis@gmail.com> wrote:
... should be:

{{{#!py
# inside generator function body
try:
    ...
except StopIteration:
    return
}}}

[...]

-- 
Regards, Olemis - @olemislc Apache(tm) Bloodhound contributor http://issues.apache.org/bloodhound http://blood-hound.net Blog ES: http://simelo-es.blogspot.com/ Blog EN: http://simelo-en.blogspot.com/ Featured article:

@Victor: I'm glad you found a work-around. Maybe you can let your users control it with a flag? It is often true that straddling code pays a performance cost. Hopefully the slight performance dip might be an incentive for people to start thinking about porting to asyncio.

@Olemis: You never showed examples of how your code would be used, so it's hard to understand what you're trying to do and how PEP 479 affects you.

On Fri, Nov 28, 2014 at 7:21 AM, Olemis Lang <olemis@gmail.com> wrote:
-- --Guido van Rossum (python.org/~guido)

On 11/28/14, Guido van Rossum <guido@python.org> wrote: [...]
@Olemis: You never showed examples of how your code would be used, so it's hard to understand what you're trying to do and how PEP 479 affects you.
The intention is not to restart the debate. The PEP is approved, it's done... but...

<comment>
As a side-effect, beware of the consequences: it is a fact that performance will be degraded (under certain circumstances) due to either a chain of (SI = StopIteration)

raise SI => except SI: return => raise SI => ...

... or a few other similar cases which I will not describe for the sake of not repeating myself and being brief.
</comment>

-- 
Regards, Olemis - @olemislc Apache(tm) Bloodhound contributor http://issues.apache.org/bloodhound http://blood-hound.net Blog ES: http://simelo-es.blogspot.com/ Blog EN: http://simelo-en.blogspot.com/ Featured article:

On 30 November 2014 at 02:45, Olemis Lang <olemis@gmail.com> wrote:
Guido wrote a specific micro-benchmark for that case in one of the other threads. On his particular system, the overhead was around 150 ns per link in the chain at the point the data processing pipeline was shut down. In most scenarios where a data processing pipeline is worth setting up in the first place, the per-item handling costs (which won't change) are likely to overwhelm the shutdown costs (which will get marginally slower). Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
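[Guido's actual micro-benchmark is not reproduced in this thread, but the shape of the measurement Nick describes can be sketched; the helper names below are illustrative:]

```python
import timeit

def passthrough(src):
    # One pipeline stage: per-item cost on every item, plus one
    # StopIteration propagation per stage when the source is exhausted.
    for item in src:
        yield item

def pipeline(n_stages, n_items):
    gen = iter(range(n_items))
    for _ in range(n_stages):
        gen = passthrough(gen)
    # Consuming the pipeline pays the per-item cost n_items times per
    # stage, and the chained shutdown cost only once per stage at the end.
    return sum(gen)

print(pipeline(10, 1000))  # 499500
# Timing the whole run shows per-item cost dominating shutdown cost:
t = timeit.timeit(lambda: pipeline(10, 1000), number=100)
```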

On Sat, Nov 29, 2014 at 9:07 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
If I hadn't written that benchmark I wouldn't recognize what you're talking about here. :-) This is entirely off-topic, but if I didn't know it was about one generator calling next() to iterate over another generator, I wouldn't have understood what pattern you refer to as a data processing pipeline. And I still don't understand how the try/except *setup* cost became *shut down* cost of the pipeline. But that doesn't matter, since the number of setups equals the number of shut downs. -- --Guido van Rossum (python.org/~guido)

On Fri, Nov 28, 2014 at 8:54 AM, Victor Stinner <victor.stinner@gmail.com> wrote:
This is one known significant backward-incompatibility issue with this PEP - it'll be difficult to make this work on Python 2.7, where "return value" would be a problem, and 3.7, where "raise StopIteration" would be a problem. At present, I don't know of a solution to this. In 3.x-only code, you could simply use 'return value' directly; in 2.7 code, StopIteration doesn't seem to even *have* a .value attribute (and .args[0] has to be used instead).

But I don't like the idea of a "from __past__" directive. It means backward-compatibility code has to be maintained through eternity (otherwise it just shifts the problem to "why did you remove my __past__ directive, I want a from __paster__ import division"), which means both the Python implementation code (every Python, not just CPython) needs to cope, *and* everyone who reads Python code needs to cope. For python-list, Stack Overflow, and other such coding help places, this means more questions to ask about a piece of code. For real-world usage, it means scanning back up to the top of the file every time you read something that's been affected by a __past__ directive.

Plus, which __future__ directives need __past__ equivalents? Personally, I wouldn't bother making "from __past__ import lack_of_with_statement", but your friend is wanting "division", and I'm sure "print_statement" would be wanted... and, this is the one that'd split everyone and put the sides to war: "bytes_literals". Personally, I would want python-dev to say "There will NEVER be a from __past__ import bytes_literals directive", but there are going to be others who say "But my code would be so much cleaner AND faster if you do!", and IMO this is a good reason to avoid having any __past__ directives at all.

ChrisA
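[The 2.7 attribute difference Chris mentions can be sketched like this; the Return class is an illustrative straddling helper, not Trollius code:]

```python
class Return(StopIteration):
    """Return-value carrier intended to work on both 2.7 and 3.x."""
    def __init__(self, value=None):
        StopIteration.__init__(self, value)
        # 2.7's StopIteration exposes the value only via .args[0];
        # setting .value explicitly makes both spellings work.
        self.value = value

try:
    raise Return(42)  # raised outside a generator, so PEP 479 is not involved
except StopIteration as exc:
    # On 3.x exc.value works; on 2.7 you would fall back to exc.args[0].
    print(exc.value, exc.args[0])  # 42 42
```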

Guido van Rossum wrote:
I don't understand. If I'm interpreting PEP 479 correctly, in 'x = yield from foo', a StopIteration raised by foo.__next__() doesn't get turned into a RuntimeError; rather it just stops the sub-iteration as usual and its value attribute gets assigned to x. As long as a Trollius coroutine behaves like something implementing the iterator protocol, it should continue to work fine with Return as a subclass of StopIteration. Or is there something non-obvious about Trollius that I'm missing? -- Greg
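[The behaviour Greg describes can be demonstrated with a non-generator iterator; PEP 479 only changes StopIteration raised inside a generator's own frame, not one coming from the sub-iterator's __next__ during 'yield from'. The class below is an illustrative stand-in:]

```python
class ReturnIter:
    """Iterator (not a generator) whose __next__ raises
    StopIteration(value) immediately."""
    def __init__(self, value):
        self.value = value
    def __iter__(self):
        return self
    def __next__(self):
        raise StopIteration(self.value)

def outer():
    # The sub-iterator's StopIteration simply ends the delegation and
    # supplies the result; no RuntimeError here, even under PEP 479.
    result = yield from ReturnIter(10)
    return result + 1

def run(gen):
    try:
        while True:
            next(gen)
    except StopIteration as exc:
        return exc.value

print(run(outer()))  # 11
```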

2014-11-28 10:12 GMT+01:00 Greg Ewing <greg.ewing@canterbury.ac.nz>:
The Trollius coroutine uses "raise Return(value)", which is basically a "raise StopIteration(value)", and this is forbidden by PEP 479. With PEP 479, the StopIteration is replaced with a RuntimeError.

Victor

On Fri, Nov 28, 2014 at 8:18 PM, Victor Stinner <victor.stinner@gmail.com> wrote:
The question, I guess, is: Why can't it be translated into "return value"? One answer is: Because that's not legal in Python 2.7. And I can't respond to that answer, unfortunately. That's the one major backward compat issue. (Another answer may be "Because it would require changes to many intermediate generators, not all of which are under our control". If that's the issue, then it'll simply be a matter of telling people "When you upgrade to Python 3.6, you will start to see warnings unless you make this change".) ChrisA
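[The 3.x-only translation Chris asks about is just a plain return statement in the generator body:]

```python
def coro():
    if 0:
        yield
    return 42  # legal in Python 3; a SyntaxError in Python 2.7

try:
    next(coro())
except StopIteration as exc:
    print(exc.value)  # 42
```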

participants (6)
- Chris Angelico
- Greg Ewing
- Guido van Rossum
- Nick Coghlan
- Olemis Lang
- Victor Stinner