Using yield inside a comprehension.

Where do I find the PEP that describes that the following statement assigns a generator object to `values`?

    values = [ (yield x) for x in range(10) ]

I assume it's equivalent to the following:

    values = (x for x in range(10))

The reason for asking this is that I see no point in using the first syntax, and that the first has another meaning in Python 2.7 if used inside a function.

On 26 November 2013 13:54, Jonathan Slenders <jonathan@slenders.be> wrote:
Where do I find the PEP that describes that the following statement assigns a generator object to `values`?
I don't think there was a PEP for this but it's a consequence of the change to binding in list comprehensions introduced in Python 3.x which is mentioned here: http://python-history.blogspot.co.uk/2010/06/from-list-comprehensions-to-gen... Essentially this:
values = [ (yield x) for x in range(10) ]
Translates to the following in Python 2.x:

    _tmp = []
    for x in range(10):
        _tmp.append((yield x))
    values = _tmp

However in 3.x it translates to something like:

    def _tmpfunc():
        _tmp = []
        for x in range(10):
            _tmp.append((yield x))
        return _tmp
    values = _tmpfunc()

This change was made to prevent the comprehension variable from leaking to the enclosing scope, but as you say, if the code is nested in a function then it affects which function contains the yield statement, and the presence of a yield statement radically alters the behaviour of a function. So in 2.x the enclosing function must become a generator function, whereas in 3.x the function that implements the list comprehension is changed into a generator function instead.
I assume it's equivalent to the following: values = (x for x in range(10))
It will yield the same values, but it will also build a list of Nones and attach it to the StopIteration raised when it finishes.
This has been mentioned before and AFAICT it is an unintended artefact of the new definitions for comprehensions and yield and the fact that 'return _tmp' works inside a generator function. I haven't seen a use for this behaviour yet. Oscar
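The translation Oscar describes can be exercised directly with an explicit generator function (a sketch; the comprehension spelling itself was later made a SyntaxError in Python 3.8, but the translated form still runs):

```python
# Hand-written version of the hidden function the 3.x compiler creates
# for [ (yield x) for x in range(10) ].
def _tmpfunc():
    _tmp = []
    for x in range(10):
        _tmp.append((yield x))
    return _tmp

gen = _tmpfunc()
yielded = []
try:
    while True:
        # send(None) is equivalent to next(); each (yield x) evaluates
        # to the None we send, which gets appended to the hidden list
        yielded.append(gen.send(None))
except StopIteration as exc:
    result = exc.value  # the "list comprehension" result

print(yielded)  # [0, 1, 2, ..., 9]
print(result)   # ten Nones -- the list attached to StopIteration
```

This makes the oddity concrete: the values you iterate over are the yielded ones, while the "comprehension result" only escapes via `StopIteration.value`.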

On Nov 26, 2013, at 6:32, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
Unless you call .send on it, in which case it'll build a list of the values you send it and attach _that_ to StopIteration, of course. So I suppose you could use it as a coroutine version of the list function. Except that the number of values it takes is specified on the wrong end.
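Andrew's point can be demonstrated on the same hand-translated generator (a sketch using `_tmpfunc` from the earlier translation, with `range(3)` for brevity):

```python
def _tmpfunc():
    _tmp = []
    for x in range(3):
        _tmp.append((yield x))
    return _tmp

gen = _tmpfunc()
next(gen)            # advance to the first yield (yields 0)
collected = None
try:
    gen.send('a')    # becomes the value of (yield 0); yields 1
    gen.send('b')    # becomes the value of (yield 1); yields 2
    gen.send('c')    # becomes the value of (yield 2); loop ends
except StopIteration as exc:
    collected = exc.value

print(collected)     # ['a', 'b', 'c']
```

The sent values come back as the list on StopIteration, but as Andrew notes, the count (here 3) is fixed by the producer rather than the consumer.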

Thanks Oscar, for the extensive explanation. The translation process is clear to me. But I'm not convinced that yield or yield from should be bound to this invisible function instead of the parent.

Now that asyncio/tulip is coming, I see a use case. Suppose that you have a list of futures and you want to process them. The following works, using the "gather" function:

    @coroutine
    def process(futures, func):
        values = yield from gather(futures)
        return [ func(v) for v in values ]

What I tried to do, with my knowledge of Python 2 generators, is the following, but that doesn't work:

    @coroutine
    def process(futures, func):
        return [ func((yield from f)) for f in futures ]

They are also not equivalent. Using gather(), we have to wait until all the futures are ready, while in the second example we could start creating the list on the fly. (Note that in this case "yield" instead of "yield from" would also work.)

Further, there is a really weird construct possible:

    list((yield x) for x in range(10))
    [0, None, 1, None, 2, None, 3, None, 4, None, 5, None, 6, None, 7, None, 8, None, 9, None]

It's a logical consequence of the translation, but I don't get the point. Can't we create a local namespace without wrapping it in a function?
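The thread predates the async/await syntax; as a rough modern translation of the working gather() version (the helper `make_value` and the doubling `func` are illustrative stand-ins for real futures and processing):

```python
import asyncio

async def make_value(v):
    # Stand-in for a real future/coroutine that eventually produces v.
    await asyncio.sleep(0)
    return v

async def process(futures, func):
    # Wait for all futures, then build the result list -- the pattern
    # that works, as opposed to yield-from inside a comprehension.
    values = await asyncio.gather(*futures)
    return [func(v) for v in values]

async def main():
    futures = [make_value(i) for i in range(5)]
    return await process(futures, lambda v: v * 2)

result = asyncio.run(main())
print(result)  # [0, 2, 4, 6, 8]
```

As in the original example, all futures complete before any processing starts; building the list incrementally would need an explicit loop instead.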

On Nov 26, 2013, at 10:22, Jonathan Slenders <jonathan@slenders.be> wrote:
You can see an early side effect in "func" here, which could also trigger exceptions early.
Well, yes, you can see a side effect of calling func in func. But not a side effect of building the list. And, more importantly, you don't (or shouldn't) use list comprehensions for expressions that you're evaluating for their side effects. That's what for statements are for.

On 27 Nov 2013 01:35, "Jonathan Slenders" <jonathan@slenders.be> wrote:
Can't we create a local namespace, without wrapping it in a function?
That was the original implementation I tried, and it turned out to be inordinately difficult to get the semantics right for lambda expressions that referenced iteration variables from inside the comprehension. There are also some ugly edge cases involving the locals() builtin that would need to have their semantics defined.

Switching to a full lexical scope instead turned out to be much easier to implement while still providing closure semantics that matched those of generator expressions, so that's what I ended up implementing. This approach also resolved the "How does locals() work in a 3.x comprehension?" question in favour of making it work the same way it does in a generator expression.

As others have noted, this approach of using an implicit function definition also results in the behaviour of yield expressions inside comprehensions being entirely consistent with their behaviour inside def statements and lambda expressions - it turns an ordinary function into a generator function.

Is this particularly useful? Not that I've seen (while you can do some kinda neat one-liner hacks with it, they're basically incomprehensible to the reader). It's just a natural consequence of making comprehension semantics more consistent with those of generator expressions (which were already consistent with nested def statements and lambda expressions), and there's no compelling justification to disallow it.

Regarding the PEP question, there's no dedicated PEP for the change, just a line item in PEP 3100 to make comprehensions more like generator expressions by hiding the iteration variable. There's probably a thread or two on the old python-3000 list about using a full function scope to do it, though (I seem to recall posting about it after my original pseudo-scope based approach failed to handle closures properly).

Cheers,
Nick.
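The scoping behaviour Nick describes is easy to verify in any Python 3 (a minimal sketch; `make_funcs` is an illustrative name):

```python
# The iteration variable lives in the comprehension's hidden function
# scope, so it does not leak into the enclosing namespace.
x = 'outer'
squares = [x * x for x in range(5)]
assert x == 'outer'  # unchanged in 3.x (it would be 4 in Python 2)

def make_funcs():
    # Lambdas closing over the iteration variable -- the tricky case
    # the pseudo-scope approach couldn't handle. With a real function
    # scope, each lambda closes over the same variable and sees its
    # final value.
    return [lambda: i for i in range(3)]

print([f() for f in make_funcs()])  # [2, 2, 2]
```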
participants (4)

- Andrew Barnert
- Jonathan Slenders
- Nick Coghlan
- Oscar Benjamin