[Python-ideas] Proto-PEP on a 'yield from' statement
Bruce Frederiksen
dangyogi at gmail.com
Fri Feb 13 22:08:24 CET 2009
Bruce Leban wrote:
> I didn't follow all the variations on the for loop, but regarding
> send, it seems to me that a natural case is this:
>
> for x in foo:
>     bar = process(x)
>     foo.send(bar)
>
> which sends the value bar to the generator and the value that comes
> back is used in the next iteration of the loop. I know that what I
> wrote doesn't do that so what I really mean is something like this but
> easier to write:
>
> try:
>     x = foo.next()
>     while True:
>         bar = process(x)
>         x = foo.send(bar)
> except StopIteration:
>     pass
>
> and the syntax that occurs to me is:
>
> for x in foo:
>     bar = process(x)
>     continue bar
In thinking more about this, here's how it's shaping up.
The current generators become "old-style generators" and a "new-style
generator" is added. The new-style generator is the same as the
old-style generators w.r.t. next/throw/close and yield *statements*.
But new-style generators separate the operation of getting values into
the generator from that of getting values out of the generator. Thus,
the yield *expression* is not allowed in new-style generators and is
replaced by some kind of marker (a reserved word, a special identifier,
the return value of a builtin function, ??) that is used to receive
values into the generator. I'll simply call this receive here for now.
The presence of receive is what marks the generator as a new-style
generator. Receive looks like an iterator and can be used and passed
around as an iterator to anything requiring an iterator within the
generator. It does not return values to the caller like yield
expressions do.
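Since receive doesn't exist yet, the nearest runnable approximation in
current Python is to pass the generator's input iterator in explicitly
and loop over it; the names doubled and values below are purely
illustrative, not part of the proposal:

```python
# Hypothetical sketch: the proposed "receive" acts, inside the
# generator, as an iterator over the values sent in. Here that role
# is played by an ordinary iterator argument.
def doubled(receive):
    # 'receive' stands in for the in-generator input iterator.
    for value in receive:
        yield value * 2

values = iter([1, 2, 3])
print(list(doubled(values)))   # prints [2, 4, 6]
```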
Commensurate with receive, the send method is changed in new-style
generators. It still provides a value to the generator, but no longer
returns anything. This covers the use case where you want to interact
with the generator, as you've indicated above. Thus, a new-style
generator would work just like your first example, which is more
intuitive than the current definition of send, which passes a value in
both directions. So there would be no need to change the continue
statement.
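One way to see the proposed one-way send is to wrap an old-style
generator so that send() only queues a value and returns nothing, while
results come out solely through iteration; the NewStyle wrapper and
totals generator below are hypothetical illustrations, not proposed
names:

```python
# Sketch of the proposed one-way send, approximated by wrapping an
# old-style generator: send() feeds a value in and returns None, and
# the next value comes out through the iterator protocol, so Leban's
# plain for-loop-plus-send pattern reads naturally.
class NewStyle:
    def __init__(self, gen):
        self.gen = gen
        self.pending = None       # value queued by send()

    def __iter__(self):
        return self

    def __next__(self):
        if self.pending is not None:
            value, self.pending = self.pending, None
            return self.gen.send(value)
        return next(self.gen)

    def send(self, value):
        self.pending = value      # one-way: nothing is returned

def totals():
    total = 0
    while True:
        total += (yield total)

foo = NewStyle(totals())
results = []
for x in foo:
    results.append(x)
    if x >= 6:
        break
    foo.send(x + 1)               # feed the processed value back in
print(results)                    # prints [0, 1, 3, 7]
```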
>
> As to chaining generators, I don't see that as a for loop-specific
> feature. If that's useful in a for, then it's useful outside and
> should stand on its own.
Agreed.
Also, a new method, called using, is added to new-style generators to
provide an iterator to be used as its receive object. This takes an
iterator, attaches it to the generator, and returns the generator, so
that you can write for i in gen(x).using(iterable). This covers the use
case where you have all of the input values ahead of time. And then, as
a little extra syntactic sugar, the | operator would be overloaded on
generators and iterators to call this using method:
class iterator:
    ...
    def __or__(self, gen_b):
        return gen_b.using(self)
Thus, when chaining generators together, you can use either:
for gen1(x) | gen2(y) | gen3(z) as i:
or
for gen3(z).using(gen2(y).using(gen1(x))) as i:
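The | chaining can be approximated today by a small wrapper whose __or__
plays the role of gen_b.using(self); the Pipe class and the add/times
stages below are hypothetical stand-ins for the proposed machinery:

```python
# Runnable approximation of the proposed using()/| chaining: each
# stage takes an iterable of inputs (the role receive would play) and
# returns an iterable of outputs.
class Pipe:
    """Wraps an iterable so stages can be chained with |."""
    def __init__(self, iterable):
        self.iterable = iterable

    def __iter__(self):
        return iter(self.iterable)

    def __or__(self, stage):
        # stage(iterable) here mimics gen_b.using(self) in the proposal.
        return Pipe(stage(self.iterable))

def add(n):
    def stage(items):
        return (x + n for x in items)
    return stage

def times(n):
    def stage(items):
        return (x * n for x in items)
    return stage

result = list(Pipe([1, 2, 3]) | add(1) | times(10))
print(result)    # prints [20, 30, 40]
```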
This also introduces a "new-style" for statement that properly honors
the generator interface (calls close and throw like you'd expect) vs the
"old-style" for statement that doesn't. The reason for the different
syntax is that there may be code out there that uses a generator in a
for statement with a break in it and then wants to continue with the
generator in a subsequent for statement:
g = gen(x)
for i in g:       # doesn't close g
    ...
    if cond:
        break
for i in g:       # process the rest of g's elements
    ...
This could be done with the new-style for statement as:
g = gen(x)
for somelib.notclosing(g) as i:
    ...
    if cond:
        break
for g as i:
    ...
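somelib.notclosing is hypothetical (today's for statement already leaves
g open after a break), but a wrapper like the following sketches what it
would have to do under a close-calling new-style for: expose the
iterator protocol while hiding the generator's close method. The
notclosing and counter names are illustrative only:

```python
def notclosing(gen):
    """Iterate gen without exposing close(), so a close-calling
    for statement could not shut the generator down on exit."""
    class NotClosing:
        def __iter__(self):
            return self
        def __next__(self):
            return next(gen)
        # deliberately no close(): nothing for the for statement to call
    return NotClosing()

def counter():
    yield from range(5)

g = counter()
first = []
for i in notclosing(g):
    first.append(i)
    if i == 1:
        break                  # g stays open
rest = list(g)                 # picks up where the first loop stopped
print(first, rest)             # prints [0, 1] [2, 3, 4]
```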
Comments? Should this be part of the yield from PEP?
-bruce frederiksen