[Python-ideas] Revised**5 PEP on yield-from
jh at improva.dk
Mon Mar 2 03:11:04 CET 2009
Greg Ewing wrote:
> Jacob Holm wrote:
>> After doing
>> "yield from R" in generator A, you can go ahead and do "yield from R"
>> in generator B as well. If R is also using "yield from" you will have
>> the following situation:
>> R --- (whatever R is waiting for)
> Unless you're doing something extremely unusual, this situation
> wouldn't arise. The yield-from statement is intended to run the
> iterator to exhaustion, and normally you'd create a fresh iterator
> for each yield-from that you want to do. So A and B would really
> be yielding from different iterators, even if they were both
> iterating over the same underlying object.
Unusual, yes. Extremely? I'm not sure. If you/we allow this, someone
will find a use for it.
> If you did try to share iterators between yield-froms like that,
> you would have to arrange things so that at least all but one
> of them broke out of the loop early, otherwise something is
> going to get an exception due to trying to resume an exhausted
> iterator.
Did you read the spelled-out version at the bottom? No need to "break
out" of anything. That happens automatically because of the "yield
from". Just a few well-placed calls to next...
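To make the scenario concrete, here is a sketch in modern Python (where the proposed "yield from" syntax exists); the names A, B and R are taken from the discussion above, not from the PEP itself:

```python
# Two generators delegating to one shared iterator R.
def make_R():
    for i in range(10):
        yield i

R = make_R()

def A():
    yield from R

def B():
    yield from R

a, b = A(), B()
out = [next(a), next(b), next(a), next(b)]
# out == [0, 1, 2, 3]: A and B alternately advance the *same* R,
# and neither has to "break out" of its yield from.
```

Each next() on A or B is simply forwarded to the shared R, so the interleaving works with a few well-placed calls to next and nothing else.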
> But in any case, I think you can still model this as two separate
> stacks, with R appearing in both stacks: [A, R] and [B, R]. Whichever
> one of them finishes yielding from R first pops it from its stack,
> and when the other one tries to resume R it gets an exception. Either
> that or it breaks out of its yield-from early and discards its
> version of R.
I am not worried about R running out; each of A and B would find out
about that the next time they tried to get a value. I *am* worried about R
doing a yield-from to X (the xrange in this example), which then needs to
appear in both stacks to get the expected behavior from the PEP.
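A sketch of that nested case (again in modern Python, with the names from the discussion): R delegates to a single underlying iterator X, and both A and B delegate to the same R, so X effectively has to sit on both delegation stacks at once.

```python
# R delegates to X; A and B both delegate to the shared R.
X = iter(range(10))

def make_R():
    yield from X

R = make_R()

def A():
    yield from R

def B():
    yield from R

a, b = A(), B()
out = [next(a), next(b), next(a)]
# out == [0, 1, 2]: both A and B end up pulling values straight
# from X through the shared R.
```

Any optimization that shortcuts the chain by recording "A delegates directly to X" in a single flat stack has to cope with B reaching X through R as well.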
>> As long as that scenario is possible I can construct
>> an example where treating it as a simple stack will either do the
>> wrong thing, or do the right thing but slower than a standard "for v
>> in it: yield v".
> That depends on what you think the "right thing" is. If you
> think that somehow A needs to notice that B has finished yielding
> from R and gracefully stop doing so itself, then that's not something
> I intended and it's not the way the current prototype implementation
> would behave.
The right thing is whatever the PEP specifies :) You are the author, so
you get to decide...
I am saying that what the PEP currently specifies is not quite so simple
to speed up as you and Arnaud seem to think.
(Even with a simple stack, handling 'close' and 'StopIteration'
correctly is not exactly trivial)
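One illustration of the 'close' subtlety, sketched in modern Python (the setup is my own, not from the PEP): under the PEP's semantics, closing A while it is suspended inside "yield from R" also closes R, even if B is still delegating to it.

```python
def make_R():
    yield from range(5)

R = make_R()

def A():
    yield from R

def B():
    yield from R

a, b = A(), B()
first = next(a)   # 0 -- A advances the shared R
second = next(b)  # 1 -- B advances the same R
a.close()         # closing A propagates to R, since A is suspended
                  # inside "yield from R" -- even though B still uses R
try:
    next(b)
    b_exhausted = False
except StopIteration:
    # B sees StopIteration: R was closed through A, so B's own
    # "yield from R" finishes and B falls off the end.
    b_exhausted = True
```

So an implementation cannot simply pop R off one stack on close; it has to make the other delegating generator observe the resulting StopIteration correctly as well.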
> So IMO you're worrying about a problem that doesn't exist.
No, I am worrying about a problem that so far has only appeared in
contrived examples designed to expose it. Any real-life examples I have
seen of the "yield from" feature would work perfectly well with a simple
stack-based approach. However, I have seen several ideas for speeding
up long chains of "yield from"s beyond the current C implementation, and
most of them fail either by giving wrong results (bad) or by slowing
things down in admittedly unusual cases (not so bad, but not good).
Anyway... it is 3 in the morning. As I told Arnaud, I will try to find
some time this week to write some more of these crazy examples.