[Python-ideas] Revised**5 PEP on yield-from

Jim Jewett jimjjewett at gmail.com
Mon Mar 2 04:34:48 CET 2009

On 3/1/09, Jacob Holm <jh at improva.dk> wrote:
> Greg Ewing wrote:
>> Jacob Holm wrote:

[naming that for which R waits]
>>> A
>>>  \
>>>    R --- X (=whatever R is waiting for)
>>>  /
>>> B

>> But in any case, I think you can still model this as two separate
>> stacks, with R appearing in both stacks: [A, R] and [B, R].

[A, R, X] and [B, R, X]

>> Whichever
>> one of them finishes yielding from R first pops it from its stack,
>> and when the other one tries to resume R it gets an exception. Either
>> that or it breaks out of its yield-from early and discards its
>> version of R.
> I am not worried about R running out, each of A and B would find out
> about that next time they tried to get a value. I *am* worried about R
> doing a yield-from to X (the xrange in this example) which then needs to
> appear in both stacks to get the expected behavior from the PEP.

I would assume that the second one tries to resume, gets the
StopIteration from X, retreats to R, gets the StopIteration there as
well, and then continues after its yield-from.
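That is in fact how `yield from` ended up behaving once it was implemented (PEP 380, Python 3.3). A minimal runnable sketch of the A/B/R/X scenario from the diagram above, with the two consumers alternating:

```python
def shared_delegation_demo():
    X = iter(range(3))            # X: the iterator R waits on

    def R():                      # R delegates to X
        yield from X

    shared_r = R()                # a single R instance, shared by A and B

    def A():
        yield from shared_r
        yield "A done"

    def B():
        yield from shared_r
        yield "B done"

    a, b = A(), B()
    out = [next(a), next(b), next(a)]   # A and B alternate pulling: 0, 1, 2
    # X is now exhausted: B's next resume sees StopIteration from X,
    # R's frame finishes, and B continues past its yield-from.
    out.append(next(b))
    # A's resume finds R already finished and likewise continues.
    out.append(next(a))
    return out

print(shared_delegation_demo())   # [0, 1, 2, 'B done', 'A done']
```

The names A/B/R/X here are just the ones from the thread; the point is that each consumer independently falls off the end of its own yield-from once the shared R is exhausted, without either stack needing to know about the other.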

If that didn't happen, I would wonder whether (theoretical speed)
optimization was leading to suboptimal semantics.

>>> As long as that scenario is possible I can construct
>>> an example where treating it as a simple stack will either do the
>>> wrong thing, or do the right thing but slower than a standard "for v
>>> in it: yield v".

I have to wonder whether any optimization will be a mistake.  At the
moment, I can't think of any way to do it without adding at least an
extra pointer and an extra if-test.  That isn't much, but ... how
often will there be long chains, vs how often are generators used
without getting any benefit from this sort of delegation?
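For reference, the unoptimized baseline the thread keeps comparing against is just plain re-yielding; anything yield-from adds (send/throw forwarding, a return value) sits on top of this:

```python
def delegate(it):
    # Plain re-yielding: the "for v in it: yield v" baseline.
    # yield-from adds send()/throw() forwarding and a return value
    # on top of this, which is where the speed question lies.
    for v in it:
        yield v

print(list(delegate(range(3))))   # [0, 1, 2]
```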
