[Python-ideas] PEP on yield-from: throw example
dangyogi at gmail.com
Wed Feb 18 18:38:21 CET 2009
Greg Ewing wrote:
> George Sakkis wrote:
>> For throw() however, I strongly disagree that
>> a raise statement in a loop should implicitly call generator.throw(),
>> regardless of what "for" syntax is used.
> Just in case it's not clear, the behaviour being suggested
> here is *not* part of my proposal. As far as yield-from is
> concerned, propagation of exceptions into the subgenerator
> would only occur when throw() was called on the generator
> containing the yield-from, and then only when it's suspended
> in the midst of it. Raise statements within the delegating
> generator have nothing to do with the matter and aren't
> affected at all.
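(For concreteness, here is a minimal sketch of the behaviour Greg
describes, written with the proposed "yield from" syntax, as it later
landed in Python 3.3. The point is that throw() on the delegating
generator reaches the subgenerator only while the delegator is
suspended inside the yield-from expression.)

```python
def subgen():
    try:
        yield 1
        yield 2
    except ValueError:
        yield "subgen caught it"

def delegator():
    yield "before"          # plain yield: subgen not involved yet
    yield from subgen()     # a throw() arriving here goes to subgen
    yield "after"

g = delegator()
print(next(g))              # "before" -- suspended at the plain yield
print(next(g))              # 1 -- now suspended inside subgen
print(g.throw(ValueError))  # "subgen caught it" -- handled by subgen
```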
> Having some examples to look at is a good idea, but
> Bruce seems to be going off on a tangent and making some
> proposals of his own for enhancing the for-loop. I fear
> that this will only confuse the discussion further.
> Perhaps I should also point out that yield-from is *not*
> intended to help things like itertools.chain manage the
> cleanup of its generators, so examples involving things
> with chain-like behaviour are probably not going to help
> clarify what it *is* intended for.
> It would be nice to have a language feature to help with
> things like that, but I have no idea at the moment what
> such a thing would be like.
I apologize for any side-tracking of the yield-from discussion. People
were asking for real-world examples, I've done a lot with generators,
and I didn't see many others offering examples, so I thought I could
offer some. But my code obviously doesn't use yield from, so I'm
looking at uses of the for statement and itertools.chain, the two
constructs that yield from would replace. My thinking, on the one
hand, is that examples where for or chain ought to forward
send/throw/close should transfer to yield from; on the other hand,
the same arguments then apply to for/chain themselves.
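(To make the forwarding point concrete, here is my own small sketch of
what a plain "for" loop does today: values thrown into the outer
generator stop at the loop's yield and never reach the inner one,
which is exactly the gap yield from is meant to close.)

```python
def inner():
    yield 1
    yield 2

def outer_for():
    for x in inner():   # re-yields values, but send/throw stop here
        yield x

g = outer_for()
print(next(g))          # 1
try:
    g.throw(KeyError)   # raised at the "yield x" in outer_for;
except KeyError:        # inner() never sees the exception
    print("KeyError escaped outer_for")
```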
OTOH, the desire to use "yield from" for a "poor man's" cooperative
threading facility also brings me to think that generators have 3 fatal
design flaws that will prevent them from growing into something much
more useful (like threading):
1. The double use of send/throw and the yield expression for
simultaneous input and output to/from the generator; rather than
separating input and output as two different constructs. Sending
one value in does not always correspond to getting one value out.
2. The absence of an object (even an implicit one like sys.stdin and
sys.stdout are for input and print) representing the target of the
yield/throw/send that can be passed on to other functions,
allowing them to contribute to the generator's output stream in a
much more natural way.
* I'm thinking here of a pair of cooperating pipe objects,
read and write, and a pair of built-in functions, something
like input and print that get and send an object to implicit
pipein and pipeout objects (one for each "thread"). These
would replace send and yield.
* But I think that the iterator interface is very successful,
should be kept intact, and is what the read pipe object
should look like.
3. The double use of yield to indicate rendezvoused output to the
parent "thread", as well as to flag its containing function as one
that always starts a new "thread" when executed.
* This prevents us from having generator A simply call
generator B to have B yield objects for A. In other words,
calling B as a normal function that doesn't start another
thread would mean that B yields to the current thread's
pipeout, whereas starting B in a new thread with its own
pipeout would do what current generators do. Thus generator
A would have the option to run B in two ways, as a new
generator thread to yield values back to A, or within A's
thread as a normal function to yield values to the same
place that A yields values to.
* I'm thinking that there would be a builtin generate function
(or some special syntax) used to run a function in a new
thread. Thus generate(gen_b, arg1, arg2, ...) would return
a read pipe (which is an iterable) connected to the write
pipe for the new thread:
              for x in generate(gen_b, arg1, arg2, ...):
          or, with some special syntax, something like:
              for x in gen_b(arg1, arg2, ...)&:
          Either of these is different from a plain call, which
          would run gen_b in the current thread:
              gen_b(arg1, arg2, ...)
This would accomplish what yield from is trying to do in a more
flexible and readable way.
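(The generate() idea above can be approximated today with a real
thread and a queue standing in for the write/read pipe pair. This is
only a rough sketch under my own assumptions: generate(), the pipe
objects, and this version of gen_b are hypothetical, not an existing
API.)

```python
import queue
import threading

_DONE = object()  # sentinel marking the end of the stream

def generate(func, *args):
    """Run func(write, *args) in a new thread; return an iterator
    (the 'read pipe') over whatever func writes to its pipe."""
    q = queue.Queue()

    def run():
        try:
            func(q.put, *args)   # func "yields" by calling its pipeout
        finally:
            q.put(_DONE)

    threading.Thread(target=run, daemon=True).start()
    return iter(q.get, _DONE)    # read pipe: iterate until sentinel

# gen_b written as an ordinary function taking an explicit output pipe:
def gen_b(write, n):
    for i in range(n):
        write(i * i)

print(list(generate(gen_b, 4)))  # [0, 1, 4, 9]
```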
So the question in my mind is: do we move towards adopting some new kind
of generator/threading capability (and eventually deprecating current
generators) that doesn't have these limitations, or do we stick with
what we have?
If we want to stick with the current generators, then I'm in favor of
the proposed "yield from" (with the possible exception of the new "return").
But even if we want to move towards a new-style generator capability,
"yield from" could be fielded much more quickly than a whole new-style
generator capability, so ???
If people are interested in discussing this further, I'm open to that.
Otherwise, sorry for the side-tracking...