[Python-ideas] yield-from and @coroutine decorator [was:x=(yield from) confusion]
Nick Coghlan
ncoghlan at gmail.com
Fri Apr 3 18:08:13 CEST 2009
Jacob Holm wrote:
>> - the particular colour my bikeshed is painted when it comes to
>> returning values from a generator is "return finally" (the idea being to
>> emphasise that this represents a special "final" value for the generator
>> that happens only after all of the normal yields are done).
>>
>
> We should probably drop that particular bikeshed discussion until we
> actually know the details of what the construct should do, esp in the
> context of close(). I am starting to lose track of all the different
> possible versions.
Note that the syntax for returning values from generators is largely
independent of the semantics. Guido has pointed out that disallowing the
naive "return EXPR" in generators is an important learning tool for
inexperienced generator users, and I think he's right.
"return finally" reads pretty well and doesn't add a new keyword, while
still allowing generator return values to be written easily. I haven't
seen other suggestions I particularly like, so I figured I'd run with
that one for the revised example :)
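For concreteness, here's a rough sketch of how that could be emulated
today using the averager from the thread. GeneratorReturn is the
*proposed* exception (it doesn't exist yet), and under the proposed
spelling the explicit raise at the end would just be written
"return finally total / count":

    class GeneratorReturn(Exception):
        # Emulation of the proposed exception carrying the return value
        def __init__(self, value):
            self.value = value

    def averager():
        total, count = 0.0, 0
        while True:
            term = yield
            if term is None:      # sentinel value ends the stream
                break
            total += term
            count += 1
        # Proposed spelling: return finally total / count
        raise GeneratorReturn(total / count)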
>> - rather than trying to change the meaning of GeneratorExit and close(),
>> 3 new generator methods would be added: next_return(), send_return() and
>> throw_return(). The new methods have the same signatures as their
>> existing counterparts, but if the generator raises GeneratorReturn, they
>> trap it and return the associated value instead. Like close(), they
>> complain with a RuntimeError if the generator doesn't finish. For
>> example:
>>
>>     def throw_return(self, *exc_info):
>>         try:
>>             self.throw(*exc_info)
>>             raise RuntimeError("Generator did not terminate")
>>         except GeneratorReturn as gr:
>>             return gr.value
>>
>
> I don't much like the idea of adding these methods, but that is not the
> point of this mail.
They don't have to be generator methods - they could easily be functions
in a coroutine module. However, I definitely prefer the idea of new
methods or functions that support a variety of interaction styles over
trying to redefine generator finalisation tools (i.e. GeneratorExit and
close()) to cover this completely different use case. Why create a
potential backwards compatibility problem for ourselves when there are
equally clean alternative solutions?
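To illustrate the "functions in a coroutine module" variant, the three
helpers might look roughly like this (again assuming the emulated
GeneratorReturn above; the names and signatures are just the ones
floated in this thread, not an existing API):

    class GeneratorReturn(Exception):   # as in the earlier sketch
        def __init__(self, value):
            self.value = value

    def next_return(gen):
        # Advance gen, expecting it to finish with a return value
        try:
            next(gen)
            raise RuntimeError("Generator did not terminate")
        except GeneratorReturn as gr:
            return gr.value

    def send_return(gen, value):
        # Send value into gen, expecting it to finish with a return value
        try:
            gen.send(value)
            raise RuntimeError("Generator did not terminate")
        except GeneratorReturn as gr:
            return gr.value

    def throw_return(gen, *exc_info):
        # Throw an exception into gen, expecting it to finish with a return value
        try:
            gen.throw(*exc_info)
            raise RuntimeError("Generator did not terminate")
        except GeneratorReturn as gr:
            return gr.value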
I also don't like the idea of imposing a specific coroutine return idiom
in the PEP - better to have a system that supports both sentinel values
(via next_return() and send_return()) and sentinel exceptions (via
throw_return()).
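As a usage illustration (building on the averager and helper sketches
above; EndOfInput is made up purely for this example), the two
interaction styles would look something like:

    class EndOfInput(Exception):
        pass   # made-up sentinel exception for illustration

    def averager_by_exception():
        # Finishes when the caller throws EndOfInput into it
        total, count = 0.0, 0
        while True:
            try:
                term = yield
            except EndOfInput:
                break
            total += term
            count += 1
        raise GeneratorReturn(total / count)

    # Sentinel value style, using averager() from the first sketch:
    avg = averager()
    next(avg)                             # prime the coroutine
    avg.send(10); avg.send(20)
    print(send_return(avg, None))         # -> 15.0

    # Sentinel exception style:
    avg = averager_by_exception()
    next(avg)
    avg.send(10); avg.send(20)
    print(throw_return(avg, EndOfInput))  # -> 15.0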
> Now for my problem. The original averager example was inspired by the
> tutorial http://dabeaz.com/coroutines/ that Guido pointed to. (Great
> stuff, btw). One pattern that is recommended by the tutorial and used
> throughout is to decorate all coroutines with a decorator like:
>
> def coroutine(func):
>     def start(*args,**kwargs):
>         cr = func(*args,**kwargs)
>         cr.next()
>         return cr
>     return start
>
>
> The idea is that it saves you from the initial next() call used to start
> the coroutine. The problem is that you cannot use such a decorated
> coroutine in any flavor of the yield-from expression we have considered
> so far, because the yield-from will start out by doing an *additional*
> next call and yield that value.
>
> I have a few vague ideas of how we might change "yield from" to support
> this, but nothing concrete enough to put here. Is this a problem we
> should try to fix, and if so, how?
Hmm, that's a tricky one. It sounds like it is definitely an issue the
PEP needs to discuss, but I don't currently have an opinion as to what
it should say.
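To make the clash concrete, here's a rough illustration (using the
builtin next() rather than the tutorial's cr.next(), and a hand-written
delegation loop as a crude stand-in for the proposed yield-from
expansion, ignoring return values and throw()/close() forwarding):

    def coroutine(func):
        # The decorator from the tutorial: primes the coroutine on creation
        def start(*args, **kwargs):
            cr = func(*args, **kwargs)
            next(cr)
            return cr
        return start

    @coroutine
    def printer():
        while True:
            item = yield
            print("got", item)

    def delegate(cr):
        # Crude stand-in for "yield from cr".  The initial next() here is
        # the problem: the decorator has already advanced cr to its first
        # yield, so this extra advance feeds cr a spurious None.
        value = next(cr)
        while True:
            sent = yield value
            value = cr.send(sent)

    d = delegate(printer())
    next(d)           # printer() prints "got None" here, before any real data
    d.send("hello")   # prints "got hello"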
> not-trying-to-be-difficult-ly yours
We have a long way to go before we even come close to consuming as many
pixels as PEP 308 or PEP 343 - a fact for which Greg is probably grateful ;)
Cheers,
Nick.
--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia