[Python-ideas] yield-from and @coroutine decorator [was:x=(yield from) confusion]

Jacob Holm jh at improva.dk
Fri Apr 3 17:15:16 CEST 2009

Hi Nick,

Your reworking of my "averager" example has highlighted another issue 
for me, which I will get to below. First a few comments on your message.

Nick Coghlan wrote:
> [snip]
> The following reworking of Jacob's example assumes a couple of things
> that differ from the current PEP:
> - the particular colour my bikeshed is painted when it comes to
> returning values from a generator is "return finally" (the idea being to
> emphasise that this represents a special "final" value for the generator
> that happens only after all of the normal yields are done).

We should probably drop that particular bikeshed discussion until we 
actually know the details of what the construct should do, especially 
in the context of close(). I am starting to lose track of all the 
different possible versions.

> - rather than trying to change the meaning of GeneratorExit and close(),
> 3 new generator methods would be added: next_return(), send_return() and
> throw_return(). The new methods have the same signatures as their
> existing counterparts, but if the generator raises GeneratorReturn, they
> trap it and return the associated value instead. Like close(), they
> complain with a RuntimeError if the generator doesn't finish. For example:
>   def throw_return(self, *exc_info):
>     try:
>       self.throw(*exc_info)
>       raise RuntimeError("Generator did not terminate")
>     except GeneratorReturn as gr:
>       return gr.value

I don't much like the idea of adding these methods, but that is not the 
point of this mail.
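For reference, the proposed throw_return() can be emulated today as a
free-standing helper function. This is only a sketch: it assumes a
modern Python (3.3+) where a generator's return value travels on
StopIteration, standing in for the proposed GeneratorReturn exception;
the `worker` generator is just an illustration of mine.

```python
def throw_return(gen, exc_type):
    # Sketch of the proposed throw_return(), written as a helper
    # rather than a generator method, and using StopIteration in
    # place of the proposed GeneratorReturn (requires Python 3.3+,
    # where generators may return a value).
    try:
        gen.throw(exc_type)
    except StopIteration as stop:
        return stop.value
    # The generator yielded again instead of finishing.
    raise RuntimeError("Generator did not terminate")

def worker():
    # Toy generator: runs until told to stop, then returns a value.
    try:
        yield
    except KeyError:
        return 42

g = worker()
next(g)                            # start the generator
print(throw_return(g, KeyError))   # prints 42
```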

> (Note that I've also removed the 'yield raise' idea from the example -
> if next() or send() triggers termination of the generator with an
> exception other than StopIteration, then that exception is already
> propagated into the calling scope by the existing generator machinery. I
> realise Jacob was trying to make it possible to "yield an exception"
> without terminating the coroutine, but that idea is well beyond the
> scope of the PEP)

I think it was pretty clearly marked as out of scope for this PEP, but I 
still like the idea.

> You then get:
>   class CalcAverage(Exception): pass
>   def averager(start=0):
>     # averager that maintains a running average
>     # and returns the final average when done
>     count = 0
>     exc = None
>     sum = start
>     while 1:
>       avg = sum / count
>       try:
>         val = yield avg
>       except CalcAverage:
>         return finally avg
>       sum += val
>       count += 1
>   avg = averager()
>   avg.next() # start coroutine
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   print avg.throw_return(CalcAverage) # prints 1.5

This version has a bug: it will raise ZeroDivisionError on the initial 
next() call used to start the generator. A better version, if you 
insist on yielding the running average, would be:

def averager(start=0):
    # averager that maintains a running average
    # and returns the final average when done
    count = 0
    sum = start
    avg = None
    while 1:
        try:
            val = yield avg
        except CalcAverage:
            return finally avg
        sum += val
        count += 1
        avg = sum/count
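To make this concrete, here is the same averager written so it actually
runs on a modern Python (3.3+), where a plain ``return avg`` plays the
role of the proposed ``return finally avg`` and the value arrives on
StopIteration; the driving code at the bottom mirrors Nick's session:

```python
class CalcAverage(Exception):
    pass

def averager(start=0):
    # Running-average coroutine; yields the current average and
    # returns the final average when thrown CalcAverage.
    count = 0
    total = start
    avg = None
    while True:
        try:
            val = yield avg
        except CalcAverage:
            return avg          # stands in for "return finally avg"
        total += val
        count += 1
        avg = total / count

avg = averager()
next(avg)                # start coroutine; yields None, not an error
avg.send(1.0)            # yields 1.0
avg.send(2.0)            # yields 1.5
try:
    avg.throw(CalcAverage)
except StopIteration as stop:
    print(stop.value)    # prints 1.5
```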

> Now, suppose I want to write another toy coroutine that calculates the
> averages of two sequences and then returns the difference:
>   def average_diff(start=0):
>     avg1 = yield from averager(start)
>     avg2 = yield from averager(start)
>     return finally avg2 - avg1
>   diff = average_diff()
>   diff.next() # start coroutine
>               # yields 0.0
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   diff.throw(CalcAverage) # Starts calculation of second average
>                           # yields 0.0
>   diff.send(2.0) # yields 2.0
>   diff.send(3.0) # yields 2.5
>   print diff.throw_return(CalcAverage) # Prints 1.0 (from "2.5 - 1.5")
(There is another minor bug here: the two avg.send() calls should have 
been diff.send().)
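Again for concreteness, the delegating example can be run today on
Python 3.3+ if ``return finally`` is replaced by a plain ``return`` and
the final value is read off StopIteration (a translation of mine, not
the PEP's spelling):

```python
class CalcAverage(Exception):
    pass

def averager(start=0):
    # Same running-average coroutine as before (Python 3.3+ spelling).
    count = 0
    total = start
    avg = None
    while True:
        try:
            val = yield avg
        except CalcAverage:
            return avg
        total += val
        count += 1
        avg = total / count

def average_diff(start=0):
    # Averages two sequences in turn, returns the difference.
    avg1 = yield from averager(start)
    avg2 = yield from averager(start)
    return avg2 - avg1

diff = average_diff()
next(diff)                # start coroutine; first averager yields None
diff.send(1.0)            # yields 1.0
diff.send(2.0)            # yields 1.5
diff.throw(CalcAverage)   # finishes first averager, starts second
diff.send(2.0)            # yields 2.0
diff.send(3.0)            # yields 2.5
try:
    diff.throw(CalcAverage)
except StopIteration as stop:
    print(stop.value)     # prints 1.0 (from 2.5 - 1.5)
```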

Now for my problem. The original averager example was inspired by the 
tutorial http://dabeaz.com/coroutines/ that Guido pointed to. (Great 
stuff, btw). One pattern that is recommended by the tutorial and used 
throughout is to decorate all coroutines with a decorator like:

def coroutine(func):
    def start(*args,**kwargs):
        cr = func(*args,**kwargs)
        cr.next()
        return cr
    return start

The idea is that it saves you from the initial next() call used to start 
the coroutine. The problem is that you cannot use such a decorated 
coroutine in any flavor of the yield-from expression we have considered 
so far, because the yield-from will start out by doing an *additional* 
next call and yield that value.
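The clash is easy to demonstrate on Python 3.3+ with yield from (the
`ticker` and `delegate` names are just mine for illustration): the
decorator consumes the first yielded value when it primes the
coroutine, and yield-from's own initial next() then skips past it.

```python
def coroutine(func):
    # Priming decorator in the style of the tutorial (Python 3 spelling).
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)        # advance to the first yield; its value is lost
        return cr
    return start

@coroutine
def ticker():
    n = 0
    while True:
        yield n
        n += 1

def delegate():
    # yield from performs its own initial next() on the subgenerator,
    # so the 0 already consumed by the decorator never reaches us.
    yield from ticker()

d = delegate()
print(next(d))  # prints 1, not 0
```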

I have a few vague ideas of how we might change "yield from" to support 
this, but nothing concrete enough to put here. Is this a problem we 
should try to fix, and if so, how?
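For what it's worth, one workaround that needs no change to yield-from
at all is for the decorator to also expose the unprimed generator
function. The `raw` attribute below is purely my own naming, not part
of any proposal; sketch on Python 3.3+:

```python
def coroutine(func):
    # Priming decorator, plus a hypothetical `raw` attribute giving
    # access to the undecorated generator function for yield-from use.
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)
        return cr
    start.raw = func
    return start

@coroutine
def ticker():
    n = 0
    while True:
        yield n
        n += 1

def delegate():
    # Using the unprimed generator lets yield-from's own initial
    # next() serve as the priming call, so no value is lost.
    yield from ticker.raw()

d = delegate()
print(next(d))  # prints 0
```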

not-trying-to-be-difficult-ly yours
- Jacob
