[Python-ideas] yield from multiple iterables (was Re: The async API of the future: yield-from)

Terry Reedy tjreedy at udel.edu
Sat Oct 20 03:55:22 CEST 2012


On 10/19/2012 5:31 PM, Guido van Rossum wrote:

> I did a basic timing test using a simple recursive function and a
> recursive PEP-380 coroutine computing the same value (see attachment).
> The coroutine version is a little over twice as slow as the function
> version. I find that acceptable. This went 20 deep, making 2 recursive
> calls at each level (except at the deepest level).
>
> Output on my MacBook Pro:
>
> plain 2097151 0.5880069732666016
> coro. 2097151 1.2958409786224365
>
> This was a Python 3.3 built a few days ago from the 3.3 branch.
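Guido's attachment is not reproduced in the archive. A plausible sketch of the two versions (hypothetical names, not the actual attached code; the coroutine is structured to yield once per node, which matches the next() count discussed next) might be:

    import time

    def plain(depth):
        # Recursive function: 2 recursive calls per level, 20 levels deep,
        # counting nodes: 2**21 - 1 == 2097151.
        if depth == 0:
            return 1
        return plain(depth - 1) + plain(depth - 1) + 1

    def coro(depth):
        # PEP-380 coroutine computing the same value via 'yield from'.
        # Yielding once per node means the top-level driver performs
        # 2097151 next() calls.
        if depth == 0:
            yield
            return 1
        a = yield from coro(depth - 1)
        b = yield from coro(depth - 1)
        yield
        return a + b + 1

    def run(gen):
        # Drive a generator to completion; under PEP 380 its return
        # value arrives as StopIteration.value.
        try:
            while True:
                next(gen)
        except StopIteration as err:
            return err.value

    t0 = time.time()
    r1 = plain(20)
    t1 = time.time()
    print('plain', r1, t1 - t0)

    t0 = time.time()
    r2 = run(coro(20))
    t1 = time.time()
    print('coro.', r2, t1 - t0)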

At the top level, the coroutine version adds 2097151 next() calls. 
Suspecting that those calls, rather than the addition of 'yield from' 
itself, were responsible for most of the extra time, I added

def trivial():
    for i in range(2097151):
        yield
    raise StopIteration(2097151)
...
    t0 = time.time()
    try:
        g = trivial()
        while True:
            next(g)
    except StopIteration as err:
        k = err.value
    t1 = time.time()
    print('triv.', k, t1-t0)
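A note for readers on current Python: since 3.7 (PEP 479), a StopIteration raised inside a generator body is converted to RuntimeError, so the `raise StopIteration(2097151)` line above no longer works. A modern equivalent of the same measurement uses a plain `return`, which under PEP 380 sets StopIteration.value:

    import time

    def trivial(n=2097151):
        for i in range(n):
            yield
        return n  # `return n` puts n into StopIteration.value

    t0 = time.time()
    g = trivial()
    try:
        while True:
            next(g)
    except StopIteration as err:
        k = err.value
    t1 = time.time()
    print('triv.', k, t1 - t0)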


The result supports the hypothesis.

plain 2097151 0.4590260982513428
coro. 2097151 0.9180529117584229
triv. 2097151 0.39902305603027344

I don't know what to make of this in the context of async operations, 
but in 'traditional' use the generator would not replace a function 
returning a single number; it would replace one returning a list (of, 
in this case, 2097151 numbers), so each next() call replaces an 
.append() method call.
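To illustrate that trade with hypothetical helpers (not from the attachment): building the list costs one .append() per item on the producer's side, while the generator shifts that to one next() per item on the consumer's side.

    def as_list(n):
        # 'Traditional' version: one .append() method call per number.
        result = []
        for i in range(n):
            result.append(i)
        return result

    def as_gen(n):
        # Generator version: one next() call per number, paid by the consumer.
        for i in range(n):
            yield i

    print(as_list(5))       # [0, 1, 2, 3, 4]
    print(list(as_gen(5)))  # [0, 1, 2, 3, 4]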

-- 
Terry Jan Reedy
