Anything better than asyncio.as_completed() and asyncio.wait() to manage execution of large amount of tasks?

Maxime Steisel maximesteisel at
Thu Jul 17 01:09:02 CEST 2014

2014-07-15 14:20 GMT+02:00 Valery Khamenya <khamenya at>:
> Hi,
> both asyncio.as_completed() and asyncio.wait() work with lists only; no
> generators are accepted. Is there anything similar to those functions that
> pulls Tasks/Futures/coroutines one-by-one and processes them in a limited
> task pool?

Something like this (adapted from as_completed) should do the job:

import asyncio
from concurrent import futures

def parallelize(tasks, *, loop=None, max_workers=5, timeout=None):
    """Pull coroutines/futures one-by-one from the `tasks` iterator,
    running at most `max_workers` at a time, and yield coroutines that
    return the results in completion order (like as_completed)."""
    loop = loop if loop is not None else asyncio.get_event_loop()
    pending = set()
    done = asyncio.Queue(maxsize=max_workers)
    exhausted = False
    timeout_handle = None

    def _worker():
        nonlocal exhausted
        while not exhausted:
            try:
                t = asyncio.async(next(tasks))
            except StopIteration:
                exhausted = True
            else:
                pending.add(t)
                try:
                    yield from t
                except Exception:
                    pass  # the consumer gets it back from t.result()
                pending.remove(t)
                yield from done.put(t)

    def _on_timeout():
        for w in workers:
            w.cancel()
        done.put_nowait(None)  # Wake up _wait_for_one()

    def _wait_for_one():
        f = yield from done.get()
        if f is None:
            raise futures.TimeoutError()
        return f.result()

    workers = [asyncio.async(_worker()) for i in range(max_workers)]

    if workers and timeout is not None:
        timeout_handle = loop.call_later(timeout, _on_timeout)

    # Note: this assumes `tasks` yields at least one item.
    while not exhausted or pending or not done.empty():
        yield _wait_for_one()

    if timeout_handle is not None:
        timeout_handle.cancel()
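For what it's worth, on newer Python (3.5+ async/await syntax, 3.7+ asyncio.run) the same pull-from-a-shared-generator idea comes out much shorter: a handful of worker coroutines can consume the generator directly, since next() on it never suspends. A minimal sketch; work() and run_pool() are made-up names for illustration:

```python
import asyncio

# Hypothetical coroutine standing in for real work.
async def work(n):
    await asyncio.sleep(0.01)
    return n * 2

async def run_pool(tasks, max_workers=5):
    # `tasks` is a plain generator of coroutine objects. Each worker pulls
    # the next one only after finishing the previous, so at most
    # max_workers coroutines are in flight at any moment.
    results = []

    async def worker():
        # Sharing one generator is safe here: next() completes without
        # suspending, and the event loop is single-threaded.
        for coro in tasks:
            results.append(await coro)

    await asyncio.gather(*(worker() for _ in range(max_workers)))
    return results

results = asyncio.run(run_pool((work(n) for n in range(10)), max_workers=3))
print(sorted(results))  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Unlike gathering everything up front, this never materializes the whole generator, so it works for very large (even unbounded) task streams; results arrive in completion order, not submission order.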


More information about the Python-list mailing list