[Python-ideas] Are there asynchronous generators?

Andrew Svetlov andrew.svetlov at gmail.com
Wed Jun 24 12:13:53 CEST 2015


Your idea is clean, and maybe we will allow `yield` inside `async def`
in Python 3.6.
For PEP 492 it was too big a change.
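
For illustration, a rough sketch of what `yield` inside `async def` might
look like (the names here are illustrative only, nothing is decided yet):

import asyncio

async def ticks(n):
    # Hypothetical "asynchronous generator": it can await (talk to the
    # event loop) and also yield values to whoever iterates it.
    for i in range(n):
        await asyncio.sleep(1)
        yield i

async def consume():
    # Such a generator would have to be driven with `async for`;
    # a plain for loop, sum() or list() cannot suspend on its awaits.
    async for i in ticks(4):
        print(i)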

On Wed, Jun 24, 2015 at 12:00 PM, Adam Bartoš <drekin at gmail.com> wrote:
> Hello,
>
> I had a generator producing pairs of values and wanted to feed all the first
> members of the pairs to one consumer and all the second members to another
> consumer. For example:
>
> def pairs():
>     for i in range(4):
>         yield (i, i ** 2)
>
> biconsumer(sum, list)(pairs()) -> (6, [0, 1, 4, 9])
>
> The point is that I wanted the consumers to be suspended and resumed in a
> coordinated manner: The first consumer is invoked and asks for the first
> element. The coordinator, implemented by the biconsumer function, invokes
> pairs(), gets the first pair and yields its first member to the first
> consumer. Then that consumer wants the next element, but now it's the second
> consumer's turn, so the first consumer is suspended and the second consumer
> is invoked and fed the second member of the first pair. Then the second
> consumer wants the next element, but it's the first consumer's turn… and so
> on. In the end, when the stream of pairs is exhausted, StopIteration is
> thrown into both consumers and their results are combined.
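>
> As a minimal sketch of the interface I have in mind, itertools.tee gets the
> signature right, but it merely buffers the second stream internally, so it
> does not give the coordinated suspension described above; that is the part
> that seems to need coroutines:
>
> from itertools import tee
>
> def biconsumer(consume_first, consume_second):
>     def run(pairs_iterable):
>         first, second = tee(pairs_iterable)
>         return (consume_first(x for x, _ in first),
>                 consume_second(y for _, y in second))
>     return run
>
> # biconsumer(sum, list)(pairs()) -> (6, [0, 1, 4, 9])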
>
> The cooperative asynchronous nature of the execution reminded me of asyncio
> and coroutines, so I thought that biconsumer might be implemented using them.
> However, it seems that it is impossible to write an "asynchronous generator",
> since the "yielding pipe" is already used for communication with the
> scheduler. And even if it were possible to make an asynchronous generator, it
> is not clear how to feed it to a synchronous consumer like the sum() or
> list() function.
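>
> To illustrate what I mean by the yielding pipe being taken: in a
> generator-based coroutine (pre-PEP 492 style), every yield ultimately goes
> to the scheduler, so there is nothing left for handing values to an
> iterating consumer. A small sketch:
>
> import asyncio
>
> @asyncio.coroutine
> def fetch_data():
>     # The yield from suspends the coroutine and talks to the event loop;
>     # it cannot at the same time deliver items to a caller iterating
>     # over fetch_data() like an ordinary generator.
>     yield from asyncio.sleep(0.1)
>     return "0 1 2 3"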
>
> With PEP 492 the concepts of generators and coroutines were separated, so
> asynchronous generators may be possible in theory. An ordinary function has
> just the returning pipe – for returning the result to the caller. A
> generator also has a yielding pipe – used for yielding values during
> iteration, while its returning pipe is used to finish the iteration. A
> native coroutine has a returning pipe – to return the result to a caller
> just like an ordinary function – and also an async pipe – used for
> communication with a scheduler and for suspending execution. An asynchronous
> generator would then have both a yielding pipe and an async pipe.
>
> So my question is: was code like the following considered? Does it make
> sense? Or are there not enough use cases for such code? I found only a
> short mention in
> https://www.python.org/dev/peps/pep-0492/#coroutine-generators, so possibly
> these coroutine-generators are the same idea.
>
> async def f():
>     number_string = await fetch_data()
>     for n in number_string.split():
>         yield int(n)
>
> async def g():
>     result = async/await? sum(f())
>     return result
>
> async def h():
>     the_sum = await g()
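>
> For illustration, if f() really could yield values and be driven with async
> for, g() might use a small asynchronous replacement for sum() (the asum name
> is just made up here):
>
> async def asum(aiterable):
>     # Iterate the asynchronous generator; whenever it suspends on a
>     # future, this helper is suspended too, via the shared async pipe.
>     total = 0
>     async for n in aiterable:
>         total += n
>     return total
>
> async def g():
>     result = await asum(f())
>     return result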
>
> As for the explanation of the execution of h() by an event loop: h is a
> native coroutine called by the event loop, having both a returning pipe and
> an async pipe. The returning pipe leads to the end of the task, while the
> async pipe is used for communication with the scheduler. Then g() is called
> asynchronously – using the await keyword means that access to the async
> pipe is given to the callee. Then g() invokes the asynchronous generator f()
> and gives it access to its async pipe, so while f() is yielding values to
> sum, it can also yield a future to the scheduler via the async pipe and
> suspend the whole task.
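>
> A minimal sketch of driving this with asyncio, assuming the hypothetical
> pieces above existed:
>
> import asyncio
>
> loop = asyncio.get_event_loop()
> # The event loop runs h() as a task; whenever the chain h -> g -> f yields
> # a future through the async pipe, the whole task is suspended here.
> loop.run_until_complete(h())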
>
> Regards, Adam Bartoš



-- 
Thanks,
Andrew Svetlov

