
I believe that asyncio should have a way to wait for input from a different process without blocking the event loop. The asyncio module currently contains a Queue class that allows communication between multiple coroutines running on the same event loop, but that class is neither thread-safe nor process-safe. The multiprocessing module contains Queue and Pipe classes that allow inter-process communication, but there is no way to read from these objects directly without blocking the event loop. I propose adding a Pipe class to asyncio that is process-safe and can be read from without blocking the event loop. This was discussed a bit here: https://github.com/python/cpython/pull/20882#issuecomment-683463367

This could be implemented using the multiprocessing.Pipe class. multiprocessing.connection.Connection.fileno() returns the file descriptor used by a pipe. We could then use loop.add_reader() to set an asyncio.Event when something has been written to the pipe by the other process. I did all of this manually in a project I was working on, but it required me to learn a considerable amount about asyncio. It would have saved me a lot of time if there had been an easy, documented way to wait for input from another process without blocking.

One compelling use case is a server that uses asyncio: it receives inputs from clients, sends them to another process that runs a neural network, and sends each client a result after the neural network finishes. ProcessPoolExecutor does not seem like a good fit for this, because the process needs to stay alive and be reused for subsequent requests; starting a new process for each request is impractical, since loading the neural network into GPU memory is an expensive operation. See here for an example of such a server (though this one is mostly written in C++ and does not use asyncio): https://www.tensorflow.org/tfx/guide/serving
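A minimal sketch of the mechanism described above (Unix-only, since loop.add_reader() needs a selector-based event loop; the names async_recv and worker are my own, not part of any proposed API):

```python
import asyncio
import multiprocessing


async def async_recv(conn):
    """Wait for data on a multiprocessing Connection without blocking the loop."""
    loop = asyncio.get_running_loop()
    event = asyncio.Event()
    # Wake the event whenever the pipe's file descriptor becomes readable.
    loop.add_reader(conn.fileno(), event.set)
    try:
        while not conn.poll():  # poll() with no timeout returns immediately
            await event.wait()
            event.clear()
    finally:
        loop.remove_reader(conn.fileno())
    return conn.recv()


def worker(conn):
    # Runs in the child process; could be a long-lived neural-network worker.
    conn.send("hello from the child process")


async def main():
    parent_conn, child_conn = multiprocessing.Pipe()
    proc = multiprocessing.Process(target=worker, args=(child_conn,))
    proc.start()
    result = await async_recv(parent_conn)
    proc.join()
    return result


if __name__ == "__main__":
    print(asyncio.run(main()))
```

While async_recv() awaits, other coroutines on the same loop keep running, which is exactly what a request-handling server needs.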

The asyncio module already has subprocess support: Subprocesses — Python 3.9.1 documentation <https://docs.python.org/3/library/asyncio-subprocess.html> Was that not sufficient to solve your problem? On Mon, Dec 28, 2020 at 5:23 AM Roger Iyengar <raiyenga@cs.cmu.edu> wrote:
-- --Guido van Rossum (python.org/~guido) *Pronouns: he/him **(why is my pronoun here?)* <http://feministing.com/2015/02/03/how-using-they-as-a-singular-pronoun-can-c...>

It was not sufficient. The only way to communicate with a subprocess there is through stdout, stdin, and stderr. However, packages like TensorFlow will print messages to stdout, and this can be hard to turn off. It seems useful to have a class like multiprocessing.Pipe to communicate with another process separately from stdout/stdin. On Mon, Dec 28, 2020 at 12:50 PM Guido van Rossum <guido@python.org> wrote:

Okay, fair. I am guessing that the first step would be to create a quality implementation and publish it on PyPI. And of course this begs the question, *who* is going to do the work? [ducks] On Mon, Dec 28, 2020 at 10:27 AM Roger Iyengar <raiyenga@cs.cmu.edu> wrote:

I created an implementation on PyPI here: https://pypi.org/project/asyncio-pipe/ I am using the same function signatures that multiprocessing.Connection does. I use composition, and have changed recv(), poll(), recv_bytes(), and recv_bytes_into(buffer) so that they will not block the event loop. On Mon, Dec 28, 2020 at 1:45 PM Guido van Rossum <guido@python.org> wrote:

participants (2)
- Guido van Rossum
- Roger Iyengar