[Python-ideas] async/await in Python

Andrew Barnert abarnert at yahoo.com
Sat Apr 18 00:26:52 CEST 2015


On Apr 17, 2015, at 15:18, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> 
> Hi Chris,
> 
> Thanks for the feedback; answers below:
> 
>> On 2015-04-17 6:00 PM, Chris Angelico wrote:
>>> On Sat, Apr 18, 2015 at 4:58 AM, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>>> The following new syntax is used to declare a coroutine::
>>> 
>>>     async def read_data(db):
>>>         pass
>>> 
>>> Key properties of coroutines:
>>> 
>>> * Coroutines are always generators, even if they do not contain ``await``
>>>   expressions.
>> Do you mean that a coroutine is a special form of generator, or that a
>> function declared with "async" is always a coroutine even if it
>> doesn't contain an "await"? You're separating coroutines from
>> generators in many ways, so maybe I'm just misreading this.
> 
> Good catch; I've committed the fix.
> 
> Yes, ``async def`` functions are always coroutines, whether or not
> they have ``await`` expressions in them.
>> 
>>> Asynchronous Context Managers and "async with"
>>> ----------------------------------------------
>>> 
>>> An *asynchronous context manager* is a context manager that is able to
>>> suspend
>>> execution in its *enter* and *exit* methods.
>> A regular 'with' block guarantees that subsequent code won't execute
>> until __exit__ completes. I presume the same guarantee is applied to
>> an 'async with' block, and thus it can be used only inside a
>> coroutine?
> Correct.
>> 
>> (Edit: Oops, missed where that's said further down - SyntaxError to
>> use it outside a coroutine. Might still be worth clarifying the
>> guarantee somewhere, otherwise ignore me.)
>> 
>>> New Syntax
>>> ''''''''''
>>> 
>>> A new statement for iterating through asynchronous iterators is proposed::
>>> 
>>>     async for TARGET in ITER:
>>>         BLOCK
>>>     else:
>>>         BLOCK2
>>> 
>>> which is semantically equivalent to::
>>> 
>>>     iter = (ITER)
>>>     iter = await type(iter).__aiter__(iter)
>>>     running = True
>>>     while running:
>>>         try:
>>>             TARGET = await type(iter).__anext__(iter)
>>>         except StopAsyncIteration:
>>>             running = False
>>>         else:
>>>             BLOCK
>>>     else:
>>>         BLOCK2
>> (Not sure why you don't just use "break" instead of "running = False"?
>> Maybe I'm blind to some distinction here.)
> 
> Because if you use break, the "else: BLOCK2" won't execute, but
> it should, as StopAsyncIteration is not a regular exception, but
> something to signal that the iteration is over.
>> 
>>> Why StopAsyncIteration?
>>> '''''''''''''''''''''''
>>> 
>>> Coroutines are still based on generators internally.  So, before PEP 479,
>>> there
>>> was no fundamental difference between
>>> 
>>> ::
>>> 
>>>     def g1():
>>>         yield from fut
>>>         return 'spam'
>>> 
>>> and
>>> 
>>> ::
>>> 
>>>     def g2():
>>>         yield from fut
>>>         raise StopIteration('spam')
>>> 
>>> And since PEP 479 is accepted and enabled by default for coroutines, the
>>> following example will have its ``StopIteration`` wrapped into a
>>> ``RuntimeError``
>>> 
>>> ::
>>> 
>>>     async def a1():
>>>         await fut
>>>         raise StopIteration('spam')
>>> 
>>> The only way to tell the outside code that the iteration has ended is to
>>> raise
>>> something other than ``StopIteration``.  Therefore, a new built-in exception
>>> class ``StopAsyncIteration`` was added.
>>> 
>>> Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
>>> raised
>>> in coroutines are wrapped in ``RuntimeError``.
>> Can't a coroutine simply use 'return'? I'm not hugely familiar with
>> them, but I'm not understanding here why you need a completely
>> separate exception - and, more importantly, why you wouldn't have the
>> same consideration of "StopAsyncIteration leakage". It ought to be
>> possible for the coroutine to return, StopIteration be synthesized,
>> and the __anext__ function to bubble that out - exactly the way it
>> does for generators today.
> 
> Unfortunately, there is no way we can enable StopIteration to do
> the same thing that StopAsyncIteration does.
> 
> 'return' in coroutines is implemented with the 'StopIteration' exception
> internally, so there is no way (without ugly hacks) to distinguish
> between a plain return and the desire to stop the iteration.
>> 
>>> Glossary
>>> ========
>>> 
>>> :Coroutine:
>>>     A coroutine function, or just "coroutine", is declared with ``async
>>> def``.
>>>     It uses ``await`` and ``return value``; see `New Coroutine Declaration
>>>     Syntax`_ for details.
>> (Using this definition to justify my statement above)
>> 
>>> Why not a __future__ import
>>> ---------------------------
>>> 
>>> ``__future__`` imports are inconvenient and easy to forget to add. Also,
>>> they
>>> are enabled for the whole source file.  Consider that there is a big project
>>> with a popular module named "async.py".  With future imports it is required
>>> to
>>> either import it using ``__import__()`` or ``importlib.import_module()``
>>> calls,
>>> or to rename the module.  The proposed approach makes it possible to
>>> continue
>>> using old code and modules without a hassle, while coming up with a
>>> migration
>>> plan for future python versions.
>> The clash would occur only in modules that use the __future__ import,
>> wouldn't it? And those modules are going to have to be fixed by
>> version X.Y anyway - the whole point of the __future__ import is to
>> detect that.
> You're right.
> 
> Let's see what python-ideas thinks about it.  I'm fine if everybody
> wants __future__ imports.  I will only have to roll back my changes
> in tokenizer.c and change a few tokens in the Grammar to make the
> reference implementation work.
>>> Why magic methods start with "a"
>>> --------------------------------
>>> 
>>> New asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``,
>>> and ``__aexit__`` all start with the same prefix "a".  An alternative
>>> proposal
>>> is to use "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``.
>>> However, to align new magic methods with the existing ones, such as
>>> ``__radd__`` and ``__iadd__``, it was decided to use a shorter version.
>> I support the short names, with the possible exception that
>> "__await__" now looks like "asynchronous wait", which may or may not
>> be desirable.
>> 
>>> Comprehensions
>>> --------------
>>> 
>>> For the sake of restricting the broadness of this PEP there is no new syntax
>>> for asynchronous comprehensions.  This should be considered in a separate
>>> PEP,
>>> if there is a strong demand for this feature.
>> Likewise asynchronous lambda functions (or maybe I just missed it).
> Right. Right now I'm just not sure they would be useful at all,
> whereas I can imagine some use cases for comprehensions ;)
> 
>> 
>> 
>> Overall comment: The changes appear to be infecting a lot of code.
>> Will this suddenly be something that everyone has to think about, or
>> can it still be ignored in projects that don't use it?
> 
> If you're not writing some asynchronous code or using some
> asynchronous library that takes advantage of coroutines, you
> can completely ignore the new syntax and protocols.

Is that true?

If I'm writing a general-purpose library, and it might be useful in someone else's async code, and I write a context manager that does the usual "close/destroy resources on __exit__" thing, shouldn't I really think about whether it should also be an async context manager?
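
To make that concrete, here is the kind of dual-protocol object I have in mind. Everything below is just a sketch against the protocol as quoted above; none of the names (Resource, close, aclose) come from the PEP:

    class Resource:
        """Hypothetical library object; close() and aclose() are
        invented purely for illustration."""

        def close(self):
            ...  # synchronous cleanup

        async def aclose(self):
            ...  # cleanup that may suspend, e.g. flushing over a socket

        # today's protocol
        def __enter__(self):
            return self

        def __exit__(self, exc_type, exc, tb):
            self.close()

        # the proposed protocol
        async def __aenter__(self):
            return self

        async def __aexit__(self, exc_type, exc, tb):
            await self.aclose()

Every library author who ships a class like this would now have to decide whether to grow the second half.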

That raises another question: is there a reasonable parallel to the contextlib.contextmanager decorator for building simple async context managers? (Or, for that matter, for building simple context-manager-and-also-async-context-managers?)
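
I don't see one in the PEP, and if I'm reading the proposal right, yield isn't allowed inside an 'async def' function, so the generator-based trick behind contextlib.contextmanager doesn't carry over directly. The crudest thing I can imagine is a small helper built from two coroutine functions; again, every name here is made up for illustration:

    class acontext:
        """Hypothetical helper, not part of the proposal: build an
        asynchronous context manager from two coroutine functions."""

        def __init__(self, aenter, aexit):
            self._aenter = aenter
            self._aexit = aexit

        async def __aenter__(self):
            return await self._aenter()

        async def __aexit__(self, exc_type, exc, tb):
            return await self._aexit(exc_type, exc, tb)

That would let a library user write "async with acontext(pool.acquire, pool.release): ...", but only if pool.release happens to take the (exc_type, exc, tb) arguments, which is clunky. Something nicer in contextlib would be welcome.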

> 
>> There'll be a
>> few object types that would benefit from becoming async context
>> managers as well as regular context managers (I would expect, for
>> instance, that the built-in file type would allow its auto-closing to
>> be asynchronous), but if a third-party module completely ignores this
>> proposal, everything should continue to work, right?
> Exactly. That's why we propose to have the separate __a*__
> protocols: to completely eliminate the possibility of screwing
> something up.
> 
>> 
>> I'm seeing the potential for a lot of code duplication here, but
>> possibly once things settle into real-world usage, a few easy
>> decorators will help out with that (like the total_ordering decorator
>> for handling __gt__, __le__, etc etc).
>> 
>> +1 on the proposal as a whole.
> Awesome!
> 
> Thanks a lot!
> 
> Yury
> 
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/

