Python-Dev
May 2015
- 112 participants
- 87 discussions
In the spirit of regular releases, it's time to release 2.7.10. I'm
going to plan to cut rc1 this weekend with a final in 2 weeks.
I apologize for the short notice; time has crept up on me, and I have
commitments in June that prevent pushing releases into that month.
ACTIVITY SUMMARY (2015-05-01 - 2015-05-08)
Python tracker at http://bugs.python.org/
To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.
Issues counts and deltas:
open 4838 ( -3)
closed 31069 (+44)
total 35907 (+41)
Open issues with patches: 2252
Issues opened (25)
==================
#24107: Add support for retrieving the certificate chain
http://bugs.python.org/issue24107 opened by Lukasa
#24109: Documentation for difflib uses optparse
http://bugs.python.org/issue24109 opened by idahogray
#24110: zipfile.ZipFile.write() does not accept bytes arcname
http://bugs.python.org/issue24110 opened by july
#24111: Valgrind suppression file should be updated
http://bugs.python.org/issue24111 opened by Antony.Lee
#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114 opened by kees
#24115: PyObject_IsInstance() and PyObject_IsSubclass() can fail
http://bugs.python.org/issue24115 opened by serhiy.storchaka
#24116: --with-pydebug has no effect when the final python binary is c
http://bugs.python.org/issue24116 opened by aleb
#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117 opened by Ma Lin
#24119: Carry comments with the AST
http://bugs.python.org/issue24119 opened by brett.cannon
#24120: pathlib.(r)glob stops on PermissionDenied exception
http://bugs.python.org/issue24120 opened by Gregorio
#24124: Two versions of instructions for installing Python modules
http://bugs.python.org/issue24124 opened by skip.montanaro
#24126: newlines attribute does not get set after calling readline()
http://bugs.python.org/issue24126 opened by arekfu
#24127: Fatal error in launcher: Job information querying failed
http://bugs.python.org/issue24127 opened by gavstar
#24129: Incorrect (misleading) statement in the execution model docume
http://bugs.python.org/issue24129 opened by levkivskyi
#24130: Remove -fno-common compile option from OS X framework builds?
http://bugs.python.org/issue24130 opened by ned.deily
#24131: [configparser] Add section/option delimiter to ExtendedInterpo
http://bugs.python.org/issue24131 opened by giflw
#24132: Direct sub-classing of pathlib.Path
http://bugs.python.org/issue24132 opened by projetmbc
#24136: document PEP 448
http://bugs.python.org/issue24136 opened by benjamin.peterson
#24137: Force not using _default_root in IDLE
http://bugs.python.org/issue24137 opened by serhiy.storchaka
#24138: Speed up range() by caching and modifying long objects
http://bugs.python.org/issue24138 opened by larry
#24139: Use sqlite3 extended error codes
http://bugs.python.org/issue24139 opened by Dima.Tisnek
#24140: In pdb using "until X" doesn't seem to have effect in commands
http://bugs.python.org/issue24140 opened by vyktor
#24142: ConfigParser._read doesn't join multi-line values collected wh
http://bugs.python.org/issue24142 opened by fhoech
#24143: Makefile in tarball don't provide make uninstall target
http://bugs.python.org/issue24143 opened by krichter
#24145: Support |= for parameters in converters
http://bugs.python.org/issue24145 opened by larry
Most recent 15 issues with no replies (15)
==========================================
#24143: Makefile in tarball don't provide make uninstall target
http://bugs.python.org/issue24143
#24140: In pdb using "until X" doesn't seem to have effect in commands
http://bugs.python.org/issue24140
#24137: Force not using _default_root in IDLE
http://bugs.python.org/issue24137
#24136: document PEP 448
http://bugs.python.org/issue24136
#24131: [configparser] Add section/option delimiter to ExtendedInterpo
http://bugs.python.org/issue24131
#24129: Incorrect (misleading) statement in the execution model docume
http://bugs.python.org/issue24129
#24115: PyObject_IsInstance() and PyObject_IsSubclass() can fail
http://bugs.python.org/issue24115
#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114
#24111: Valgrind suppression file should be updated
http://bugs.python.org/issue24111
#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104
#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103
#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097
#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087
#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084
#24063: Support Mageia and Arch Linux in the platform module
http://bugs.python.org/issue24063
Most recent 15 issues waiting for review (15)
=============================================
#24145: Support |= for parameters in converters
http://bugs.python.org/issue24145
#24142: ConfigParser._read doesn't join multi-line values collected wh
http://bugs.python.org/issue24142
#24138: Speed up range() by caching and modifying long objects
http://bugs.python.org/issue24138
#24130: Remove -fno-common compile option from OS X framework builds?
http://bugs.python.org/issue24130
#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117
#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114
#24109: Documentation for difflib uses optparse
http://bugs.python.org/issue24109
#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102
#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091
#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087
#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084
#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082
#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076
#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068
#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064
Top 10 most discussed issues (10)
=================================
#22906: PEP 479: Change StopIteration handling inside generators
http://bugs.python.org/issue22906 15 msgs
#20179: Derby #10: Convert 50 sites to Argument Clinic across 4 files
http://bugs.python.org/issue20179 13 msgs
#24132: Direct sub-classing of pathlib.Path
http://bugs.python.org/issue24132 11 msgs
#24127: Fatal error in launcher: Job information querying failed
http://bugs.python.org/issue24127 9 msgs
#22881: show median in benchmark results
http://bugs.python.org/issue22881 7 msgs
#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102 6 msgs
#21800: Implement RFC 6855 (IMAP Support for UTF-8) in imaplib.
http://bugs.python.org/issue21800 5 msgs
#23888: Fixing fractional expiry time bug in cookiejar
http://bugs.python.org/issue23888 5 msgs
#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085 5 msgs
#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117 5 msgs
Issues closed (43)
==================
#2292: Missing *-unpacking generalizations
http://bugs.python.org/issue2292 closed by benjamin.peterson
#20148: Derby: Convert the _sre module to use Argument Clinic
http://bugs.python.org/issue20148 closed by serhiy.storchaka
#20168: Derby: Convert the _tkinter module to use Argument Clinic
http://bugs.python.org/issue20168 closed by serhiy.storchaka
#20274: sqlite module has bad argument parsing code, including undefin
http://bugs.python.org/issue20274 closed by larry
#21520: Erroneous zipfile test failure if the string 'bad' appears in
http://bugs.python.org/issue21520 closed by larry
#22334: test_tcl.test_split() fails on "x86 FreeBSD 7.2 3.x" buildbot
http://bugs.python.org/issue22334 closed by serhiy.storchaka
#23330: h2py.py regular expression missing
http://bugs.python.org/issue23330 closed by serhiy.storchaka
#23880: Tkinter: getint and getdouble should support Tcl_Obj
http://bugs.python.org/issue23880 closed by serhiy.storchaka
#23911: Move path-based bootstrap code to a separate frozen file.
http://bugs.python.org/issue23911 closed by eric.snow
#23920: Should Clinic have "nullable" or types=NoneType?
http://bugs.python.org/issue23920 closed by larry
#24000: More fixes for the Clinic mapping of converters to format unit
http://bugs.python.org/issue24000 closed by larry
#24001: Clinic: use raw types in types= set
http://bugs.python.org/issue24001 closed by larry
#24051: Argument Clinic no longer works with single optional argument
http://bugs.python.org/issue24051 closed by serhiy.storchaka
#24060: Clearify necessities for logging with timestamps
http://bugs.python.org/issue24060 closed by python-dev
#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066 closed by kirelagin
#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081 closed by r.david.murray
#24088: yield expression confusion
http://bugs.python.org/issue24088 closed by gvanrossum
#24089: argparse crashes with AssertionError
http://bugs.python.org/issue24089 closed by ned.deily
#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092 closed by serhiy.storchaka
#24093: Use after free in Element.remove
http://bugs.python.org/issue24093 closed by serhiy.storchaka
#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094 closed by python-dev
#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095 closed by benjamin.peterson
#24096: Use after free in get_filter
http://bugs.python.org/issue24096 closed by python-dev
#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099 closed by rhettinger
#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100 closed by rhettinger
#24101: Use after free in siftup
http://bugs.python.org/issue24101 closed by rhettinger
#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105 closed by benjamin.peterson
#24106: Messed up indentation makes undesired piece of code being run!
http://bugs.python.org/issue24106 closed by r.david.murray
#24108: fnmatch.translate('*.txt') fails
http://bugs.python.org/issue24108 closed by r.david.murray
#24112: %b does not work, as a binary output formatter
http://bugs.python.org/issue24112 closed by steven.daprano
#24113: shlex constructor unreachable code
http://bugs.python.org/issue24113 closed by rhettinger
#24118: http.client example is no longer valid
http://bugs.python.org/issue24118 closed by python-dev
#24121: collections page doesn't mention that deques are mutable
http://bugs.python.org/issue24121 closed by rhettinger
#24122: Install fails after configure sets the extending/embedding ins
http://bugs.python.org/issue24122 closed by doko
#24123: Python 2.7 Tutorial Conflicting behavior with WeakValueDiction
http://bugs.python.org/issue24123 closed by jessembacon
#24125: Fix for #23865 breaks docutils
http://bugs.python.org/issue24125 closed by serhiy.storchaka
#24128: Documentation links are forwarded to Python 2
http://bugs.python.org/issue24128 closed by r.david.murray
#24133: Add 'composable' decorator to functools (with @ matrix multipl
http://bugs.python.org/issue24133 closed by r.david.murray
#24134: assertRaises can behave differently
http://bugs.python.org/issue24134 closed by serhiy.storchaka
#24135: Policy for altering sys.path
http://bugs.python.org/issue24135 closed by r.david.murray
#24141: Python 3 ships an outdated valgrind suppressison file.
http://bugs.python.org/issue24141 closed by ned.deily
#24144: Docs discourage use of binascii.unhexlify etc.
http://bugs.python.org/issue24144 closed by r.david.murray
#24146: ast.literal_eval doesn't support the Python ternary operator
http://bugs.python.org/issue24146 closed by r.david.murray
Hi python-dev,
Updated version of the PEP is below.
Quick summary of changes:
1. set_coroutine_wrapper and get_coroutine_wrapper functions
are now thread-specific (like settrace etc).
2. Updated Abstract & Rationale sections.
3. RuntimeWarning is always raised when a coroutine wasn't
awaited on. This is in addition to what 'set_coroutine_wrapper'
will/can do.
4. asyncio.async is renamed to asyncio.ensure_future; it will
be deprecated in 3.5.
5. Uses of async/await in CPython codebase are documented.
6. Other small edits and updates.
Thanks,
Yury
PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov@sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015,
05-May-2015
Abstract
========
The growth of the Internet and general connectivity has triggered a
proportionate need for responsive and scalable code. This proposal
aims to answer that need by making writing explicitly asynchronous,
concurrent Python code easier and more Pythonic.
It is proposed to make *coroutines* a proper standalone concept in
Python, and introduce new supporting syntax. The ultimate goal
is to help establish a common, easily approachable, mental
model of asynchronous programming in Python and make it as close to
synchronous programming as possible.
We believe that the changes proposed here will help keep Python
relevant and competitive in a quickly growing area of asynchronous
programming, as many other languages have adopted, or are planning to
adopt, similar features: [2]_, [5]_, [6]_, [7]_, [8]_, [10]_.
Rationale and Goals
===================
Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the ``yield from`` syntax introduced in PEP
380. This approach has a number of shortcomings:
* It is easy to confuse coroutines with regular generators, since they
share the same syntax; this is especially true for new developers.
* Whether or not a function is a coroutine is determined by the presence
of ``yield`` or ``yield from`` statements in its *body*, which can
lead to unobvious errors when such statements appear in or disappear
from the function body during refactoring.
* Support for asynchronous calls is limited to expressions where
``yield`` is allowed syntactically, limiting the usefulness of
syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new ``async
with`` statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new ``async for``
statement makes it possible to perform asynchronous calls in iterators.
Specification
=============
This proposal introduces new syntax and semantics to enhance coroutine
support in Python.
This specification presumes knowledge of the implementation of
coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax
changes proposed here comes from the asyncio framework (PEP 3156) and
the "Cofunctions" proposal (PEP 3152, now rejected in favor of this
specification).
From this point on, this document uses the term *native coroutine* to
refer to functions declared using the new syntax, *generator-based
coroutine* to refer to coroutines that are based on generator syntax,
and *coroutine* in contexts where both definitions are applicable.
New Coroutine Declaration Syntax
--------------------------------
The following new syntax is used to declare a *native coroutine*::
    async def read_data(db):
        pass
Key properties of *coroutines*:
* ``async def`` functions are always coroutines, even if they do not
contain ``await`` expressions.
* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
expressions in an ``async`` function.
* Internally, two new code object flags were introduced:
- ``CO_COROUTINE`` is used to enable runtime detection of
*coroutines* (and migrating existing code).
- ``CO_NATIVE_COROUTINE`` is used to mark *native coroutines*
(defined with new syntax.)
All coroutines have ``CO_COROUTINE``, ``CO_NATIVE_COROUTINE``, and
``CO_GENERATOR`` flags set.
* Regular generators, when called, return a *generator object*;
similarly, coroutines return a *coroutine object*.
* ``StopIteration`` exceptions are not propagated out of coroutines,
and are replaced with a ``RuntimeError``. For regular generators
such behavior requires a future import (see PEP 479).
* When a *coroutine* is garbage collected, a ``RuntimeWarning`` is
raised if it was never awaited on (see also `Debugging Features`_.)
* See also `Coroutine objects`_ section.
types.coroutine()
-----------------
A new function ``coroutine(gen)`` is added to the ``types`` module. It
allows interoperability between existing *generator-based coroutines*
in asyncio and *native coroutines* introduced by this PEP.
The function applies the ``CO_COROUTINE`` flag to the generator function's
code object, making it return a *coroutine object*.
The function can be used as a decorator, since it modifies generator
functions in place and returns them.
Note that the ``CO_NATIVE_COROUTINE`` flag is not applied by
``types.coroutine()``, to make it possible to separate *native
coroutines* defined with the new syntax from *generator-based coroutines*.
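This interoperability can be illustrated with a short, hand-driven sketch
(not part of the PEP's text; the helper names are invented, and an
interpreter implementing this PEP, e.g. the reference implementation or
Python 3.5+, is assumed)::

    import types

    @types.coroutine
    def nap():
        # A generator-based coroutine: the decorator marks its code object
        # so that the objects it returns are accepted by ``await``.
        yield 'tick'

    async def read_data():
        await nap()            # a native coroutine can await it directly
        return 'data'

    # Drive the coroutine by hand, without an event loop.
    coro = read_data()
    assert coro.send(None) == 'tick'   # suspended at the ``yield`` in nap()
    try:
        coro.send(None)
    except StopIteration as exc:
        print(exc.value)               # -> 'data'

Stepping the coroutine with ``send()`` is essentially what an event loop
does under the hood.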
Await Expression
----------------
The following new ``await`` expression is used to obtain a result of
coroutine execution::
    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...
``await``, similarly to ``yield from``, suspends execution of the
``read_data`` coroutine until the ``db.fetch`` *awaitable* completes and
returns the result data.
It uses the ``yield from`` implementation with an extra step of
validating its argument. ``await`` only accepts an *awaitable*, which
can be one of:
* A *native coroutine object* returned from a *native coroutine*.
* A *generator-based coroutine object* returned from a generator
decorated with ``types.coroutine()``.
* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a
fundamental mechanism of how *Futures* are implemented. Since,
internally, coroutines are a special kind of generators, every
``await`` is suspended by a ``yield`` somewhere down the chain of
``await`` calls (please refer to PEP 3156 for a detailed
explanation.)
To enable this behavior for coroutines, a new magic method called
``__await__`` is added. In asyncio, for instance, to enable *Future*
objects in ``await`` expressions, the only change is to add an
``__await__ = __iter__`` line to the ``asyncio.Future`` class.
Objects with an ``__await__`` method are called *Future-like* objects in
the rest of this PEP.
Also, please note that ``__aiter__`` method (see its definition
below) cannot be used for this purpose. It is a different protocol,
and would be like using ``__iter__`` instead of ``__call__`` for
regular callables.
It is a ``TypeError`` if ``__await__`` returns anything but an
iterator.
* Objects defined with CPython C API with a ``tp_await`` function,
returning an iterator (similar to ``__await__`` method).
It is a ``SyntaxError`` to use ``await`` outside of an ``async def``
function (just as it is a ``SyntaxError`` to use ``yield`` outside of a
``def`` function.)
It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.
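As an illustration (not part of the PEP's text; the class is invented for
the example), a minimal *Future-like* object and its use with ``await``
can look as follows; a real Future would ``yield`` inside ``__await__``
until its result is set::

    class Ready:
        """Future-like object: ``__await__`` returns an iterator (here a
        generator) whose return value becomes the result of ``await``."""
        def __init__(self, value):
            self.value = value

        def __await__(self):
            if False:              # a real Future would suspend here
                yield
            return self.value

    async def use_it():
        return await Ready(42)

    coro = use_it()
    try:
        coro.send(None)
    except StopIteration as exc:
        assert exc.value == 42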
Updated operator precedence table
'''''''''''''''''''''''''''''''''
``await`` keyword is defined as follows::
power ::= await ["**" u_expr]
await ::= ["await"] primary
where "primary" represents the most tightly bound operations of the
language. Its syntax is::
primary ::= atom | attributeref | subscription | slicing | call
See Python Documentation [12]_ and `Grammar Updates`_ section of this
proposal for details.
The key difference between ``await`` and the ``yield`` and ``yield from``
operators is that *await expressions* do not require parentheses around
them most of the time.
Also, ``yield from`` allows any expression as its argument, including
expressions like ``yield from a() + b()``, which would be parsed as
``yield from (a() + b())`` and is almost always a bug. In general,
the result of any arithmetic operation is not an *awaitable* object.
To avoid this kind of mistake, it was decided to make the precedence of
``await`` lower than that of ``[]``, ``()``, and ``.``, but higher than
that of ``**``.
+------------------------------+-----------------------------------+
| Operator | Description |
+==============================+===================================+
| ``yield`` ``x``, | Yield expression |
| ``yield from`` ``x`` | |
+------------------------------+-----------------------------------+
| ``lambda`` | Lambda expression |
+------------------------------+-----------------------------------+
| ``if`` -- ``else`` | Conditional expression |
+------------------------------+-----------------------------------+
| ``or`` | Boolean OR |
+------------------------------+-----------------------------------+
| ``and`` | Boolean AND |
+------------------------------+-----------------------------------+
| ``not`` ``x`` | Boolean NOT |
+------------------------------+-----------------------------------+
| ``in``, ``not in``, | Comparisons, including membership |
| ``is``, ``is not``, ``<``, | tests and identity tests |
| ``<=``, ``>``, ``>=``, | |
| ``!=``, ``==`` | |
+------------------------------+-----------------------------------+
| ``|`` | Bitwise OR |
+------------------------------+-----------------------------------+
| ``^`` | Bitwise XOR |
+------------------------------+-----------------------------------+
| ``&`` | Bitwise AND |
+------------------------------+-----------------------------------+
| ``<<``, ``>>`` | Shifts |
+------------------------------+-----------------------------------+
| ``+``, ``-`` | Addition and subtraction |
+------------------------------+-----------------------------------+
| ``*``, ``@``, ``/``, ``//``, | Multiplication, matrix |
| ``%`` | multiplication, division, |
| | remainder |
+------------------------------+-----------------------------------+
| ``+x``, ``-x``, ``~x`` | Positive, negative, bitwise NOT |
+------------------------------+-----------------------------------+
| ``**`` | Exponentiation |
+------------------------------+-----------------------------------+
| ``await`` ``x`` | Await expression |
+------------------------------+-----------------------------------+
| ``x[index]``, | Subscription, slicing, |
| ``x[index:index]``, | call, attribute reference |
| ``x(arguments...)``, | |
| ``x.attribute`` | |
+------------------------------+-----------------------------------+
| ``(expressions...)``, | Binding or tuple display, |
| ``[expressions...]``, | list display, |
| ``{key: value...}``, | dictionary display, |
| ``{expressions...}`` | set display |
+------------------------------+-----------------------------------+
Examples of "await" expressions
'''''''''''''''''''''''''''''''
Valid syntax examples:
================================== ==================================
Expression Will be parsed as
================================== ==================================
``if await fut: pass`` ``if (await fut): pass``
``if await fut + 1: pass`` ``if (await fut) + 1: pass``
``pair = await fut, 'spam'`` ``pair = (await fut), 'spam'``
``with await fut, open(): pass`` ``with (await fut), open(): pass``
``await foo()['spam'].baz()()`` ``await ( foo()['spam'].baz()() )``
``return await coro()`` ``return ( await coro() )``
``res = await coro() ** 2`` ``res = (await coro()) ** 2``
``func(a1=await coro(), a2=0)`` ``func(a1=(await coro()), a2=0)``
``await foo() + await bar()`` ``(await foo()) + (await bar())``
``-await foo()`` ``-(await foo())``
================================== ==================================
Invalid syntax examples:
================================== ==================================
Expression Should be written as
================================== ==================================
``await await coro()`` ``await (await coro())``
``await -coro()`` ``await (-coro())``
================================== ==================================
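These parse rules can be checked with a small, hand-driven sketch (not
part of the PEP's text; the coroutine names are invented)::

    async def two():
        return 2

    async def demo():
        res = await two() ** 2      # parsed as (await two()) ** 2
        assert res == 4
        return -await two()         # parsed as -(await two())

    coro = demo()
    try:
        coro.send(None)
    except StopIteration as exc:
        assert exc.value == -2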
Asynchronous Context Managers and "async with"
----------------------------------------------
An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.
To make this possible, a new protocol for asynchronous context managers
is proposed. Two new magic methods are added: ``__aenter__`` and
``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')
New Syntax
''''''''''
A new statement for asynchronous context managers is proposed::
    async with EXPR as VAR:
        BLOCK
which is semantically equivalent to::
    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        VAR = await aenter
        BLOCK
    except:
        if not await aexit(mgr, *sys.exc_info()):
            raise
    else:
        await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
to use ``async with`` outside of an ``async def`` function.
Example
'''''''
With *asynchronous context managers* it is easy to implement proper
database transaction managers for coroutines::
    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...
Code that needs locking also looks lighter::
    async with lock:
        ...

instead of::

    with (yield from lock):
        ...
Asynchronous Iterators and "async for"
--------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method. To support asynchronous
iteration:
1. An object must implement an ``__aiter__`` method returning an
*awaitable* resulting in an *asynchronous iterator object*.
2. An *asynchronous iterator object* must implement an ``__anext__``
method returning an *awaitable*.
3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
exception.
An example of asynchronous iterable::
    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...
New Syntax
''''''''''
A new statement for iterating through asynchronous iterators is
proposed::
    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2
which is semantically equivalent to::
    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2
It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
method to ``async for``. It is a ``SyntaxError`` to use ``async for``
outside of an ``async def`` function.
As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.
Example 1
'''''''''
With asynchronous iteration protocol it is possible to asynchronously
buffer data during iteration::
    async for data in cursor:
        ...
Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.
The following code illustrates new asynchronous iteration protocol::
    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
            if not self.buffer:
                raise StopAsyncIteration
            return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
    async for row in Cursor():
        print(row)
which would be equivalent to the following code::
    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)
Example 2
'''''''''
The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.
::
    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for letter in AsyncIteratorWrapper("abc"):
        print(letter)
Why StopAsyncIteration?
'''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between
::
    def g1():
        yield from fut
        return 'spam'
and
::
    def g2():
        yield from fut
        raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``
::
    async def a1():
        await fut
        raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``. Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.
Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.
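For illustration (not part of the PEP's text), the wrapping can be
observed by driving a simplified coroutine by hand::

    async def bad():
        raise StopIteration('spam')     # wrapped per PEP 479 semantics

    coro = bad()
    try:
        coro.send(None)
    except RuntimeError as exc:
        print('wrapped:', exc)          # e.g. "coroutine raised StopIteration"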
Coroutine objects
-----------------
Differences from generators
'''''''''''''''''''''''''''
This section applies only to *native coroutines* with
``CO_NATIVE_COROUTINE`` flag, i.e. defined with the new ``async def``
syntax.
**The behavior of existing *generator-based coroutines* in asyncio
remains unchanged.**
Great effort has been made to make sure that coroutines and
generators are treated as distinct concepts:
1. *Native coroutine objects* do not implement ``__iter__`` and
``__next__`` methods. Therefore, they cannot be iterated over or
passed to ``iter()``, ``list()``, ``tuple()`` and other built-ins.
They also cannot be used in a ``for..in`` loop.
An attempt to use ``__iter__`` or ``__next__`` on a *native
coroutine object* will result in a ``TypeError``.
2. *Plain generators* cannot ``yield from`` *native coroutine objects*:
doing so will result in a ``TypeError``.
3. *generator-based coroutines* (for asyncio code must be decorated
with ``@asyncio.coroutine``) can ``yield from`` *native coroutine
objects*.
4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
return ``False`` for *native coroutine objects* and *native
coroutine functions*.
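The separation described in the list above can be observed directly; the
following is an illustrative sketch (not part of the PEP's text), assuming
an interpreter implementing this PEP::

    async def coro():
        return 1

    c = coro()

    # 1. Native coroutine objects are not iterable.
    try:
        iter(c)
    except TypeError as exc:
        print(exc)            # e.g. "'coroutine' object is not iterable"

    def plain_gen():
        # 2. Plain generators cannot ``yield from`` a coroutine object.
        yield from c

    try:
        next(plain_gen())
    except TypeError as exc:
        print(exc)

    c.close()                 # avoid the "never awaited" RuntimeWarning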
Coroutine object methods
''''''''''''''''''''''''
Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutine objects have
``throw()``, ``send()`` and ``close()`` methods. ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380,
and Python Documentation [11]_ for details.
The ``throw()`` and ``send()`` methods of coroutine objects are used to push
values and raise errors into *Future-like* objects.
Debugging Features
------------------
A common beginner mistake is forgetting to use ``yield from`` on
coroutines::
    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'
For debugging this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions with a
special object whose destructor logs a warning. Whenever a wrapped
generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorated function
was defined, a stack trace of where it was collected, etc. The wrapper
object also provides a convenient ``__repr__`` function with detailed
information about the generator.
The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, the ``@coroutine``
decorator makes the decision of whether or not to wrap based on
an OS environment variable, ``PYTHONASYNCIODEBUG``. This way it is
possible to run asyncio programs with asyncio's own functions
instrumented. ``EventLoop.set_debug``, a different debug facility, has
no impact on the ``@coroutine`` decorator's behavior.
With this proposal, coroutines are a native concept, distinct from
generators. *In addition* to a ``RuntimeWarning`` being raised on
coroutines that were never awaited, it is proposed to add two new
functions to the ``sys`` module: ``set_coroutine_wrapper`` and
``get_coroutine_wrapper``. This is to enable advanced debugging
facilities in asyncio and other frameworks (such as displaying where
exactly a coroutine was created, and a more detailed stack trace of where
it was garbage collected).
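As an illustration (not part of the PEP's text), a framework could use the
proposed hook roughly as follows; the wrapper below only records qualified
names, and ``sys.set_coroutine_wrapper`` is assumed to exist as proposed
(it shipped in CPython 3.5-3.7)::

    import sys

    created = []

    def tracking_wrapper(coro):
        # Called for every coroutine object created in this thread while
        # the wrapper is set; a debugger could record a traceback here.
        created.append(coro.__qualname__)
        return coro                      # return it unchanged (or a proxy)

    sys.set_coroutine_wrapper(tracking_wrapper)
    try:
        async def job():
            return 1
        job().close()                    # create one coroutine, then close it
    finally:
        sys.set_coroutine_wrapper(None)  # reset the wrapper

    print(created)                       # ['job']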
New Standard Library Functions
------------------------------
* ``types.coroutine(gen)``. See `types.coroutine()`_ section for
details.
* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
*coroutine object*.
* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
*coroutine function*.
* ``inspect.isawaitable(obj)`` returns ``True`` if ``obj`` can be used
in ``await`` expression. See `Await Expression`_ for details.
* ``sys.set_coroutine_wrapper(wrapper)`` allows intercepting the creation
of *coroutine objects*. ``wrapper`` must be either a callable that accepts
one argument (a *coroutine object*), or ``None``. ``None`` resets the
wrapper. If called twice, the new wrapper replaces the previous one.
The function is thread-specific. See `Debugging Features`_ for more
details.
* ``sys.get_coroutine_wrapper()`` returns the current wrapper object.
Returns ``None`` if no wrapper was set. The function is
thread-specific. See `Debugging Features`_ for more details.
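For illustration (not part of the PEP's text; the class and function names
are invented), the new introspection helpers behave as follows::

    import inspect, types

    async def native():
        return 1

    @types.coroutine
    def legacy():
        yield

    class FutureLike:
        def __await__(self):
            return iter(())

    assert inspect.iscoroutinefunction(native)
    c = native()
    assert inspect.iscoroutine(c)
    assert inspect.isawaitable(c)
    assert inspect.isawaitable(legacy())        # generator-based coroutine
    assert inspect.isawaitable(FutureLike())    # has ``__await__``
    assert not inspect.iscoroutine(FutureLike())
    c.close()                                   # silence "never awaited"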
Glossary
========
:Native coroutine:
A coroutine function declared with ``async def``. It uses
``await`` and ``return value``; see `New Coroutine Declaration
Syntax`_ for details.
:Native coroutine object:
Returned from a native coroutine function. See `Await Expression`_
for details.
:Generator-based coroutine:
Coroutines based on generator syntax. The most common examples are
functions decorated with ``@asyncio.coroutine``.
:Generator-based coroutine object:
Returned from a generator-based coroutine function.
:Coroutine:
Either *native coroutine* or *generator-based coroutine*.
:Coroutine object:
Either *native coroutine object* or *generator-based coroutine
object*.
:Future-like object:
An object with an ``__await__`` method, or a C object with
``tp_await`` function, returning an iterator. Can be consumed by
an ``await`` expression in a coroutine. A coroutine waiting for a
Future-like object is suspended until the Future-like object's
``__await__`` completes, and returns the result. See `Await
Expression`_ for details.
:Awaitable:
A *Future-like* object or a *coroutine object*. See `Await
Expression`_ for details.
:Asynchronous context manager:
An asynchronous context manager has ``__aenter__`` and ``__aexit__``
methods and can be used with ``async with``. See `Asynchronous
Context Managers and "async with"`_ for details.
:Asynchronous iterable:
An object with an ``__aiter__`` method, which must return an
*asynchronous iterator* object. Can be used with ``async for``.
See `Asynchronous Iterators and "async for"`_ for details.
:Asynchronous iterator:
An asynchronous iterator has an ``__anext__`` method. See
`Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================
================= =================================== =================
Method Can contain Can't contain
================= =================================== =================
async def func await, return value yield, yield from
async def __a*__ await, return value yield, yield from
def __a*__ return awaitable await
def __await__ yield, yield from, return iterable await
generator yield, yield from, return value await
================= =================================== =================
Where:
* "async def func": native coroutine;
* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined with the ``async`` keyword;
* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined without the ``async`` keyword, must return an
*awaitable*;
* "def __await__": ``__await__`` method to implement *Future-like*
objects;
* generator: a "regular" generator, a function defined with ``def``
that contains at least one ``yield`` or ``yield from`` expression.
Transition Plan
===============
To avoid backwards compatibility issues with ``async`` and ``await``
keywords, it was decided to modify ``tokenizer.c`` in such a way that
it:
* recognizes ``async def`` ``NAME`` tokens combination;
* keeps track of regular ``def`` and ``async def`` indented blocks;
* while tokenizing ``async def`` block, it replaces ``'async'``
``NAME`` token with ``ASYNC``, and ``'await'`` ``NAME`` token with
``AWAIT``;
* while tokenizing ``def`` block, it yields ``'async'`` and ``'await'``
``NAME`` tokens as is.
This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.
An example of having "async def" and "async" attribute in one piece of
code::
    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
This proposal preserves 100% backwards compatibility.
asyncio
'''''''
The ``asyncio`` module was adapted and tested to work with coroutines and
the new statements. Backwards compatibility is 100% preserved, i.e. all
existing code will work as-is.
The required changes are mainly:
1. Modify the ``@asyncio.coroutine`` decorator to use the new
``types.coroutine()`` function.
2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
3. Add ``ensure_future()`` as an alias for ``async()`` function.
Deprecate ``async()`` function.
asyncio migration strategy
''''''''''''''''''''''''''
Because *plain generators* cannot ``yield from`` *native coroutine
objects* (see `Differences from generators`_ section for more details),
it is advised to make sure that all generator-based coroutines are
decorated with ``@asyncio.coroutine`` *before* starting to use the new
syntax.
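An illustrative sketch of this migration order (not part of the PEP's
text; ``native()`` and ``legacy()`` are invented names, and a 3.5-era
asyncio is assumed)::

    import asyncio

    async def native():
        return 'ok'

    @asyncio.coroutine            # makes ``yield from native()`` legal
    def legacy():
        result = yield from native()
        return result

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(legacy()))    # -> ok

Without the decorator, the ``yield from`` line would raise a ``TypeError``
as described in `Differences from generators`_.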
async/await in CPython code base
''''''''''''''''''''''''''''''''
There is no use of ``await`` as a name in CPython.
``async`` is mostly used by asyncio. We are addressing this by
renaming the ``async()`` function to ``ensure_future()`` (see the `asyncio`_
section for details.)
Another use of the ``async`` keyword is in ``Lib/xml/dom/xmlbuilder.py``,
which defines an ``async = False`` attribute for the ``DocumentLS`` class.
There is no documentation or tests for it, and it is not used anywhere else
in CPython. It is replaced with a getter that raises a
``DeprecationWarning``, advising use of the ``async_`` attribute instead.
The 'async' attribute is not documented and is not used in the CPython code
base.
Grammar Updates
---------------
Grammar changes are fairly minimal::
decorated: decorators (classdef | funcdef | async_funcdef)
async_funcdef: ASYNC funcdef
compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
| funcdef | classdef | decorated | async_stmt)
async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
power: atom_expr ['**' factor]
atom_expr: [AWAIT] atom trailer*
Transition Period Shortcomings
------------------------------
There is just one.
Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** as the ``def`` keyword::
    # async and await will always be parsed as variables

    async def outer():  # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)  # 2
Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can be easily rewritten
in a more readable form::
    async def outer():  # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():  # 2
        return (await fut)
This limitation will go away as soon as ``async`` and ``await`` are
proper keywords.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.
Design Considerations
=====================
PEP 3152
--------
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions"). Some key points:
1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is
always a generator, even if there are no ``cocall`` expressions
inside it. Maps to ``async def`` in this proposal.
2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
inside a *cofunction*. Maps to ``await`` in this proposal (with
some differences, see below.)
3. It is not possible to call a *cofunction* without a ``cocall``
keyword.
4. ``cocall`` grammatically requires parentheses after it::
atom: cocall | <existing alternatives for atom>
cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
cotrailer: '[' subscriptlist ']' | '.' NAME
5. ``cocall f(*args, **kwds)`` is semantically equivalent to
``yield from f.__cocall__(*args, **kwds)``.
Differences from this proposal:
1. There is no equivalent of ``__cocall__`` in this PEP; in PEP 3152 it is
called and its result is passed to ``yield from`` in the ``cocall``
expression. The ``await`` keyword expects an *awaitable* object,
validates the type, and executes ``yield from`` on it. The
``__await__`` method is similar to ``__cocall__``, but is only used
to define *Future-like* objects.
2. ``await`` is defined in almost the same way as ``yield from`` in the
grammar (it is later enforced that ``await`` can only be inside
``async def``). It is possible to simply write ``await future``,
whereas ``cocall`` always requires parentheses.
3. To make asyncio work with PEP 3152 it would be required to modify
``@asyncio.coroutine`` decorator to wrap all functions in an object
with a ``__cocall__`` method, or to implement ``__cocall__`` on
generators. To call *cofunctions* from existing generator-based
coroutines it would be required to use ``costart(cofunc, *args,
**kwargs)`` built-in.
4. Since it is impossible to call a *cofunction* without a ``cocall``
keyword, it automatically prevents the common mistake of forgetting
to use ``yield from`` on generator-based coroutines. This proposal
addresses this problem with a different approach, see `Debugging
Features`_.
5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
is that if it is decided to implement coroutine-generators --
coroutines with ``yield`` or ``async yield`` expressions -- we
wouldn't need a ``cocall`` keyword to call them. So we'll end up
having ``__cocall__`` and no ``__call__`` for regular coroutines,
and having ``__call__`` and no ``__cocall__`` for coroutine-
generators.
6. Requiring parentheses grammatically also introduces a whole lot
of new problems.
The following code::
    await fut
    await function_returning_future()
    await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))

would look like::

    cocall fut()  # or cocall costart(fut)
    cocall (function_returning_future())()
    cocall asyncio.gather(costart(coro1, arg1, arg2),
                          costart(coro2, arg1, arg2))
7. There are no equivalents of ``async for`` and ``async with`` in PEP
3152.
Coroutine-generators
--------------------
With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions. To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and
``async yield from`` would raise a ``StopAsyncIteration`` exception.
While it is possible to implement coroutine-generators, we believe that
they are out of the scope of this proposal. It is an advanced concept that
should be carefully considered and balanced, and it would require
non-trivial changes in the implementation of current generator objects.
This is a matter for a separate PEP.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:
* C# has had it for a long time [5]_;
* proposal to add async/await in ECMAScript 7 [2]_;
see also Traceur project [9]_;
* Facebook's Hack/HHVM [6]_;
* Google's Dart language [7]_;
* Scala [8]_;
* proposal to add async/await to C++ [10]_;
* and many other less popular languages.
This is a huge benefit: some users already have experience with
async/await, and it makes working with many languages in one
project easier (Python with ECMAScript 7, for instance).
Why "__aiter__" returns awaitable
---------------------------------
In principle, ``__aiter__`` could be a regular function. There are
several good reasons to make it a coroutine:
* since most ``__anext__``, ``__aenter__``, and ``__aexit__``
methods are coroutines, users would often mistakenly define
``__aiter__`` as ``async`` anyway;
* there might be a need to run some asynchronous operations in
``__aiter__``, for instance to prepare DB queries or perform some
file operations.
Importance of "async" keyword
-----------------------------
While it is possible to just implement the ``await`` expression and treat
all functions with at least one ``await`` as coroutines, this approach
would make API design, code refactoring and long-term support harder.
Let's pretend that Python only has ``await`` keyword::
    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()
If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken. To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced.
Why "async def"
---------------
For some people bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``. It is certainly easier to
type. But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.
Why not "await for" and "await with"
------------------------------------
``async`` is an adjective, and hence it is a better choice for a
*statement qualifier* keyword. ``await for/with`` would imply that
something is awaiting the completion of a ``for`` or ``with``
statement.
Why "async def" and not "def async"
-----------------------------------
The ``async`` keyword is a *statement qualifier*. A good analogy is the
"static", "public", and "unsafe" keywords from other languages. "async
for" is an asynchronous "for" statement, "async with" is an
asynchronous "with" statement, and "async def" is an asynchronous function.
Having "async" after the main statement keyword might introduce some
confusion; for example, "for async item in iterator" could be read as "for
each asynchronous item in iterator".
Having ``async`` keyword before ``def``, ``with`` and ``for`` also
makes the language grammar simpler. And "async def" better separates
coroutines from regular functions visually.
Why not a __future__ import
---------------------------
The `Transition Plan`_ section explains how the tokenizer is modified to
treat ``async`` and ``await`` as keywords *only* in ``async def`` blocks.
Hence ``async def`` fills the role that a module-level compiler
declaration like ``from __future__ import async_await`` would otherwise
fill.
Why magic methods start with "a"
--------------------------------
New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal is to use an "async" prefix, so that ``__aiter__``
becomes ``__async_iter__``. However, to align the new magic methods with
the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided
to use the shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an ``async``
keyword to their declarations::
    class CM:
        async def __enter__(self):  # instead of __aenter__
            ...
This approach has the following downsides:
* it would not be possible to create an object that works in both
``with`` and ``async with`` statements;
* it would break backwards compatibility, as nothing prohibits
returning Future-like objects from ``__enter__`` and/or
``__exit__`` in Python <= 3.4;
* one of the main points of this proposal is to make native coroutines
as simple and foolproof as possible, hence the clear separation of
the protocols.
Why not reuse existing "for" and "with" statements
--------------------------------------------------
The vision behind existing generator-based coroutines and this proposal
is to make it easy for users to see where the code might be suspended.
Making existing "for" and "with" statements to recognize asynchronous
iterators and context managers will inevitably create implicit suspend
points, making it harder to reason about the code.
Comprehensions
--------------
Syntax for asynchronous comprehensions could be provided, but this
construct is outside of the scope of this PEP.
Async lambda functions
----------------------
Syntax for asynchronous lambda functions could be provided, but this
construct is outside of the scope of this PEP.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is the
output of Python's official set of benchmarks [4]_:
::
python perf.py -r -b default ../cpython/python.exe
../cpython-aw/python.exe
[skipped]
Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
x86_64 i386
Total CPU cores: 8
### etree_iterparse ###
Min: 0.365359 -> 0.349168: 1.05x faster
Avg: 0.396924 -> 0.379735: 1.05x faster
Significant (t=9.71)
Stddev: 0.01225 -> 0.01277: 1.0423x larger
The following not significant results are hidden, use -v to show them:
django_v2, 2to3, etree_generate, etree_parse, etree_process,
fastpickle,
fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown in parsing Python files with the
modified tokenizer: parsing one 12 MB file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.
async/await
-----------
The following micro-benchmark was used to determine performance
difference between "async" functions and generators::
    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1-t0))
The result is that there is no observable performance difference.
Minimum timing of 3 runs
::
abinary(19) * 30: total 12.985s
binary(19) * 30: total 12.953s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await``
keyword.
2. New ``__await__`` method for Future-like objects, and new
``tp_await`` slot in ``PyTypeObject``.
3. New syntax for asynchronous context managers: ``async with``. And
associated protocol with ``__aenter__`` and ``__aexit__`` methods.
4. New syntax for asynchronous iteration: ``async for``. And
associated protocol with ``__aiter__``, ``__anext__`` and new built-in
exception ``StopAsyncIteration``.
5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
``Await``.
6. New functions: ``sys.set_coroutine_wrapper(callback)``,
``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
``inspect.iscoroutinefunction()``, ``inspect.iscoroutine()``,
and ``inspect.isawaitable()``.
7. New ``CO_COROUTINE`` and ``CO_NATIVE_COROUTINE`` bit flags for code
objects.
While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide users
with convenient and unambiguous APIs built on ``async def``,
``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be
tested.
::
import asyncio
async def echo_server():
print('Serving on localhost:8000')
await asyncio.start_server(handle_connection,
'localhost', 8000)
async def handle_connection(reader, writer):
print('New connection...')
while True:
data = await reader.read(8192)
if not data:
break
print('Sending {:.10}... back'.format(repr(data)))
writer.write(data)
loop = asyncio.get_event_loop()
loop.run_until_complete(echo_server())
try:
loop.run_forever()
finally:
loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9]
https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-func…
.. [10]
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
.. [11]
https://docs.python.org/3/reference/expressions.html#generator-iterator-met…
.. [12] https://docs.python.org/3/reference/expressions.html#primaries
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:
From the PEP:
> Why not a __future__ import
>
> __future__ imports are inconvenient and easy to forget to add.
That is a horrible rationale for not using an import. By that logic we
should have everything in built-ins. ;)
> Working example
> ...
The working example only uses async def and await, not async with
nor async for nor __aenter__, etc., etc.
Could you put in a more complete example -- maybe a basic chat room
with both server and client -- that demonstrated more of the new
possibilities?
Having gone through the PEP again, I am still no closer to understanding
what happens here:
data = await reader.read(8192)
What does the flow of control look like at the interpreter level?
--
~Ethan~
Hi python-dev,
New version of the PEP is attached. Summary of updates:
1. Terminology:
- *native coroutine* term is used for "async def" functions.
- *generator-based coroutine* term is used for PEP 380
coroutines used in asyncio.
- *coroutine* is used when *native coroutine* or
*generator based coroutine* can be used in the same
context.
I think that it's not really productive to discuss the
terminology that we use in the PEP. Its only purpose is
to disambiguate concepts used in the PEP. We should discuss
how we will name new 'async def' coroutines in Python
Documentation if the PEP is accepted. Although if you
notice that somewhere in the PEP it is not crystal clear
what "coroutine" means please give me a heads up!
2. Syntax of await expressions is now thoroughly defined
in the PEP. See "Await Expression", "Updated operator
precedence table", and "Examples of "await" expressions"
sections.
I like the current approach. I'm still not convinced
that we should make 'await' the same grammatically as
unary minus.
I don't understand why we should allow parsing things
like 'await -fut'; this should be a SyntaxError.
I'm fine to modify the grammar to allow 'await await fut'
syntax, though. And I'm open to discussion :)
3. CO_NATIVE_COROUTINE flag. This enables us to disable
__iter__ and __next__ on native coroutines while maintaining
full backwards compatibility.
4. asyncio / Migration strategy. Existing code can
be used with PEP 492 as is, everything will work as
expected.
However, since *plain generators* (not decorated with
@asyncio.coroutine) cannot 'yield from' native coroutines
(async def functions), it might break some code
*while adapting it to the new syntax*.
I'm open to just throw a RuntimeWarning in this case
in 3.5, and raise a TypeError in 3.6.
Please see the "Differences from generators" section of
the PEP.
5. inspect.isawaitable() function. And, all new functions
are now listed in the "New Standard Library Functions"
section.
6. Lots of small updates and tweaks throughout the PEP.
Thanks,
Yury
PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov(a)sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015
Abstract
========
This PEP introduces new syntax for coroutines, asynchronous ``with``
statements and ``for`` loops. The main motivation behind this proposal
is to streamline writing and maintaining asynchronous code, as well as
to simplify previously hard to implement code patterns.
Rationale and Goals
===================
Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the ``yield from`` syntax introduced in PEP
380. This approach has a number of shortcomings:
* it is easy to confuse coroutines with regular generators, since they
share the same syntax; async libraries often attempt to alleviate
this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
* it is not possible to natively define a coroutine which has no
``yield`` or ``yield from`` statements, again requiring the use of
decorators to fix potential refactoring issues;
* support for asynchronous calls is limited to expressions where
``yield`` is allowed syntactically, limiting the usefulness of
syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new ``async
with`` statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new ``async for``
statement makes it possible to perform asynchronous calls in iterators.
Specification
=============
This proposal introduces new syntax and semantics to enhance coroutine
support in Python.
This specification presumes knowledge of the implementation of
coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax
changes proposed here comes from the asyncio framework (PEP 3156) and
the "Cofunctions" proposal (PEP 3152, now rejected in favor of this
specification).
From this point in this document we use the word *native coroutine* to
refer to functions declared using the new syntax. *generator-based
coroutine* is used where necessary to refer to coroutines that are
based on generator syntax. *coroutine* is used in contexts where both
definitions are applicable.
New Coroutine Declaration Syntax
--------------------------------
The following new syntax is used to declare a *native coroutine*::
async def read_data(db):
pass
Key properties of *native coroutines*:
* ``async def`` functions are always coroutines, even if they do not
contain ``await`` expressions.
* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
expressions in an ``async`` function.
* Internally, two new code object flags were introduced:
- ``CO_COROUTINE`` is used to enable runtime detection of
*coroutines* (and migrating existing code).
- ``CO_NATIVE_COROUTINE`` is used to mark *native coroutines*
(defined with new syntax.)
All coroutines have ``CO_COROUTINE``, ``CO_NATIVE_COROUTINE``, and
``CO_GENERATOR`` flags set.
* Regular generators, when called, return a *generator object*;
similarly, coroutines return a *coroutine object*.
* ``StopIteration`` exceptions are not propagated out of coroutines,
and are replaced with a ``RuntimeError``. For regular generators
such behavior requires a future import (see PEP 479).
* See also `Coroutine objects`_ section.
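A short illustration of the properties above (a sketch against the
reference implementation; the exact type name printed may differ)::

    async def read_data(db):
        pass

    c = read_data(None)
    print(type(c))     # a coroutine object, not a plain generator object
    c.close()          # close it explicitly, since it is never awaited here

    # Under this proposal the following is a SyntaxError, because
    # "yield" is not allowed inside an "async def" function:
    #
    #     async def broken():
    #         yield 1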
types.coroutine()
-----------------
A new function ``coroutine(gen)`` is added to the ``types`` module. It
allows interoperability between existing *generator-based coroutines*
in asyncio and *native coroutines* introduced by this PEP.
The function applies ``CO_COROUTINE`` flag to generator-function's code
object, making it return a *coroutine object*.
The function can be used as a decorator, since it modifies generator-
functions in-place and returns them.
Note, that the ``CO_NATIVE_COROUTINE`` flag is not applied by
``types.coroutine()`` to make it possible to separate *native
coroutines* defined with new syntax, from *generator-based coroutines*.
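A minimal sketch of the intended interoperability (assuming the
semantics described above)::

    import types

    @types.coroutine
    def nop():
        # A generator-based coroutine: it yields once, the way a
        # Future suspends its caller.  The decorator marks its code
        # object so that the returned object is accepted by "await".
        yield

    async def caller():
        await nop()    # valid, because nop() is marked as a coroutine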
Await Expression
----------------
The following new ``await`` expression is used to obtain a result of
coroutine execution::
async def read_data(db):
data = await db.fetch('SELECT ...')
...
``await``, similarly to ``yield from``, suspends execution of
``read_data`` coroutine until ``db.fetch`` *awaitable* completes and
returns the result data.
It uses the ``yield from`` implementation with an extra step of
validating its argument. ``await`` only accepts an *awaitable*, which
can be one of:
* A *native coroutine object* returned from a *native coroutine*.
* A *generator-based coroutine object* returned from a generator
decorated with ``types.coroutine()``.
* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a
fundamental mechanism of how *Futures* are implemented. Since,
internally, coroutines are a special kind of generators, every
``await`` is suspended by a ``yield`` somewhere down the chain of
``await`` calls (please refer to PEP 3156 for a detailed
explanation.)
To enable this behavior for coroutines, a new magic method called
``__await__`` is added. In asyncio, for instance, to enable *Future*
objects in ``await`` statements, the only change is to add
``__await__ = __iter__`` line to ``asyncio.Future`` class.
Objects with ``__await__`` method are called *Future-like* objects in
the rest of this PEP.
Also, please note that ``__aiter__`` method (see its definition
below) cannot be used for this purpose. It is a different protocol,
and would be like using ``__iter__`` instead of ``__call__`` for
regular callables.
It is a ``TypeError`` if ``__await__`` returns anything but an
iterator.
* Objects defined with CPython C API with a ``tp_await`` function,
returning an iterator (similar to ``__await__`` method).
It is a ``SyntaxError`` to use ``await`` outside of an ``async def``
function (like it is a ``SyntaxError`` to use ``yield`` outside of
``def`` function.)
It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.
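To make the *Future-like* protocol concrete, here is a minimal sketch
(not asyncio's actual implementation) of an object that can be consumed
by ``await``::

    class ManualFuture:
        """A minimal Future-like object, for illustration only."""

        def __init__(self):
            self._result = None

        def set_result(self, value):
            self._result = value

        def __await__(self):
            # Must return an iterator; since this is a generator
            # function, calling it returns one.  Yielding suspends the
            # awaiting coroutine until whatever drives the outermost
            # coroutine resumes it.
            yield self
            return self._result

    async def consumer(fut):
        value = await fut      # suspends until the driver resumes fut
        return value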
Updated operator precedence table
'''''''''''''''''''''''''''''''''
``await`` keyword is defined as follows::
power ::= await ["**" u_expr]
await ::= ["await"] primary
where "primary" represents the most tightly bound operations of the
language. Its syntax is::
primary ::= atom | attributeref | subscription | slicing | call
See Python Documentation [12]_ and `Grammar Updates`_ section of this
proposal for details.
The key difference between ``await`` and the ``yield`` and ``yield
from`` operators is that *await expressions* do not require parentheses
around them most of the time.
Also, ``yield from`` allows any expression as its argument, including
expressions like ``yield from a() + b()``, that would be parsed as
``yield from (a() + b())``, which is almost always a bug. In general,
the result of any arithmetic operation is not an *awaitable* object.
To avoid this kind of mistake, it was decided to make ``await``
precedence lower than ``[]``, ``()``, and ``.``, but higher than ``**``
operators.
+------------------------------+-----------------------------------+
| Operator | Description |
+==============================+===================================+
| ``yield`` ``x``, | Yield expression |
| ``yield from`` ``x`` | |
+------------------------------+-----------------------------------+
| ``lambda`` | Lambda expression |
+------------------------------+-----------------------------------+
| ``if`` -- ``else`` | Conditional expression |
+------------------------------+-----------------------------------+
| ``or`` | Boolean OR |
+------------------------------+-----------------------------------+
| ``and`` | Boolean AND |
+------------------------------+-----------------------------------+
| ``not`` ``x`` | Boolean NOT |
+------------------------------+-----------------------------------+
| ``in``, ``not in``, | Comparisons, including membership |
| ``is``, ``is not``, ``<``, | tests and identity tests |
| ``<=``, ``>``, ``>=``, | |
| ``!=``, ``==`` | |
+------------------------------+-----------------------------------+
| ``|`` | Bitwise OR |
+------------------------------+-----------------------------------+
| ``^`` | Bitwise XOR |
+------------------------------+-----------------------------------+
| ``&`` | Bitwise AND |
+------------------------------+-----------------------------------+
| ``<<``, ``>>`` | Shifts |
+------------------------------+-----------------------------------+
| ``+``, ``-`` | Addition and subtraction |
+------------------------------+-----------------------------------+
| ``*``, ``@``, ``/``, ``//``, | Multiplication, matrix |
| ``%`` | multiplication, division, |
| | remainder |
+------------------------------+-----------------------------------+
| ``+x``, ``-x``, ``~x`` | Positive, negative, bitwise NOT |
+------------------------------+-----------------------------------+
| ``**`` | Exponentiation |
+------------------------------+-----------------------------------+
| ``await`` ``x`` | Await expression |
+------------------------------+-----------------------------------+
| ``x[index]``, | Subscription, slicing, |
| ``x[index:index]``, | call, attribute reference |
| ``x(arguments...)``, | |
| ``x.attribute`` | |
+------------------------------+-----------------------------------+
| ``(expressions...)``, | Binding or tuple display, |
| ``[expressions...]``, | list display, |
| ``{key: value...}``, | dictionary display, |
| ``{expressions...}`` | set display |
+------------------------------+-----------------------------------+
Examples of "await" expressions
'''''''''''''''''''''''''''''''
Valid syntax examples:
================================== ==================================
Expression Will be parsed as
================================== ==================================
``if await fut: pass`` ``if (await fut): pass``
``if await fut + 1: pass`` ``if (await fut) + 1: pass``
``pair = await fut, 'spam'`` ``pair = (await fut), 'spam'``
``with await fut, open(): pass`` ``with (await fut), open(): pass``
``await foo()['spam'].baz()()`` ``await ( foo()['spam'].baz()() )``
``return await coro()`` ``return ( await coro() )``
``res = await coro() ** 2`` ``res = (await coro()) ** 2``
``func(a1=await coro(), a2=0)`` ``func(a1=(await coro()), a2=0)``
``await foo() + await bar()`` ``(await foo()) + (await bar())``
``-await foo()`` ``-(await foo())``
================================== ==================================
Invalid syntax examples:
================================== ==================================
Expression Should be written as
================================== ==================================
``await await coro()`` ``await (await coro())``
``await -coro()`` ``await (-coro())``
================================== ==================================
Asynchronous Context Managers and "async with"
----------------------------------------------
An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.
To make this possible, a new protocol for asynchronous context managers
is proposed. Two new magic methods are added: ``__aenter__`` and
``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
class AsyncContextManager:
async def __aenter__(self):
await log('entering context')
async def __aexit__(self, exc_type, exc, tb):
await log('exiting context')
New Syntax
''''''''''
A new statement for asynchronous context managers is proposed::
async with EXPR as VAR:
BLOCK
which is semantically equivalent to::
mgr = (EXPR)
aexit = type(mgr).__aexit__
aenter = type(mgr).__aenter__(mgr)
exc = True
try:
VAR = await aenter
BLOCK
except:
if not await aexit(mgr, *sys.exc_info()):
raise
else:
await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
to use ``async with`` outside of an ``async def`` function.
Example
'''''''
With *asynchronous context managers* it is easy to implement proper
database transaction managers for coroutines::
async def commit(session, data):
...
async with session.transaction():
...
await session.update(data)
...
Code that needs locking also looks lighter::
async with lock:
...
instead of::
with (yield from lock):
...
Asynchronous Iterators and "async for"
--------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and *asynchronous iterator* can call
asynchronous code in its *next* method. To support asynchronous
iteration:
1. An object must implement an ``__aiter__`` method returning an
*awaitable* resulting in an *asynchronous iterator object*.
2. An *asynchronous iterator object* must implement an ``__anext__``
method returning an *awaitable*.
3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
exception.
An example of asynchronous iterable::
class AsyncIterable:
async def __aiter__(self):
return self
async def __anext__(self):
data = await self.fetch_data()
if data:
return data
else:
raise StopAsyncIteration
async def fetch_data(self):
...
New Syntax
''''''''''
A new statement for iterating through asynchronous iterators is
proposed::
async for TARGET in ITER:
BLOCK
else:
BLOCK2
which is semantically equivalent to::
iter = (ITER)
iter = await type(iter).__aiter__(iter)
running = True
while running:
try:
TARGET = await type(iter).__anext__(iter)
except StopAsyncIteration:
running = False
else:
BLOCK
else:
BLOCK2
It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
method to ``async for``. It is a ``SyntaxError`` to use ``async for``
outside of an ``async def`` function.
As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.
Example 1
'''''''''
With asynchronous iteration protocol it is possible to asynchronously
buffer data during iteration::
async for data in cursor:
...
Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.
The following code illustrates new asynchronous iteration protocol::
class Cursor:
def __init__(self):
self.buffer = collections.deque()
def _prefetch(self):
...
async def __aiter__(self):
return self
async def __anext__(self):
if not self.buffer:
self.buffer = await self._prefetch()
if not self.buffer:
raise StopAsyncIteration
return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
async for row in Cursor():
print(row)
which would be equivalent to the following code::
i = await Cursor().__aiter__()
while True:
try:
row = await i.__anext__()
except StopAsyncIteration:
break
else:
print(row)
Example 2
'''''''''
The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.
::
class AsyncIteratorWrapper:
def __init__(self, obj):
self._it = iter(obj)
async def __aiter__(self):
return self
async def __anext__(self):
try:
value = next(self._it)
except StopIteration:
raise StopAsyncIteration
return value
async for letter in AsyncIteratorWrapper("abc"):
print(letter)
Why StopAsyncIteration?
'''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between
::
def g1():
yield from fut
return 'spam'
and
::
def g2():
yield from fut
raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``
::
async def a1():
await fut
raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``. Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.
Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.
Coroutine objects
-----------------
Differences from generators
'''''''''''''''''''''''''''
This section applies only to *native coroutines* with
``CO_NATIVE_COROUTINE`` flag, i.e. defined with the new ``async def``
syntax.
**The behavior of existing *generator-based coroutines* in asyncio
remains unchanged.**
Great effort has been made to make sure that coroutines and
generators are treated as distinct concepts:
1. *Native coroutine objects* do not implement ``__iter__`` and
``__next__`` methods. Therefore, they cannot be iterated over or
passed to ``iter()``, ``list()``, ``tuple()`` and other built-ins.
They also cannot be used in a ``for..in`` loop.
An attempt to use ``__iter__`` or ``__next__`` on a *native
coroutine object* will result in a ``TypeError``.
2. *Plain generators* cannot ``yield from`` *native coroutine objects*:
doing so will result in a ``TypeError``.
3. *generator-based coroutines* (which, for asyncio code, must be
decorated with ``@asyncio.coroutine``) can ``yield from`` *native coroutine
objects*.
4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
return ``False`` for *native coroutine objects* and *native
coroutine functions*.
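A short illustration of the distinctions listed above (a sketch; the
exact error messages depend on the implementation)::

    async def coro():
        return 'spam'

    c = coro()

    # Native coroutine objects do not implement __iter__/__next__,
    # so each of the following raises TypeError:
    #
    #     iter(c)
    #     list(coro())
    #     for item in coro(): ...
    #
    # A plain, undecorated generator cannot delegate to one either:
    #
    #     def gen():
    #         yield from coro()   # TypeError once gen() is driven

    c.close()    # close the never-awaited coroutine object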
Coroutine object methods
''''''''''''''''''''''''
Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutine objects have
``throw()``, ``send()`` and ``close()`` methods. ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380,
and Python Documentation [11]_ for details.
``throw()``, ``send()`` methods for coroutine objects are used to push
values and raise errors into *Future-like* objects.
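A hand-driven sketch of these methods, using a trivial *Future-like*
object (this mirrors, in a simplified way, what an event loop does)::

    class Suspend:
        def __await__(self):
            # Yield a marker to whoever calls send(); the value sent
            # back becomes the result of the "await" expression.
            value = yield 'suspended'
            return value

    async def add_one():
        return (await Suspend()) + 1

    c = add_one()
    print(c.send(None))       # 'suspended' -- first suspension point
    try:
        c.send(41)            # resume; the coroutine then returns 42
    except StopIteration as exc:
        print(exc.value)      # 42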
Debugging Features
------------------
A common beginner mistake is forgetting to use ``yield from`` on
coroutines::
@asyncio.coroutine
def useful():
asyncio.sleep(1) # this will do nothing without 'yield from'
For debugging these kinds of mistakes there is a special debug mode in
asyncio, in which ``@coroutine`` decorator wraps all functions with a
special object with a destructor logging a warning. Whenever a wrapped
generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorated function
was defined, stack trace of where it was collected, etc. Wrapper
object also provides a convenient ``__repr__`` function with detailed
information about the generator.
The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, ``@coroutine``
decorator makes the decision of whether to wrap or not to wrap based on
an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is
possible to run asyncio programs with asyncio's own functions
instrumented. ``EventLoop.set_debug``, a different debug facility, has
no impact on ``@coroutine`` decorator's behavior.
With this proposal, coroutines are a native concept, distinct from
generators. New functions ``set_coroutine_wrapper`` and
``get_coroutine_wrapper`` are added to the ``sys`` module, with which
frameworks can provide advanced debugging facilities.
It is also important to make coroutines as fast and efficient as
possible; therefore, no debug features are enabled by default.
Example::
async def debug_me():
await asyncio.sleep(1)
def async_debug_wrap(generator):
return asyncio.CoroWrapper(generator)
sys.set_coroutine_wrapper(async_debug_wrap)
debug_me() # <- this line will likely GC the coroutine object and
# trigger asyncio.CoroWrapper's code.
assert isinstance(debug_me(), asyncio.CoroWrapper)
sys.set_coroutine_wrapper(None) # <- this unsets any
# previously set wrapper
assert not isinstance(debug_me(), asyncio.CoroWrapper)
New Standard Library Functions
------------------------------
* ``types.coroutine(gen)``. See `types.coroutine()`_ section for
details.
* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
*coroutine object*.
* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
*coroutine function*.
* ``inspect.isawaitable(obj)`` returns ``True`` if ``obj`` can be used
in ``await`` expression. See `Await Expression`_ for details.
* ``sys.set_coroutine_wrapper(wrapper)`` allows intercepting the creation
of *coroutine objects*. ``wrapper`` must be a callable that accepts one
argument (a *coroutine object*), or ``None``. ``None`` resets the
wrapper. If called twice, the new wrapper replaces the previous one.
See `Debugging Features`_ for more details.
* ``sys.get_coroutine_wrapper()`` returns the current wrapper object.
Returns ``None`` if no wrapper was set. See `Debugging Features`_
for more details.
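A quick sketch of how these helpers might be used together (assuming
the reference implementation)::

    import inspect
    import types

    async def native():
        pass

    @types.coroutine
    def generator_based():
        yield

    c = native()
    print(inspect.iscoroutinefunction(native))       # True
    print(inspect.iscoroutine(c))                    # True
    print(inspect.isawaitable(c))                    # True
    print(inspect.isawaitable(generator_based()))    # True (marked generator)
    c.close()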
Glossary
========
:Native coroutine:
A coroutine function declared with ``async def``. It uses
``await`` and ``return value``; see `New Coroutine Declaration
Syntax`_ for details.
:Native coroutine object:
Returned from a native coroutine function. See `Await Expression`_
for details.
:Generator-based coroutine:
Coroutines based on generator syntax. The most common examples are
functions decorated with ``@asyncio.coroutine``.
:Generator-based coroutine object:
Returned from a generator-based coroutine function.
:Coroutine:
Either *native coroutine* or *generator-based coroutine*.
:Coroutine object:
Either *native coroutine object* or *generator-based coroutine
object*.
:Future-like object:
An object with an ``__await__`` method, or a C object with
``tp_await`` function, returning an iterator. Can be consumed by
an ``await`` expression in a coroutine. A coroutine waiting for a
Future-like object is suspended until the Future-like object's
``__await__`` completes, and returns the result. See `Await
Expression`_ for details.
:Awaitable:
A *Future-like* object or a *coroutine object*. See `Await
Expression`_ for details.
:Asynchronous context manager:
An asynchronous context manager has ``__aenter__`` and ``__aexit__``
methods and can be used with ``async with``. See `Asynchronous
Context Managers and "async with"`_ for details.
:Asynchronous iterable:
An object with an ``__aiter__`` method, which must return an
*asynchronous iterator* object. Can be used with ``async for``.
See `Asynchronous Iterators and "async for"`_ for details.
:Asynchronous iterator:
An asynchronous iterator has an ``__anext__`` method. See
`Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================
================= =================================== =================
Method Can contain Can't contain
================= =================================== =================
async def func await, return value yield, yield from
async def __a*__ await, return value yield, yield from
def __a*__ return awaitable await
def __await__ yield, yield from, return iterable await
generator yield, yield from, return value await
================= =================================== =================
Where:
* "async def func": native coroutine;
* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined with the ``async`` keyword;
* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined without the ``async`` keyword, must return an
*awaitable*;
* "def __await__": ``__await__`` method to implement *Future-like*
objects;
* generator: a "regular" generator, function defined with ``def`` and
which contains a least one ``yield`` or ``yield from`` expression.
Transition Plan
===============
To avoid backwards compatibility issues with ``async`` and ``await``
keywords, it was decided to modify ``tokenizer.c`` in such a way that
it:
* recognizes ``async def`` name tokens combination (start of a
native coroutine);
* keeps track of regular functions and native coroutines;
* replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with
``AWAIT`` when in the process of yielding tokens for native
coroutines.
This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.
An example of having "async def" and "async" attribute in one piece of
code::
class Spam:
async = 42
async def ham():
print(getattr(Spam, 'async'))
# The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
This proposal preserves 100% backwards compatibility.
asyncio
-------
``asyncio`` module was adapted and tested to work with coroutines and
new statements. Backwards compatibility is 100% preserved, i.e. all
existing code will work as-is.
The required changes are mainly:
1. Modify the ``@asyncio.coroutine`` decorator to use the new
``types.coroutine()`` function.
2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
3. Add ``ensure_task()`` as an alias for ``async()`` function.
Deprecate ``async()`` function.
Migration strategy
''''''''''''''''''
Because *plain generators* cannot ``yield from`` *native coroutine
objects* (see `Differences from generators`_ section for more details),
it is advised to make sure that all generator-based coroutines are
decorated with ``@asyncio.coroutine`` *before* starting to use the new
syntax.
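For example (a sketch; ``new_style`` and ``old_style`` are hypothetical
names)::

    import asyncio

    async def new_style():
        return 42

    # A plain generator cannot delegate to a native coroutine; this
    # raises TypeError once the generator is actually driven:
    def old_style_broken():
        return (yield from new_style())

    # Decorated, the same generator interoperates with native
    # coroutines, so decorate *before* switching to the new syntax:
    @asyncio.coroutine
    def old_style():
        return (yield from new_style())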
Grammar Updates
---------------
Grammar changes are also fairly minimal::
decorated: decorators (classdef | funcdef | async_funcdef)
async_funcdef: ASYNC funcdef
compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
| funcdef | classdef | decorated | async_stmt)
async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
power: atom_expr ['**' factor]
atom_expr: [AWAIT] atom trailer*
Transition Period Shortcomings
------------------------------
There is just one.
Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** as the ``def`` keyword::
# async and await will always be parsed as variables
async def outer(): # 1
def nested(a=(await fut)):
pass
async def foo(): return (await fut) # 2
Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can easily be rewritten
in a more readable form::
async def outer(): # 1
a_default = await fut
def nested(a=a_default):
pass
async def foo(): # 2
return (await fut)
This limitation will go away as soon as ``async`` and ``await`` are
proper keywords. Or if it's decided to use a future import for this
PEP.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.
Design Considerations
=====================
PEP 3152
--------
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions"). Some key points:
1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is
always a generator, even if there are no ``cocall`` expressions
inside it. Maps to ``async def`` in this proposal.
2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
inside a *cofunction*. Maps to ``await`` in this proposal (with
some differences, see below.)
3. It is not possible to call a *cofunction* without a ``cocall``
keyword.
4. ``cocall`` grammatically requires parentheses after it::
atom: cocall | <existing alternatives for atom>
cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
cotrailer: '[' subscriptlist ']' | '.' NAME
5. ``cocall f(*args, **kwds)`` is semantically equivalent to
``yield from f.__cocall__(*args, **kwds)``.
Differences from this proposal:
1. There is no equivalent of ``__cocall__`` in this PEP. In PEP 3152,
``__cocall__`` is called and its result is passed to ``yield from`` in
the ``cocall`` expression. The ``await`` keyword, by contrast, expects
an *awaitable* object, validates its type, and executes ``yield from``
on it. The ``__await__`` method is similar to ``__cocall__``, but it is
only used to define *Future-like* objects.
2. ``await`` is defined in almost the same way as ``yield from`` in the
grammar (it is later enforced that ``await`` can only be inside
``async def``). It is possible to simply write ``await future``,
whereas ``cocall`` always requires parentheses.
3. To make asyncio work with PEP 3152 it would be required to modify
``@asyncio.coroutine`` decorator to wrap all functions in an object
with a ``__cocall__`` method, or to implement ``__cocall__`` on
generators. To call *cofunctions* from existing generator-based
coroutines it would be required to use ``costart(cofunc, *args,
**kwargs)`` built-in.
4. Since it is impossible to call a *cofunction* without a ``cocall``
keyword, it automatically prevents the common mistake of forgetting
to use ``yield from`` on generator-based coroutines. This proposal
addresses this problem with a different approach, see `Debugging
Features`_.
5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
is that if it is decided to implement coroutine-generators --
coroutines with ``yield`` or ``async yield`` expressions -- we
wouldn't need a ``cocall`` keyword to call them. So we'll end up
having ``__cocall__`` and no ``__call__`` for regular coroutines,
and having ``__call__`` and no ``__cocall__`` for coroutine-
generators.
6. Requiring parentheses grammatically also introduces a whole lot
of new problems.
The following code::
await fut
await function_returning_future()
await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))
would look like::
cocall fut() # or cocall costart(fut)
cocall (function_returning_future())()
cocall asyncio.gather(costart(coro1, arg1, arg2),
costart(coro2, arg1, arg2))
7. There are no equivalents of ``async for`` and ``async with`` in PEP
3152.
Coroutine-generators
--------------------
With ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions. To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and
``async yield from`` would raise a ``StopAsyncIteration`` exception.
While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal. It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects. This is a matter
for a separate PEP.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:
* C# has had it for a long time [5]_;
* proposal to add async/await in ECMAScript 7 [2]_;
see also Traceur project [9]_;
* Facebook's Hack/HHVM [6]_;
* Google's Dart language [7]_;
* Scala [8]_;
* proposal to add async/await to C++ [10]_;
* and many other less popular languages.
This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).
Why "__aiter__" returns awaitable
---------------------------------
In principle, ``__aiter__`` could be a regular function. There are
several good reasons to make it a coroutine:
* since the ``__anext__``, ``__aenter__``, and ``__aexit__``
methods are usually coroutines, users would often make the mistake of
defining ``__aiter__`` as ``async`` anyway;
* there might be a need to run some asynchronous operations in
``__aiter__``, for instance to prepare DB queries or do some file
operation.
Importance of "async" keyword
-----------------------------
While it is possible to just implement the ``await`` expression and treat
all functions with at least one ``await`` as coroutines, this approach
makes API design, code refactoring and long-term support harder.
Let's pretend that Python only has ``await`` keyword::
def useful():
...
await log(...)
...
def important():
await useful()
If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken. To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced.
Why "async def"
---------------
For some people bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``. It is certainly easier to
type. But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.
Why not "await for" and "await with"
------------------------------------
``async`` is an adjective, and hence it is a better choice for a
*statement qualifier* keyword. ``await for/with`` would imply that
something is awaiting the completion of a ``for`` or ``with``
statement.
Why "async def" and not "def async"
-----------------------------------
The ``async`` keyword is a *statement qualifier*. Good analogies are the
"static", "public", and "unsafe" keywords from other languages. "async
for" is an asynchronous "for" statement, "async with" is an
asynchronous "with" statement, "async def" is an asynchronous function.
Having "async" after the main statement keyword might introduce some
confusion, like "for async item in iterator" can be read as "for each
asynchronous item in iterator".
Having ``async`` keyword before ``def``, ``with`` and ``for`` also
makes the language grammar simpler. And "async def" better separates
coroutines from regular functions visually.
Why not a __future__ import
---------------------------
``__future__`` imports are inconvenient and easy to forget to add.
Also, they are enabled for the whole source file. Consider that there
is a big project with a popular module named "async.py". With future
imports it is required to either import it using ``__import__()`` or
``importlib.import_module()`` calls, or to rename the module. The
proposed approach makes it possible to continue using old code and
modules without a hassle, while coming up with a migration plan for
future python versions.
Why magic methods start with "a"
--------------------------------
New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal is to use "async" prefix, so that ``__aiter__``
becomes ``__async_iter__``. However, to align new magic methods with
the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided
to use a shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an ``async``
keyword to their declarations::
class CM:
async def __enter__(self): # instead of __aenter__
...
This approach has the following downsides:
* it would not be possible to create an object that works in both
``with`` and ``async with`` statements (see the sketch after this
list);
* it would break backwards compatibility, as nothing prohibits
returning Future-like objects from ``__enter__`` and/or
``__exit__`` in Python <= 3.4;
* one of the main points of this proposal is to make native coroutines
as simple and foolproof as possible, hence the clear separation of
the protocols.
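The first point above is worth illustrating: with distinct names, a
single object can support both protocols (a sketch; the lock primitives
used here are hypothetical)::

    class Lock:
        # Usable in a plain "with" statement:
        def __enter__(self):
            self._acquire_blocking()    # hypothetical blocking acquire
            return self

        def __exit__(self, *exc_info):
            self._release()             # hypothetical release

        # ...and, because the names do not clash, also in "async with":
        async def __aenter__(self):
            await self._acquire_async() # hypothetical awaitable acquire
            return self

        async def __aexit__(self, *exc_info):
            self._release()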
Why not reuse existing "for" and "with" statements
--------------------------------------------------
The vision behind existing generator-based coroutines and this proposal
is to make it easy for users to see where the code might be suspended.
Making existing "for" and "with" statements to recognize asynchronous
iterators and context managers will inevitably create implicit suspend
points, making it harder to reason about the code.
Comprehensions
--------------
Syntax for asynchronous comprehensions could be provided, but this
construct is outside of the scope of this PEP.
Async lambda functions
----------------------
Syntax for asynchronous lambda functions could be provided, but this
construct is outside of the scope of this PEP.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is an
output of python's official set of benchmarks [4]_:
::
python perf.py -r -b default ../cpython/python.exe
../cpython-aw/python.exe
[skipped]
Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
x86_64 i386
Total CPU cores: 8
### etree_iterparse ###
Min: 0.365359 -> 0.349168: 1.05x faster
Avg: 0.396924 -> 0.379735: 1.05x faster
Significant (t=9.71)
Stddev: 0.01225 -> 0.01277: 1.0423x larger
The following not significant results are hidden, use -v to show them:
django_v2, 2to3, etree_generate, etree_parse, etree_process,
fastpickle,
fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown of parsing python files with the
modified tokenizer: parsing of one 12Mb file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.
async/await
-----------
The following micro-benchmark was used to determine performance
difference between "async" functions and generators::
import sys
import time
def binary(n):
if n <= 0:
return 1
l = yield from binary(n - 1)
r = yield from binary(n - 1)
return l + 1 + r
async def abinary(n):
if n <= 0:
return 1
l = await abinary(n - 1)
r = await abinary(n - 1)
return l + 1 + r
def timeit(gen, depth, repeat):
t0 = time.time()
for _ in range(repeat):
list(gen(depth))
t1 = time.time()
print('{}({}) * {}: total {:.3f}s'.format(
gen.__name__, depth, repeat, t1-t0))
The result is that there is no observable performance difference.
Minimum timing of 3 runs
::
abinary(19) * 30: total 12.985s
binary(19) * 30: total 12.953s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await``
keyword.
2. New ``__await__`` method for Future-like objects, and new
``tp_await`` slot in ``PyTypeObject``.
3. New syntax for asynchronous context managers: ``async with``. And
associated protocol with ``__aenter__`` and ``__aexit__`` methods.
4. New syntax for asynchronous iteration: ``async for``. And
associated protocol with ``__aiter__``, ``__anext__`` and new built-
in exception ``StopAsyncIteration``.
5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
``Await``.
6. New functions: ``sys.set_coroutine_wrapper(callback)``,
``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
``inspect.iscoroutinefunction()``, ``inspect.iscoroutine()``,
and ``inspect.isawaitable()``.
7. New ``CO_COROUTINE`` and ``CO_NATIVE_COROUTINE`` bit flags for code
objects.
While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used by frameworks and libraries to provide
users with convenient and unambiguous APIs built on ``async def``,
``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be
tested.
::
import asyncio
async def echo_server():
print('Serving on localhost:8000')
await asyncio.start_server(handle_connection,
'localhost', 8000)
async def handle_connection(reader, writer):
print('New connection...')
while True:
data = await reader.read(8192)
if not data:
break
print('Sending {:.10}... back'.format(repr(data)))
writer.write(data)
loop = asyncio.get_event_loop()
loop.run_until_complete(echo_server())
try:
loop.run_forever()
finally:
loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9]
https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-func…
.. [10]
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
.. [11]
https://docs.python.org/3/reference/expressions.html#generator-iterator-met…
.. [12] https://docs.python.org/3/reference/expressions.html#primaries
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:
For those interested in tracking the history of generators and coroutines
in Python, I just found out that PEP 342
<https://www.python.org/dev/peps/pep-0342/> (which introduced
send/throw/close and made "generators as coroutines" a mainstream Python
concept) harks back to PEP 288 <https://www.python.org/dev/peps/pep-0288/>,
which was rejected. PEP 288 also proposed some changes to generators. The
interesting bit though is in the references: there are two links to old
articles by …
[View More]David Mertz that describe using generators in state machines
and other interesting and unconventional applications of generators. All
these well predated PEP 342, so yield was a statement and could not receive
a value from the function calling next() -- communication was through a
shared class instance.
http://gnosis.cx/publish/programming/charming_python_b5.txt
http://gnosis.cx/publish/programming/charming_python_b7.txt
Enjoy!
--
--Guido van Rossum (python.org/~guido)
Everybody,
In order to save myself a major headache I'm hereby accepting PEP 492.
I've been following Yury's efforts carefully and I am fully confident that
we're doing the right thing here. There is only so much effort we can put
into clarifying terminology and explaining coroutines. Somebody should
write a tutorial. (I started to write one, but I ran out of time after just
describing basic yield.)
I've given Yury clear instructions to focus on how to proceed -- he's to
work with another …
[View More]core dev on getting the implementation ready in time for
beta 1 (scheduled for May 24, but I think the target date should be May 19).
The acceptance is provisional in the PEP 411 sense (stretching its meaning
to apply to language changes). That is, we reserve the right to change the
specification (or even withdraw it, in a worst-case scenario) until 3.6,
although I expect we won't need to do this except for some peripheral
issues (e.g. the backward compatibility flags).
I now plan to go back to PEP 484 (type hints). Fortunately in that case
there's not much *implementation* that will land (just the typing.py
module), but there's still a lot of language in the PEP that needs updating
(check the PEP 484 tracker <https://github.com/ambv/typehinting/issues>).
--
--Guido van Rossum (python.org/~guido)
Hi python-dev,
Another round of updates. Reference implementation
has been updated: https://github.com/1st1/cpython/tree/await
(includes all things from the below summary of updates +
tests).
Summary:
1. "PyTypeObject.tp_await" slot. Replaces "tp_reserved".
This is to enable implementation of Futures with C API.
Must return an iterator if implemented.
2. New grammar for "await" expressions, see
'Syntax of "await" expression' section
3. inspect.iscoroutine() and inspect.iscoroutineobjects()
functions.
4. Full separation of coroutines and generators.
This is a big one; let's discuss.
a) Coroutine objects raise TypeError (is NotImplementedError
better?) in their __iter__ and __next__. Therefore it's
not possible to pass them to iter(), tuple(), next() and
other similar functions that work with iterables.
b) Because of (a), for..in iteration also does not work
on coroutines anymore.
c) 'yield from' only accept coroutine objects from
generators decorated with 'types.coroutine'. That means
that existing asyncio generator-based coroutines will
happily yield from both coroutines and generators.
*But* every generator-based coroutine *must* be
decorated with `asyncio.coroutine()`. This is
potentially a backwards incompatible change.
d) inspect.isgenerator() and inspect.isgeneratorfunction()
return `False` for coroutine objects & coroutine functions.
e) Should we add a coroutine ABC (for cython etc)?
I, personally, think this is highly necessary. First,
separation of coroutines from generators is extremely
important. One day there won't be generator-based
coroutines, and we want to avoid any kind of confusion.
Second, we can only do this in 3.5; this kind of
semantics change won't ever be possible later.
asyncio recommends using @coroutine decorator, and most
projects that I've seen do use it. Also there is no
reason for people to use iter() and next() functions
on coroutines when writing asyncio code. I doubt that
this will cause serious backwards compatibility problems
(asyncio also has provisional status).
Thank you,
Yury
PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov(a)sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015
Abstract
========
This PEP introduces new syntax for coroutines, asynchronous ``with``
statements and ``for`` loops. The main motivation behind this proposal
is to streamline writing and maintaining asynchronous code, as well as
to simplify previously hard to implement code patterns.
Rationale and Goals
===================
Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the ``yield from`` syntax introduced in PEP
380. This approach has a number of shortcomings:
* it is easy to confuse coroutines with regular generators, since they
share the same syntax; async libraries often attempt to alleviate
this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
* it is not possible to natively define a coroutine which has no
``yield`` or ``yield from`` statements, again requiring the use of
decorators to fix potential refactoring issues;
* support for asynchronous calls is limited to expressions where
``yield`` is allowed syntactically, limiting the usefulness of
syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new ``async
with`` statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new ``async for``
statement makes it possible to perform asynchronous calls in iterators.
Specification
=============
This proposal introduces new syntax and semantics to enhance coroutine
support in Python, it does not change the internal implementation of
coroutines, which are still based on generators.
It is strongly suggested that the reader understands how coroutines are
implemented in Python (PEP 342 and PEP 380). It is also recommended to
read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions).
From this point in this document we use the word *coroutine* to refer
to functions declared using the new syntax. *generator-based
coroutine* is used where necessary to refer to coroutines that are
based on generator syntax.
New Coroutine Declaration Syntax
--------------------------------
The following new syntax is used to declare a coroutine::
async def read_data(db):
pass
Key properties of coroutines:
* ``async def`` functions are always coroutines, even if they do not
contain ``await`` expressions.
* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
expressions in an ``async`` function.
* Internally, a new code object flag - ``CO_COROUTINE`` - is introduced
to enable runtime detection of coroutines (and migrating existing
code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR``
flags set.
* Regular generators, when called, return a *generator object*;
similarly, coroutines return a *coroutine object*.
* ``StopIteration`` exceptions are not propagated out of coroutines,
and are replaced with a ``RuntimeError``. For regular generators
such behavior requires a future import (see PEP 479).
types.coroutine()
-----------------
A new function ``coroutine(gen)`` is added to the ``types`` module. It
applies ``CO_COROUTINE`` flag to the passed generator-function's code
object, making it return a *coroutine object* when called.
This feature enables an easy upgrade path for existing libraries.
Await Expression
----------------
The following new ``await`` expression is used to obtain a result of
coroutine execution::
async def read_data(db):
data = await db.fetch('SELECT ...')
...
``await``, similarly to ``yield from``, suspends execution of
``read_data`` coroutine until ``db.fetch`` *awaitable* completes and
returns the result data.
It uses the ``yield from`` implementation with an extra step of
validating its argument. ``await`` only accepts an *awaitable*, which
can be one of:
* A *coroutine object* returned from a *coroutine* or a generator
decorated with ``types.coroutine()``.
* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a
fundamental mechanism of how *Futures* are implemented. Since,
internally, coroutines are a special kind of generators, every
``await`` is suspended by a ``yield`` somewhere down the chain of
``await`` calls (please refer to PEP 3156 for a detailed
explanation.)
To enable this behavior for coroutines, a new magic method called
``__await__`` is added. In asyncio, for instance, to enable Future
objects in ``await`` statements, the only change is to add
``__await__ = __iter__`` line to ``asyncio.Future`` class.
Objects with ``__await__`` method are called *Future-like* objects in
the rest of this PEP.
Also, please note that ``__aiter__`` method (see its definition
below) cannot be used for this purpose. It is a different protocol,
and would be like using ``__iter__`` instead of ``__call__`` for
regular callables.
It is a ``TypeError`` if ``__await__`` returns anything but an
iterator.
* Objects defined with CPython C API with a ``tp_await`` function,
returning an iterator (similar to ``__await__`` method).
It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.
Syntax of "await" expression
''''''''''''''''''''''''''''
The ``await`` keyword is defined differently from ``yield`` and ``yield
from``. The main difference is that *await expressions* do not require
parentheses around them most of the time.
Examples::
    ================================== ==================================
    Expression                         Will be parsed as
    ================================== ==================================
    ``if await fut: pass``             ``if (await fut): pass``
    ``if await fut + 1: pass``         ``if (await fut) + 1: pass``
    ``pair = await fut, 'spam'``       ``pair = (await fut), 'spam'``
    ``with await fut, open(): pass``   ``with (await fut), open(): pass``
    ``await foo()['spam'].baz()()``    ``await ( foo()['spam'].baz()() )``
    ``return await coro()``            ``return ( await coro() )``
    ``res = await coro() ** 2``        ``res = (await coro()) ** 2``
    ``func(a1=await coro(), a2=0)``    ``func(a1=(await coro()), a2=0)``
    ================================== ==================================
See `Grammar Updates`_ section for details.
Asynchronous Context Managers and "async with"
----------------------------------------------
An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.
To make this possible, a new protocol for asynchronous context managers
is proposed. Two new magic methods are added: ``__aenter__`` and
``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')
New Syntax
''''''''''
A new statement for asynchronous context managers is proposed::
    async with EXPR as VAR:
        BLOCK
which is semantically equivalent to::
    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise
    finally:
        if exc:
            await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
to use ``async with`` outside of a coroutine.
Example
'''''''
With asynchronous context managers it is easy to implement proper
database transaction managers for coroutines::
    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...
Code that needs locking also looks lighter::
    async with lock:
        ...
instead of::
    with (yield from lock):
        ...
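An existing lock can be adapted to the new protocol by giving it (or a
small wrapper) ``__aenter__`` and ``__aexit__`` methods; below is a sketch,
not from the PEP, wrapping ``asyncio.Lock``::

    import asyncio

    class AsyncLockContext:
        def __init__(self, lock: asyncio.Lock):
            self._lock = lock

        async def __aenter__(self):
            await self._lock.acquire()
            return self._lock

        async def __aexit__(self, exc_type, exc, tb):
            self._lock.release()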
Asynchronous Iterators and "async for"
--------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method. To support asynchronous
iteration:
1. An object must implement an ``__aiter__`` method returning an
*awaitable* resulting in an *asynchronous iterator object*.
2. An *asynchronous iterator object* must implement an ``__anext__``
method returning an *awaitable*.
3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
exception.
An example of asynchronous iterable::
    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...
New Syntax
''''''''''
A new statement for iterating through asynchronous iterators is
proposed::
    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2
which is semantically equivalent to::
    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2
It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
method to ``async for``. It is a ``SyntaxError`` to use ``async for``
outside of a coroutine.
As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.
Example 1
'''''''''
With asynchronous iteration protocol it is possible to asynchronously
buffer data during iteration::
    async for data in cursor:
        ...
Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.
The following code illustrates new asynchronous iteration protocol::
    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
            if not self.buffer:
                raise StopAsyncIteration
            return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
    async for row in Cursor():
        print(row)
which would be equivalent to the following code::
    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)
Example 2
'''''''''
The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.
::
    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for letter in AsyncIteratorWrapper("abc"):
        print(letter)
Why StopAsyncIteration?
'''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between
::
    def g1():
        yield from fut
        return 'spam'
and
::
    def g2():
        yield from fut
        raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``
::
    async def a1():
        await fut
        raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``. Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.
Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.
Debugging Features
------------------
One of the most frequent mistakes that people make when using
generators as coroutines is forgetting to use ``yield from``::
    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'
For debugging this kind of mistake there is a special debug mode in
asyncio, in which ``@coroutine`` decorator wraps all functions with a
special object with a destructor logging a warning. Whenever a wrapped
generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorated function
was defined, a stack trace of where it was collected, etc. The wrapper
object also provides a convenient ``__repr__`` function with detailed
information about the generator.
The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, the ``@coroutine``
decorator makes the decision of whether or not to wrap based on
an OS environment variable, ``PYTHONASYNCIODEBUG``. This way it is
possible to run asyncio programs with asyncio's own functions
instrumented. ``EventLoop.set_debug``, a different debug facility, has
no impact on ``@coroutine`` decorator's behavior.
With this proposal, coroutines are a native concept, distinct from
generators. New methods ``set_coroutine_wrapper`` and
``get_coroutine_wrapper`` are added to the ``sys`` module, with which
frameworks can provide advanced debugging facilities.
It is also important to make coroutines as fast and efficient as
possible; therefore, no debug features are enabled by default.
Example::
    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.CoroWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me()  # <- this line will likely GC the coroutine object and
                #    trigger asyncio.CoroWrapper's code

    assert isinstance(debug_me(), asyncio.CoroWrapper)

    sys.set_coroutine_wrapper(None)  # <- this unsets any
                                     #    previously set wrapper

    assert not isinstance(debug_me(), asyncio.CoroWrapper)
If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper
replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)``
unsets the wrapper.
inspect.iscoroutine() and inspect.iscoroutinefunction()
--------------------------------------------------------
Two new functions are added to the ``inspect`` module:
* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
coroutine object.
* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
coroutine function.
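A brief usage sketch (illustrative, assuming the reference
implementation)::

    import inspect

    async def coro():
        pass

    assert inspect.iscoroutinefunction(coro)   # the function itself

    c = coro()
    assert inspect.iscoroutine(c)              # the object it returns
    c.close()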
Differences between coroutines and generators
---------------------------------------------
A great effort has been made to make sure that coroutines and
generators are separate concepts:
1. Coroutine objects do not implement ``__iter__`` and ``__next__``
methods. Therefore they cannot be iterated over or passed to
``iter()``, ``list()``, ``tuple()`` and other built-ins. They
also cannot be used in a ``for..in`` loop.
2. ``yield from`` does not accept coroutine objects (unless it is used
in a generator-based coroutine decorated with ``types.coroutine``.)
3. ``yield from`` does not accept coroutine objects from plain Python
generators (*not* generator-based coroutines.)
4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
return ``False`` for coroutine objects and coroutine functions.
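These differences can be observed directly; a small sketch (assuming the
reference implementation) follows::

    async def coro():
        return 1

    c = coro()
    try:
        for _ in c:        # coroutine objects are not iterable
            pass
    except TypeError:
        pass
    finally:
        c.close()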
Coroutine objects
-----------------
Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutine objects have
``throw``, ``send`` and ``close`` methods. ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines).
Glossary
========
:Coroutine:
A coroutine function, or just "coroutine", is declared with ``async
def``. It uses ``await`` and ``return value``; see `New Coroutine
Declaration Syntax`_ for details.
:Coroutine object:
Returned from a coroutine function. See `Await Expression`_ for
details.
:Future-like object:
An object with an ``__await__`` method, or a C object with
``tp_await`` function, returning an iterator. Can be consumed by
an ``await`` expression in a coroutine. A coroutine waiting for a
Future-like object is suspended until the Future-like object's
``__await__`` completes, and returns the result. See `Await
Expression`_ for details.
:Awaitable:
A *Future-like* object or a *coroutine object*. See `Await
Expression`_ for details.
:Generator-based coroutine:
A coroutine based on generator syntax. The most common example is a
generator decorated with ``@asyncio.coroutine``.
:Asynchronous context manager:
An asynchronous context manager has ``__aenter__`` and ``__aexit__``
methods and can be used with ``async with``. See `Asynchronous
Context Managers and "async with"`_ for details.
:Asynchronous iterable:
An object with an ``__aiter__`` method, which must return an
*asynchronous iterator* object. Can be used with ``async for``.
See `Asynchronous Iterators and "async for"`_ for details.
:Asynchronous iterator:
An asynchronous iterator has an ``__anext__`` method. See
`Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================
================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================
Where:
* "async def func": coroutine;
* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined with the ``async`` keyword;
* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
``__aexit__`` defined without the ``async`` keyword, must return an
*awaitable*;
* "def __await__": ``__await__`` method to implement *Future-like*
objects;
* generator: a "regular" generator, a function defined with ``def``
which contains at least one ``yield`` or ``yield from`` expression.
Transition Plan
===============
To avoid backwards compatibility issues with ``async`` and ``await``
keywords, it was decided to modify ``tokenizer.c`` in such a way that
it:
* recognizes ``async def`` name tokens combination (start of a
coroutine);
* keeps track of regular functions and coroutines;
* replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with
``AWAIT`` when in the process of yielding tokens for coroutines.
This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.
An example of having "async def" and an "async" attribute in one piece of
code::
    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
This proposal preserves 100% backwards compatibility.
Grammar Updates
---------------
Grammar changes are also fairly minimal::
    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                    | funcdef | classdef | decorated | async_stmt)
    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    power: atom_expr ['**' factor]
    atom_expr: [AWAIT] atom trailer*
Transition Period Shortcomings
------------------------------
There is just one.
While ``async`` and ``await`` are not yet proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** as the ``def`` keyword::
    # async and await will always be parsed as variables

    async def outer():                          # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)         # 2
Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can easily be rewritten
in a more readable form::
    async def outer():                          # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                            # 2
        return (await fut)
This limitation will go away as soon as ``async`` and ``await`` are
proper keywords, or if it is decided to use a future import for this
PEP.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.
asyncio
-------
The ``asyncio`` module was adapted and tested to work with coroutines
and the new statements. Backwards compatibility is 100% preserved.
The required changes are mainly:
1. Modify the ``@asyncio.coroutine`` decorator to use the new
``types.coroutine()`` function.
2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class.
3. Add ``ensure_task()`` as an alias for ``async()`` function.
Deprecate ``async()`` function.
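A minimal interoperability sketch (illustrative, assuming the updated
``asyncio``), where a new-style coroutine awaits a generator-based one::

    import asyncio

    @asyncio.coroutine
    def old_style():
        yield from asyncio.sleep(0)
        return 'old'

    async def new_style():
        # new coroutines can await generator-based asyncio coroutines;
        # the reverse direction works because ``@asyncio.coroutine`` now
        # uses ``types.coroutine()``
        return await old_style()

    loop = asyncio.get_event_loop()
    assert loop.run_until_complete(new_style()) == 'old'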
Design Considerations
=====================
PEP 3152
--------
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions"). Some key points:
1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is
always a generator, even if there are no ``cocall`` expressions
inside it. Maps to ``async def`` in this proposal.
2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
inside a *cofunction*. Maps to ``await`` in this proposal (with
some differences, see below.)
3. It is not possible to call a *cofunction* without a ``cocall``
keyword.
4. ``cocall`` grammatically requires parentheses after it::
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
5. ``cocall f(*args, **kwds)`` is semantically equivalent to
``yield from f.__cocall__(*args, **kwds)``.
Differences from this proposal:
1. There is no equivalent of ``__cocall__`` in this PEP. In PEP 3152,
``__cocall__`` is called and its result is passed to ``yield from`` in
the ``cocall`` expression. The ``await`` keyword expects an *awaitable*
object, validates its type, and executes ``yield from`` on it. The
``__await__`` method is similar to ``__cocall__``, but it is only used
to define *Future-like* objects.
2. ``await`` is defined in almost the same way as ``yield from`` in the
grammar (it is later enforced that ``await`` can only be inside
``async def``). It is possible to simply write ``await future``,
whereas ``cocall`` always requires parentheses.
3. To make asyncio work with PEP 3152 it would be necessary to modify
the ``@asyncio.coroutine`` decorator to wrap all functions in an object
with a ``__cocall__`` method, or to implement ``__cocall__`` on
generators. To call *cofunctions* from existing generator-based
coroutines it would be necessary to use a ``costart(cofunc, *args,
**kwargs)`` built-in.
4. Since it is impossible to call a *cofunction* without a ``cocall``
keyword, it automatically prevents the common mistake of forgetting
to use ``yield from`` on generator-based coroutines. This proposal
addresses this problem with a different approach, see `Debugging
Features`_.
5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
is that if it is decided to implement coroutine-generators --
coroutines with ``yield`` or ``async yield`` expressions -- we
wouldn't need a ``cocall`` keyword to call them. So we'll end up
having ``__cocall__`` and no ``__call__`` for regular coroutines,
and having ``__call__`` and no ``__cocall__`` for coroutine-
generators.
6. Requiring parentheses grammatically also introduces a whole lot
of new problems.
The following code::

    await fut
    await function_returning_future()
    await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))

would look like::

    cocall fut()  # or cocall costart(fut)
    cocall (function_returning_future())()
    cocall asyncio.gather(costart(coro1, arg1, arg2),
                          costart(coro2, arg1, arg2))
7. There are no equivalents of ``async for`` and ``async with`` in PEP
3152.
Coroutine-generators
--------------------
With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions. To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and
``async yield from`` would raise a ``StopAsyncIteration`` exception.
While it is possible to implement coroutine-generators, we believe that
they are out of the scope of this proposal. It is an advanced concept
that should be carefully considered and balanced, and would require
non-trivial changes in the implementation of current generator objects.
This is a matter for a separate PEP.
No implicit wrapping in Futures
-------------------------------
There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A
key difference is that JavaScript "async functions" always return a
Promise. While this approach has some advantages, it also implies that
a new Promise object is created on each "async function" invocation.
We could implement a similar functionality in Python, by wrapping all
coroutines in a Future object, but this has the following
disadvantages:
1. Performance. A new Future object would be instantiated on each
coroutine call. Moreover, this makes implementation of ``await``
expressions slower (disabling optimizations of ``yield from``).
2. A new built-in ``Future`` object would need to be added.
3. Coming up with a generic ``Future`` interface that is usable for any
use case in any framework is a very hard problem to solve.
4. It is not a feature that is used frequently, when most of the code
is coroutines.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:
* C# has had it for a long time [5]_;
* there is a proposal to add async/await to ECMAScript 7 [2]_;
  see also the Traceur project [9]_;
* Facebook's Hack/HHVM [6]_;
* Google's Dart language [7]_;
* Scala [8]_;
* proposal to add async/await to C++ [10]_;
* and many other less popular languages.
This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).
Why "__aiter__" is a coroutine
------------------------------
In principle, ``__aiter__`` could be a regular function. There are
several good reasons to make it a coroutine:
* since most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
methods are coroutines, users would often mistakenly define
``__aiter__`` as ``async`` anyway;
* there might be a need to run some asynchronous operations in
``__aiter__``, for instance to prepare DB queries or do some file
operation.
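For instance (a hypothetical sketch; the pool/connection API is assumed
and not part of the PEP), an ``__aiter__`` coroutine can acquire a
connection before iteration begins::

    class QueryResults:
        def __init__(self, pool, query):
            self._pool = pool
            self._query = query

        async def __aiter__(self):
            # asynchronous set-up work before returning the iterator
            self._conn = await self._pool.acquire()
            self._cursor = await self._conn.execute(self._query)
            return self

        async def __anext__(self):
            row = await self._cursor.fetchone()
            if row is None:
                raise StopAsyncIteration
            return row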
Importance of "async" keyword
-----------------------------
While it is possible to just implement the ``await`` expression and treat
all functions with at least one ``await`` as coroutines, this approach
makes API design, code refactoring and long-term support harder.
Let's pretend that Python only has ``await`` keyword::
    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()
If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken. To mitigate this issue a decorator similar to
``@asyncio.coroutine`` would have to be introduced.
Why "async def"
---------------
For some people bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``. It is certainly easier to
type. But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.
Why "async for/with" instead of "await for/with"
------------------------------------------------
``async`` is an adjective, and hence it is a better choice for a
*statement qualifier* keyword. ``await for/with`` would imply that
something is awaiting the completion of a ``for`` or ``with``
statement.
Why "async def" and not "def async"
-----------------------------------
The ``async`` keyword is a *statement qualifier*. A good analogy is the
"static", "public", and "unsafe" keywords from other languages. "async
for" is an asynchronous "for" statement, "async with" is an
asynchronous "with" statement, "async def" is an asynchronous function.
Having "async" after the main statement keyword might introduce some
confusion, like "for async item in iterator" can be read as "for each
asynchronous item in iterator".
Having ``async`` keyword before ``def``, ``with`` and ``for`` also
makes the language grammar simpler. And "async def" better separates
coroutines from regular functions visually.
Why not a __future__ import
---------------------------
``__future__`` imports are inconvenient and easy to forget to add.
Also, they are enabled for the whole source file. Consider that there
is a big project with a popular module named "async.py". With future
imports it is required to either import it using ``__import__()`` or
``importlib.import_module()`` calls, or to rename the module. The
proposed approach makes it possible to continue using old code and
modules without a hassle, while coming up with a migration plan for
future python versions.
Why magic methods start with "a"
--------------------------------
New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal is to use "async" prefix, so that ``__aiter__``
becomes ``__async_iter__``. However, to align new magic methods with
the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided
to use the shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an ``async``
keyword to their declarations::
    class CM:
        async def __enter__(self):  # instead of __aenter__
            ...
This approach has the following downsides:
* it would not be possible to create an object that works in both
``with`` and ``async with`` statements;
* it would break backwards compatibility, as nothing prohibits
returning Future-like objects from ``__enter__`` and/or
``__exit__`` in Python <= 3.4;
* one of the main points of this proposal is to make coroutines as
simple and foolproof as possible, hence the clear separation of the
protocols.
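With separate protocols, by contrast, a single object can support both
statements (a sketch, not from the PEP)::

    class Both:
        # synchronous protocol, for plain ``with``
        def __enter__(self):
            return self

        def __exit__(self, *exc_info):
            return False

        # asynchronous protocol, for ``async with``
        async def __aenter__(self):
            return self

        async def __aexit__(self, *exc_info):
            return False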
Why not reuse existing "for" and "with" statements
--------------------------------------------------
The vision behind existing generator-based coroutines and this proposal
is to make it easy for users to see where the code might be suspended.
Making the existing "for" and "with" statements recognize asynchronous
iterators and context managers would inevitably create implicit suspend
points, making it harder to reason about the code.
Comprehensions
--------------
For the sake of restricting the broadness of this PEP there is no new
syntax for asynchronous comprehensions. This should be considered in a
separate PEP, if there is a strong demand for this feature.
Async lambdas
-------------
Lambda coroutines are not part of this proposal. In this proposal they
would look like ``async lambda(parameters): expression``. Unless there
is a strong demand to have them as part of this proposal, it is
recommended to consider them later in a separate PEP.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is the
output of Python's official benchmark suite [4]_:
::
    python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
    x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show them:
    django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
    fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown when parsing Python files with the
modified tokenizer: parsing one 12 MB file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.
async/await
-----------
The following micro-benchmark was used to determine performance
difference between "async" functions and generators::
    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1 - t0))
The result is that there is no observable performance difference.
Minimum timing of 3 runs
::
abinary(19) * 30: total 12.985s
binary(19) * 30: total 12.953s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await``
keyword.
2. New ``__await__`` method for Future-like objects, and new
``tp_await`` slot in ``PyTypeObject``.
3. New syntax for asynchronous context managers: ``async with``. And
associated protocol with ``__aenter__`` and ``__aexit__`` methods.
4. New syntax for asynchronous iteration: ``async for``. And the
associated protocol with ``__aiter__``, ``__anext__`` and the new
built-in exception ``StopAsyncIteration``.
5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
``Await``.
6. New functions: ``sys.set_coroutine_wrapper(callback)``,
``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
``inspect.iscoroutinefunction()``, and ``inspect.iscoroutine()``.
7. New ``CO_COROUTINE`` bit flag for code objects.
While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide
users with convenient and unambiguous APIs built on ``async def``,
``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be
tested.
::
    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')
        while True:
            data = await reader.read(8192)
            if not data:
                break
            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9]
https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-func…
.. [10]
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:
Hello.
In this post
<http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path…>,
I have noticed a problem with the following code.
from pathlib import Path

> class PPath(Path):
>     def __init__(self, *args, **kwargs):
>         super().__init__(*args, **kwargs)
>
> test = PPath("dir", "test.txt")
This gives the following error message.
> Traceback (most recent call last):
>   File "/Users/projetmbc/test.py", line 14, in <module>
>     test = PPath("dir", "test.txt")
>   File "/anaconda/lib/python3.4/pathlib.py", line 907, in __new__
>     self = cls._from_parts(args, init=False)
>   File "/anaconda/lib/python3.4/pathlib.py", line 589, in _from_parts
>     drv, root, parts = self._parse_args(args)
>   File "/anaconda/lib/python3.4/pathlib.py", line 582, in _parse_args
>     return cls._flavour.parse_parts(parts)
> AttributeError: type object 'PPath' has no attribute '_flavour'
This breaks sub-classing from a Python point of view. In the post
<http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path…>,
I give a hack to sub-class Path, but it's a bit unpythonic.
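For reference, the workaround usually suggested (not part of the original
post) is to derive from the concrete class that Path() instantiates, so
that _flavour is inherited:

    from pathlib import Path

    class PPath(type(Path())):     # PosixPath or WindowsPath
        pass

    test = PPath("dir", "test.txt")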
*Christophe BAL*
*Enseignant de mathématiques en Lycée **et développeur Python amateur*
*---*
*French math teacher in a "Lycée" **and **Python **amateur developer*
According to https://www.python.org/dev/peps/pep-0492/#id31:
The [types.coroutine] function applies CO_COROUTINE flag to
generator-function's code object, making it return a coroutine
object.
Further down in https://www.python.org/dev/peps/pep-0492/#id32:
[await] uses the yield from implementation with an extra step of
validating its argument. await only accepts an awaitable,
which can be one of:
...
- A generator-based coroutine object returned from a generator
  decorated with types.coroutine().
If I'm understanding this correctly, types.coroutine's only purpose is to add
a flag to a generator object so that await will accept it.
This raises the question of why can't await simply accept a generator
object? There is no code change to the gen obj itself, there is no
behavior change in the gen obj, it's the exact same byte code, only a
flag is different.
types.coroutine feels a lot like unnecessary boiler-plate.
--
~Ethan~
ACTIVITY SUMMARY (2015-04-24 - 2015-05-01)
Python tracker at http://bugs.python.org/
To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.
Issues counts and deltas:
open 4841 (+27)
closed 31025 (+25)
total 35866 (+52)
Open issues with patches: 2254
Issues opened (39)
==================
#24054: Invalid syntax in inspect_fodder2.py (on Python 2.x)
http://bugs.python.org/issue24054 opened by ddriddle
#24055: unittest package-level set up & tear down module
http://bugs.python.org/issue24055 opened by demian.brecht
#24056: Expose closure & generator status in function repr()
http://bugs.python.org/issue24056 opened by ncoghlan
#24060: Clearify necessities for logging with timestamps
http://bugs.python.org/issue24060 opened by krichter
#24063: Support Mageia and Arch Linux in the platform module
http://bugs.python.org/issue24063 opened by akien
#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064 opened by rhettinger
#24065: Outdated *_RESTRICTED flags in structmember.h
http://bugs.python.org/issue24065 opened by berker.peksag
#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066 opened by kirelagin
#24067: Weakproxy is an instance of collections.Iterator
http://bugs.python.org/issue24067 opened by ereuveni
#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068 opened by wolma
#24069: Option to delete obsolete bytecode files
http://bugs.python.org/issue24069 opened by Sworddragon
#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076 opened by lukasz.langa
#24078: inspect.getsourcelines ignores context and returns wrong line
http://bugs.python.org/issue24078 opened by siyuan
#24079: xml.etree.ElementTree.Element.text does not conform to the doc
http://bugs.python.org/issue24079 opened by jlaurens
#24080: asyncio.Event.wait() Task was destroyed but it is pending
http://bugs.python.org/issue24080 opened by matt
#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081 opened by encukou
#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082 opened by encukou
#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084 opened by Romuald
#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085 opened by bukzor
#24086: Configparser interpolation is unexpected
http://bugs.python.org/issue24086 opened by tbekolay
#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087 opened by paul.moore
#24088: yield expression confusion
http://bugs.python.org/issue24088 opened by Jim.Jewett
#24089: argparse crashes with AssertionError
http://bugs.python.org/issue24089 opened by spaceone
#24090: Add a "copy variable to clipboard" option to the edit menu
http://bugs.python.org/issue24090 opened by irdb
#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091 opened by pkt
#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092 opened by pkt
#24093: Use after free in Element.remove
http://bugs.python.org/issue24093 opened by pkt
#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094 opened by pkt
#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095 opened by pkt
#24096: Use after free in get_filter
http://bugs.python.org/issue24096 opened by pkt
#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097 opened by pkt
#24098: Multiple use after frees in obj2ast_* methods
http://bugs.python.org/issue24098 opened by pkt
#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099 opened by pkt
#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100 opened by pkt
#24101: Use after free in siftup
http://bugs.python.org/issue24101 opened by pkt
#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102 opened by pkt
#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103 opened by pkt
#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104 opened by pkt
#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105 opened by pkt
Most recent 15 issues with no replies (15)
==========================================
#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105
#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104
#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103
#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102
#24101: Use after free in siftup
http://bugs.python.org/issue24101
#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100
#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099
#24098: Multiple use after frees in obj2ast_* methods
http://bugs.python.org/issue24098
#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097
#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095
#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094
#24093: Use after free in Element.remove
http://bugs.python.org/issue24093
#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092
#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091
#24090: Add a "copy variable to clipboard" option to the edit menu
http://bugs.python.org/issue24090
Most recent 15 issues waiting for review (15)
=============================================
#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087
#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084
#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082
#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081
#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076
#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068
#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066
#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064
#24056: Expose closure & generator status in function repr()
http://bugs.python.org/issue24056
#24054: Invalid syntax in inspect_fodder2.py (on Python 2.x)
http://bugs.python.org/issue24054
#24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys
http://bugs.python.org/issue24053
#24042: Convert os._getfullpathname() and os._isdir() to Argument Clin
http://bugs.python.org/issue24042
#24037: Argument Clinic: add the boolint converter
http://bugs.python.org/issue24037
#24036: GB2312 codec is using a wrong covert table
http://bugs.python.org/issue24036
#24034: Make fails Objects/typeslots.inc
http://bugs.python.org/issue24034
Top 10 most discussed issues (10)
=================================
#24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys
http://bugs.python.org/issue24053 30 msgs
#24067: Weakproxy is an instance of collections.Iterator
http://bugs.python.org/issue24067 12 msgs
#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076 11 msgs
#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085 11 msgs
#23749: asyncio missing wrap_socket
http://bugs.python.org/issue23749 10 msgs
#24018: add a Generator ABC
http://bugs.python.org/issue24018 10 msgs
#24052: sys.exit(code) returns "success" to the OS for some nonzero va
http://bugs.python.org/issue24052 9 msgs
#23955: Add python.ini file for embedded/applocal installs
http://bugs.python.org/issue23955 8 msgs
#17908: Unittest runner needs an option to call gc.collect() after eac
http://bugs.python.org/issue17908 7 msgs
#24043: Implement mac_romanian and mac_croatian encodings
http://bugs.python.org/issue24043 6 msgs
Issues closed (23)
==================
#9246: os.getcwd() hardcodes max path len
http://bugs.python.org/issue9246 closed by haypo
#21354: PyCFunction_New no longer exposed by python DLL breaking bdist
http://bugs.python.org/issue21354 closed by asvetlov
#23058: argparse silently ignores arguments
http://bugs.python.org/issue23058 closed by barry
#23342: run() - unified high-level interface for subprocess
http://bugs.python.org/issue23342 closed by gregory.p.smith
#23356: In argparse docs simplify example about argline
http://bugs.python.org/issue23356 closed by berker.peksag
#23668: Support os.ftruncate on Windows
http://bugs.python.org/issue23668 closed by steve.dower
#23852: Wrong computation of max_fd on OpenBSD
http://bugs.python.org/issue23852 closed by gregory.p.smith
#23910: property_descr_get reuse argument tuple
http://bugs.python.org/issue23910 closed by rhettinger
#23996: _PyGen_FetchStopIterationValue() crashes on unnormalised excep
http://bugs.python.org/issue23996 closed by pitrou
#24035: When Caps Locked, <Shift> + alpha-character still displayed as
http://bugs.python.org/issue24035 closed by principia1687
#24057: trivial typo in datetime.rst: needing a preceding dot
http://bugs.python.org/issue24057 closed by python-dev
#24058: Compiler warning for readline extension
http://bugs.python.org/issue24058 closed by python-dev
#24059: Minor speed and readability improvement to the random module
http://bugs.python.org/issue24059 closed by rhettinger
#24061: Python 2.x breaks with address sanitizer
http://bugs.python.org/issue24061 closed by python-dev
#24062: links to os.stat() in documentation lead to stat module instea
http://bugs.python.org/issue24062 closed by berker.peksag
#24070: Exceptions and arguments disappear when using argparse inside
http://bugs.python.org/issue24070 closed by benjamin.peterson
#24071: Python 2.7.8, 2.7.9 re.MULTILINE failure
http://bugs.python.org/issue24071 closed by serhiy.storchaka
#24072: xml.etree.ElementTree.Element does not catch text
http://bugs.python.org/issue24072 closed by ned.deily
#24073: sys.stdin.mode can not give the right mode and os.fdopen does
http://bugs.python.org/issue24073 closed by ned.deily
#24074: string, center, ljust, rjust, width paramter should accept Non
http://bugs.python.org/issue24074 closed by rhettinger
#24075: list.sort() should do quick exit if len(list) <= 1
http://bugs.python.org/issue24075 closed by benjamin.peterson
#24077: man page says -I implies -S. code says -s.
http://bugs.python.org/issue24077 closed by ned.deily
#24083: MSVCCompiler.get_msvc_path() doesn't work on Win x64
http://bugs.python.org/issue24083 closed by lemburg
I've tried to catch up with the previous threads. A summary of issues
brought up:
1. precise syntax of `async def` (or do we need it at all)
2. do we need `async for` and `async with` (and how to spell them)
3. syntactic priority of `await`
4. `cocall` vs. `await`
5. do we really need `__aiter__` and friends
6. StopAsyncException
7. compatibility with asyncio and existing users of it
(I've added a few myself.)
I'll try to take them one by one.
*1. precise syntax of `async def`*
Of all the places to put `async` I still like *before* the `def` the best.
I often do "imprecise search" for e.g. /def foo/ and would be unhappy if
this didn't find async defs. Putting it towards the end (`def foo async()`
or `def foo() async`) makes it easier to miss. A decorator makes it hard to
make the syntactic distinctions required to reject `await` outside an async
function. So I still prefer *`async def`*.
*2. do we need `async for` and `async with`*
Yes we do. Most of you are too young to remember, but once upon a time you
couldn't loop over the lines of a file with a `for` loop like you do now.
The amount of code that was devoted to efficiently iterate over files was
tremendous. We're back in that stone age with the asyncio `StreamReader`
class -- it supports `read()`, `readline()` and so on, but you can't use it
with `for`, so you have to write a `while True` loop. `asyncio for` makes
it possible to add a simple `__anext__` to the `StreamReader` class, as
follows:
```
async def __anext__(self):
    line = await self.readline()
    if not line:
        raise StopAsyncIteration
    return line
```
A similar argument can be made for `async with`; the transaction commit is
pretty convincing, but it also helps to be able to wait e.g. for a
transport to drain upon closing a write stream. As for how to spell these,
I think having `async` at the front makes it most clear that this is a
special form.
(Though maybe we should consider `await for` and `await with`? That would
have the advantage of making it easy to scan for all suspension points by
searching for /await/. But being a verb it doesn't read very well.)
*3. syntactic priority of `await`*
Yury, could you tweak the syntax for `await` so that we can write the most
common usages without parentheses? In particular I'd like to be able to
write
```
return await foo()
with await foo() as bar: ...
foo(await bar(), await bletch())
```
(I don't care about `await foo() + await bar()` but it would be okay.)
I think this is reasonable with some tweaks of the grammar (similar to what
Greg did for cocall, but without requiring call syntax at the end).
*4. `cocall` vs. `await`*
Python evolves. We couldn't have PEP 380 (`yield from`) without prior
experience with using generators as coroutines (PEP 342), which in turn
required basic generators (PEP 255), and those were a natural evolution of
Python's earlier `for` loop.
We couldn't have PEP 3156 (asyncio) without PEP 380 and all that came before.
The asyncio library is getting plenty of adoption and it has the concept of
separating the *getting* of a future[1] from *waiting* for it. IIUC this
is also how `await` works in C# (it just requires something with an async
type). This has enabled a variety of operations that take futures and
produce more futures.
[1] I write `future` with a lowercase 'f' to include concepts like
coroutine generator objects.
*I just can't get used to this aspect of PEP 3152, so I'm rejecting it.*
Sorry Greg, but that's the end. We must see `await` as a refinement of
`yield from`, not as an alternative. (Yury: PEP 492 is not accepted yet,
but you're getting closer.)
One more thing: this separation is "Pythonic" in the sense that it's
similar to the way *getting* a callable object is a separate act from
*calling* it. While this is a cause for newbie bugs (forgetting to call an
argument-less function) it has also enabled the concept of "callable" as
more general and more powerful in Python: any time you need to pass a
callable, you can pass e.g. a bound method or a class or something you got
from `functools.partial`, and that's a useful thing (other languages
require you to introduce something like a lambda in such cases, which can
be painful if the thing you wrap has a complex signature -- or they don't
support function parameters at all, like Java).
I know that Greg defends it by explaining that `cocall f(args)` is not a
`cocall` operator applied to `f(args)`, it is the *single* operator `cocall
...(args)` applied to `f`. But this is too subtle, and it just doesn't jive
with the long tradition of using `yield from f` where f is some previously
obtained future.
*5. do we really need `__aiter__` and friends*
There's a lot of added complexity, but I think it's worth it. I don't think
we need to make the names longer, the 'a' prefix is fine for these methods.
I think it's all in the protocols: regular `with` uses `__enter__` and
`__exit__`; `async with` uses `__aenter__` and `__aexit__` (which must
return futures).
Ditto for `__aiter__` and `__anext__`. I guess this means that the async
equivalent to obtaining an iterator through `it = iter(xs)` followed by
`for x over it` will have to look like `ait = await aiter(xs)` followed by
`for x over ait`, where an iterator is required to have an `__aiter__`
method that's an async function and returns self immediately. But what if
you left out the `await` from the first call? I.e. can this work?
```
ait = aiter(xs)
async for x in ait:
    print(x)
```
The question here is whether the object returned by aiter(xs) has an
`__aiter__` method. Since it was intended to be the target of `await`, it
has an `__await__` method. But that itself is mostly an alias for
`__iter__`, not `__aiter__`. I guess it can be made to work, the object
just has to implement a bunch of different protocols.
*6. StopAsyncException*
I'm not sure about this. The motivation given in the PEP seems to focus on
the need for `__anext__` to be async. But is this really the right pattern?
What if we required `ait.__anext__()` to return a future, which can either
raise good old `StopIteration` or return the next value from the iteration
when awaited? I'm wondering if there are a few alternatives to be explored
around the async iterator protocol still.
*7. compatibility with asyncio and existing users of it*
This is just something I want to stress. On the one hand it should be
really simple to take code written for pre-3.5 asyncio and rewrite it to
PEP 492 -- simply change `@asyncio.coroutine` to `async def` and change
`yield from` to `await`. (Everything else is optional; existing patterns
for loops and context managers should continue to work unchanged, even if
in some cases you may be able to simplify the code by using `async for` and
`async with`.)
But it's also important that *even if you don't switch* (i.e. if you want
to keep your code compatible with pre-3.5 asyncio) you can still use the
PEP 492 version of asyncio -- i.e. the asyncio library that comes with 3.5
must seamlessly support mixing code that uses `await` and code that uses
`yield from`. And this should go both ways -- if you have some code that
uses PEP 492 and some code that uses pre-3.5 asyncio, they should be able
to pass their coroutines to each other and wait for each other's coroutines.
That's all I have for now. Enjoy!
--
--Guido van Rossum (python.org/~guido)