I propose adding the ability to compare range objects using methods (e.g.
range.issubrange) and/or regular operators. Example:
In [56]: range(0, 10, 3).issubrange(range(10))
Out[56]: True
In [57]: range(0, 10, 3) <= range(10)
Out[57]: True
In [58]: range(10) <= range(0, 10, 3)
Out[58]: False
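To make the intended semantics concrete, here is a rough sketch as a plain function (`issubrange` as a free function is just for illustration; a real patch would compare start/stop/step at the C level instead of materializing the elements):

```python
def issubrange(a, b):
    # Sketch only: a is a "subrange" of b when every element of a is
    # also an element of b. A real implementation would reason about
    # start/stop/step instead of building sets.
    return set(a) <= set(b)

print(issubrange(range(0, 10, 3), range(10)))  # True
print(issubrange(range(10), range(0, 10, 3)))  # False
```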
I'll write a patch if you decide that this idea is worth implementing.
threading.local provides thread-local namespaces. As far as I can
tell, there isn't an equivalent for coroutines (asyncio), even though
I would expect they would provide the same benefits. I'd like to see
coroutine-local namespaces added and would be happy to work on the
problem. I plan on working on a feature that will rely on applying a
thread-local context and realized that coroutines would need a similar
treatment. Also, there are probably a few spots in the stdlib that
would benefit (e.g. decimal contexts).
Perhaps I'm missing something. Is the concurrency story with
coroutines different enough that coroutine-local namespaces do not
make sense?
If they still make sense, is there a mechanism to uniquely identify
the "currently running" coroutine like you can with threads? That
would be necessary to do a naive port of thread-local namespaces.
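To make the idea concrete, here is a rough sketch of what a coroutine-local namespace could look like on top of asyncio, assuming the currently running task is a good enough identity. `TaskLocal` is a hypothetical name, and keying on `id(task)` has known lifetime problems; this only illustrates the shape of a naive port of thread-local namespaces:

```python
import asyncio

class TaskLocal:
    # Hypothetical coroutine-local namespace, keyed off the currently
    # running asyncio Task. Keying on id(task) is leaky if tasks are
    # garbage collected, so this is only a sketch of the idea.
    def __init__(self):
        self._data = {}

    def _ns(self):
        task = asyncio.current_task()
        return self._data.setdefault(id(task), {})

    def __getattr__(self, name):
        try:
            return self._ns()[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        if name.startswith('_'):
            super().__setattr__(name, value)
        else:
            self._ns()[name] = value

local = TaskLocal()

async def worker(value, results):
    local.value = value     # each task sees its own 'value'
    await asyncio.sleep(0)  # yield control to the other task
    results.append(local.value)

async def main():
    results = []
    await asyncio.gather(worker(1, results), worker(2, results))
    return results

print(asyncio.run(main()))
```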
-eric
Observe:
ryan@DevPC-LX:~/bfort$ python -c 'import sys; sys.exit(1)'
ryan@DevPC-LX:~/bfort$ python -c 'import sys; sys.exit(1L)'
1
ryan@DevPC-LX:~/bfort$
If I call sys.exit with a long value under Python 2, it prints it. I count
this as a bug because:
1. int and long are supposed to be as similar as possible.
2. When you're using Hy (which implicitly makes all ints longs), this sucks.
--
Ryan
[ERROR]: Your autotools build scripts are 200 lines longer than your
program. Something’s wrong.
http://kirbyfan64.github.io/
Hello python-ideas,
Here's my proposal to add async/await in Python.
I believe that PEPs 380 and 3156 were a major breakthrough for Python 3,
and I hope that this PEP can be another big step. Who knows, maybe it
will be one of the reasons to drive people's interest towards Python 3.
PEP: XXX
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov@sprymix.com>
Discussions-To: Python-Dev <python-dev@python.org>
Python-Version: 3.5
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Post-History:
Resolution:
Abstract
========
This PEP introduces new syntax for coroutines, asynchronous ``with``
statements and ``for`` loops. The main motivation behind this proposal
is to streamline writing and maintaining asynchronous code, as well as
to simplify previously hard to implement code patterns.
Rationale and Goals
===================
Current Python supports implementing coroutines via generators (PEP 342),
further enhanced by the ``yield from`` syntax introduced in PEP 380.
This approach has a number of shortcomings:
* it is easy to confuse coroutines with regular generators, since they
  share the same syntax; async libraries often attempt to alleviate
  this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);

* it is not possible to natively define a coroutine which has no
  ``yield`` or ``yield from`` statements, again requiring the use of
  decorators to fix potential refactoring issues;

* support for asynchronous calls is limited to expressions where
  ``yield`` is allowed syntactically, limiting the usefulness of
  syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and clearly
separates them from generators. This removes generator/coroutine ambiguity,
and makes it possible to reliably define coroutines without reliance on a
specific library. This also enables linters and IDEs to improve static code
analysis and refactoring.
Native coroutines and the associated new syntax features make it possible
to define context manager and iteration protocols in asynchronous terms.
As shown later in this proposal, the new ``async with`` statement lets
Python programs perform asynchronous calls when entering and exiting a
runtime context, and the new ``async for`` statement makes it possible
to perform asynchronous calls in iterators.
Specification
=============
This proposal introduces new syntax and semantics to enhance coroutine
support in Python; it does not change the internal implementation of
coroutines, which are still based on generators.
It is strongly suggested that the reader understand how coroutines are
implemented in Python (PEP 342 and PEP 380). It is also recommended to
read PEP 3156 (asyncio framework).
From this point in this document we use the word *coroutine* to refer
to functions declared using the new syntax. *Generator-based
coroutine* is used where necessary to refer to coroutines that are
based on generator syntax.
New Coroutine Declaration Syntax
--------------------------------
The following new syntax is used to declare a coroutine::

    async def read_data(db):
        pass
Key properties of coroutines:
* Coroutines are always generators, even if they do not contain
  ``await`` expressions.

* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
  expressions in an ``async`` function.

* Internally, a new code object flag - ``CO_ASYNC`` - is introduced to
  enable runtime detection of coroutines (and migrating existing
  code).  All coroutines have both ``CO_ASYNC`` and ``CO_GENERATOR``
  flags set.

* Regular generators, when called, return a *generator object*;
  similarly, coroutines return a *coroutine object*.

* ``StopIteration`` exceptions are not propagated out of coroutines,
  and are replaced with a ``RuntimeError``. For regular generators
  such behavior requires a future import (see PEP 479).
types.async_def()
-----------------
A new function ``async_def(gen)`` is added to the ``types`` module. It
applies the ``CO_ASYNC`` flag to the passed generator's code object,
so that it returns a *coroutine object* when called.
This feature enables an easy upgrade path for existing libraries.
Await Expression
----------------
The following new ``await`` expression is used to obtain the result of
a coroutine execution::

    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...
``await``, similarly to ``yield from``, suspends execution of the
``read_data`` coroutine until the ``db.fetch`` *awaitable* completes
and returns the result data.

It uses the ``yield from`` implementation with an extra step of
validating its argument. ``await`` only accepts an *awaitable*, which
can be one of:

* A *coroutine object* returned from a coroutine or a generator
  decorated with ``types.async_def()``.

* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a
fundamental mechanism of how *Futures* are implemented. Since,
internally, coroutines are a special kind of generators, every
``await`` is suspended by a ``yield`` somewhere down the chain of
``await`` calls (please refer to PEP 3156 for a detailed explanation).
To enable this behavior for coroutines, a new magic method called
``__await__`` is added. In asyncio, for instance, to enable Future
objects in ``await`` statements, the only change is to add an
``__await__ = __iter__`` line to the ``asyncio.Future`` class.

Objects with an ``__await__`` method are called *Future-like* objects
in the rest of this PEP.
Also, please note that the ``__aiter__`` method (see its definition
below) cannot be used for this purpose. It is a different protocol,
and would be like using ``__iter__`` instead of ``__call__`` for
regular callables.
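For illustration, a minimal *Future-like* object can be built with a generator-based ``__await__`` whose return value becomes the result of the ``await`` expression. ``Ready`` is a made-up name for this sketch, which completes immediately without suspending:

```python
import asyncio

class Ready:
    # Sketch of a Future-like object: __await__ returns an iterator
    # (here, a generator that never actually suspends) whose return
    # value becomes the result of the await expression.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        if False:
            yield  # makes __await__ a generator; never executed
        return self.value

async def main():
    return await Ready(42)

print(asyncio.run(main()))  # 42
```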
It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
Asynchronous Context Managers and "async with"
----------------------------------------------
An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.

To make this possible, a new protocol for asynchronous context
managers is proposed. Two new magic methods are added: ``__aenter__``
and ``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')
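For illustration, here is a runnable variant of the example above, with a stand-in ``log`` coroutine that collects events into a list (the helper is not part of the proposal itself):

```python
import asyncio

events = []

async def log(msg):
    # Stand-in for a real asynchronous logger.
    events.append(msg)

class AsyncContextManager:
    async def __aenter__(self):
        await log('entering context')

    async def __aexit__(self, exc_type, exc, tb):
        await log('exiting context')

async def main():
    async with AsyncContextManager():
        await log('inside context')

asyncio.run(main())
print(events)  # ['entering context', 'inside context', 'exiting context']
```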
New Syntax
''''''''''
A new statement for asynchronous context managers is proposed::
    async with EXPR as VAR:
        BLOCK
which is semantically equivalent to::
    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise
    finally:
        if exc:
            await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify
multiple context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__`` and
``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use
``async with`` outside of a coroutine.
Example
'''''''
With asynchronous context managers it is easy to implement proper database
transaction managers for coroutines::
    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...
Code that needs locking also looks lighter::
    async with lock:
        ...
instead of::
    with (yield from lock):
        ...
Asynchronous Iterators and "async for"
--------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method. To support asynchronous
iteration:

1. An object must implement an ``__aiter__`` method returning an
   *awaitable* resulting in an *asynchronous iterator object*.

2. An *asynchronous iterator object* must implement an ``__anext__``
   method returning an *awaitable*.

3. To stop iteration, ``__anext__`` must raise a
   ``StopAsyncIteration`` exception.
An example of an asynchronous iterable::

    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...
New Syntax
''''''''''
A new statement for iterating through asynchronous iterators is proposed::
    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2
which is semantically equivalent to::
    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2
It is an error to pass a regular iterable without an ``__aiter__``
method to ``async for``. It is a ``SyntaxError`` to use ``async for``
outside of a coroutine.

As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.
Example 1
'''''''''
With the asynchronous iteration protocol it is possible to
asynchronously buffer data during iteration::

    async for data in cursor:
        ...

where ``cursor`` is an asynchronous iterator that prefetches ``N``
rows of data from a database after every ``N`` iterations.
The following code illustrates the new asynchronous iteration
protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
    async for row in Cursor():
        print(row)
which would be equivalent to the following code::
    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)
Example 2
'''''''''
The following is a utility class that transforms a regular iterable to an
asynchronous one. While this is not a very useful thing to do, the code
illustrates the relationship between regular and asynchronous iterators.
::
    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    data = "abc"
    it = AsyncIteratorWrapper(data)
    async for item in it:
        print(item)
Why StopAsyncIteration?
'''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'
and
::
    def g2():
        yield from fut
        raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``

::

    async def a1():
        await fut
        raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``. Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.

Moreover, with semantics from PEP 479, all ``StopIteration``
exceptions raised in coroutines are wrapped in ``RuntimeError``.
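This wrapping is easy to observe with a small, runnable sketch; ``fut`` from the example above is replaced here with a trivial sleep:

```python
import asyncio

async def a1():
    await asyncio.sleep(0)
    raise StopIteration('spam')  # will surface as RuntimeError

async def main():
    try:
        await a1()
    except RuntimeError as exc:
        return type(exc).__name__

print(asyncio.run(main()))  # RuntimeError
```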
Debugging Features
------------------
One of the most frequent mistakes that people make when using
generators as coroutines is forgetting to use ``yield from``::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1) # this will do nothing without 'yield from'
To debug this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions
with a special object with a destructor logging a warning. Whenever a
wrapped generator gets garbage collected, a detailed logging message
is generated with information about where exactly the decorated
function was defined, stack trace of where it was collected, etc. The
wrapper object also provides a convenient ``__repr__`` function with
detailed information about the generator.

The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, the
``@coroutine`` decorator makes the decision of whether to wrap or not
to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``.
This way it is possible to run asyncio programs with asyncio's own
functions instrumented. ``EventLoop.set_debug``, a different debug
facility, has no impact on the ``@coroutine`` decorator's behavior.
With this proposal, coroutines become a native concept, distinct from
generators. A new method ``set_async_wrapper`` is added to the ``sys``
module, with which frameworks can provide advanced debugging
facilities.

It is also important to make coroutines as fast and efficient as
possible, therefore there are no debug features enabled by default.
Example::

    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.AsyncDebugWrapper(generator)

    sys.set_async_wrapper(async_debug_wrap)

    debug_me() # <- this line will likely GC the coroutine object and
               # trigger AsyncDebugWrapper's code.

    assert isinstance(debug_me(), AsyncDebugWrapper)

    sys.set_async_wrapper(None) # <- this unsets any previously set wrapper

    assert not isinstance(debug_me(), AsyncDebugWrapper)
If ``sys.set_async_wrapper()`` is called twice, the new wrapper replaces the
previous wrapper. ``sys.set_async_wrapper(None)`` unsets the wrapper.
Glossary
========
:Coroutine:
    A coroutine function, or just "coroutine", is declared with
    ``async def``.  It uses ``await`` and ``return value``; see `New
    Coroutine Declaration Syntax`_ for details.

:Coroutine object:
    Returned from a coroutine function. See `Await Expression`_ for
    details.

:Future-like object:
    An object with an ``__await__`` method. It is consumed by
    ``await`` in a coroutine. A coroutine waiting for a Future-like
    object is suspended until the Future-like object's ``__await__``
    completes.  ``await`` returns the result of the Future-like
    object. See `Await Expression`_ for details.

:Awaitable:
    A *Future-like* object or a *coroutine object*. See `Await
    Expression`_ for details.

:Generator-based coroutine:
    Coroutines based on generator syntax. The most common example is
    ``@asyncio.coroutine``.

:Asynchronous context manager:
    An asynchronous context manager has ``__aenter__`` and
    ``__aexit__`` methods and can be used with ``async with``. See
    `Asynchronous Context Managers and "async with"`_ for details.

:Asynchronous iterable:
    An object with an ``__aiter__`` method, which must return an
    *asynchronous iterator* object. Can be used with ``async for``.
    See `Asynchronous Iterators and "async for"`_ for details.

:Asynchronous iterator:
    An asynchronous iterator has an ``__anext__`` method. See
    `Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================
================= ======================================= =================
Method            Can contain                             Can't contain
================= ======================================= =================
async def func    await, return value                     yield, yield from
async def __a*__  await, return value                     yield, yield from
def __a*__        return Future-like                      await
def __await__     yield, yield from, return iterable      await
generator         yield, yield from, return value         await
================= ======================================= =================
Where:

* "async def func": coroutine;

* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined with the ``async`` keyword;

* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined without the ``async`` keyword, must return an
  *awaitable*;

* "def __await__": ``__await__`` method to implement *Future-like*
  objects;

* generator: a "regular" generator, function defined with ``def`` and
  which contains at least one ``yield`` or ``yield from`` expression.

*Future-like* is an object with an ``__await__`` method, see
`Await Expression`_ section for details.
Transition Plan
===============
To avoid backwards compatibility issues with the ``async`` and
``await`` keywords, it was decided to modify ``tokenizer.c`` in such a
way that it:

* recognizes the ``async def`` name token combination (start of a
  coroutine);

* keeps track of regular functions and coroutines;

* replaces the ``'async'`` token with ``ASYNC`` and the ``'await'``
  token with ``AWAIT`` when in the process of yielding tokens for
  coroutines.
This approach allows for seamless combination of new syntax features (all of
them available only in ``async`` functions) with any existing code.
An example of having "async def" and an "async" attribute in one piece
of code::

    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
The only backwards incompatible change is an extra argument
``is_async`` to the ``FunctionDef`` AST node. But since it is a
documented fact that the structure of AST nodes is an implementation
detail and subject to change, this should not be considered a serious
issue.
Grammar Updates
---------------
Grammar changes are also fairly minimal::

    await_expr: AWAIT test
    await_stmt: await_expr

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    async_stmt: ASYNC (funcdef | with_stmt) # will add for_stmt later

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                    | funcdef | classdef | decorated | async_stmt)

    atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
           '[' [testlist_comp] ']' |
           '{' [dictorsetmaker] '}' |
           NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')

    expr_stmt: testlist_star_expr (augassign (yield_expr|await_expr|testlist) |
                         ('=' (yield_expr|await_expr|testlist_star_expr))*)
Transition Period Shortcomings
------------------------------
There is just one.

Until ``async`` and ``await`` are proper keywords, it is not possible
(or at least very hard) to fix ``tokenizer.c`` to recognize them on
the **same line** with the ``def`` keyword::

    # async and await will always be parsed as variables

    async def outer():                          # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)         # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can be easily
rewritten to a more readable form::

    async def outer():                          # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                            # 2
        return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are
proper keywords, or if it's decided to use a future import for this
PEP.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6.  In 3.7 we will transform them to proper keywords. Making
``async`` and ``await`` proper keywords before 3.7 might make it
harder for people to port their code to Python 3.
asyncio
-------
The ``asyncio`` module was adapted and tested to work with coroutines
and the new statements. Backwards compatibility is 100% preserved.

The required changes are mainly:

1. Modify the ``@asyncio.coroutine`` decorator to use the new
   ``types.async_def()`` function.

2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future``
   class.

3. Add ``ensure_task()`` as an alias for the ``async()`` function.
   Deprecate the ``async()`` function.
Design Considerations
=====================
No implicit wrapping in Futures
-------------------------------
There is a proposal to add a similar mechanism to ECMAScript 7 [2]_.
A key difference is that JavaScript "async functions" always return a
Promise.  While this approach has some advantages, it also implies
that a new Promise object is created on each "async function"
invocation.

We could implement similar functionality in Python, by wrapping all
coroutines in a Future object, but this has the following
disadvantages:

1. Performance. A new Future object would be instantiated on each
   coroutine call. Moreover, this makes the implementation of
   ``await`` expressions slower (disabling optimizations of
   ``yield from``).

2. A new built-in ``Future`` object would need to be added.

3. Coming up with a generic ``Future`` interface that is usable for
   any use case in any framework is a very hard problem to solve.

4. It is not a feature that is used frequently, when most of the code
   is coroutines.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* there is a proposal to add async/await to ECMAScript 7 [2]_; see
  also the Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* there is a proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).
Why "__aiter__" is a coroutine
------------------------------
In principle, ``__aiter__`` could be a regular function. There are
several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
  methods are coroutines, users would often make the mistake of
  defining it as ``async`` anyway;

* there might be a need to run some asynchronous operations in
  ``__aiter__``, for instance to prepare DB queries or do some file
  operations.
Importance of "async" keyword
-----------------------------
While it is possible to just implement the ``await`` expression and
treat all functions with at least one ``await`` as coroutines, this
approach makes API design, code refactoring and long-term support
harder.

Let's pretend that Python only has the ``await`` keyword::

    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()

If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken. To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced.
Why "async def"
---------------
For some people the bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``. It is certainly easier to
type. But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a
modifier, stating that the statement is asynchronous. ``async def``
is also more consistent with the existing grammar.
Why not a __future__ import
---------------------------
``__future__`` imports are inconvenient and easy to forget to add.
Also, they are enabled for the whole source file. Consider that there
is a big project with a popular module named "async.py". With future
imports it is required to either import it using ``__import__()`` or
``importlib.import_module()`` calls, or to rename the module. The
proposed approach makes it possible to continue using old code and
modules without a hassle, while coming up with a migration plan for
future Python versions.
Why magic methods start with "a"
--------------------------------
New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal is to use the "async" prefix, so that
``__aiter__`` becomes ``__async_iter__``. However, to align new magic
methods with the existing ones, such as ``__radd__`` and ``__iadd__``,
it was decided to use a shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an ``async``
keyword to their declarations::

    class CM:
        async def __enter__(self): # instead of __aenter__
            ...

This approach has the following downsides:

* it would not be possible to create an object that works in both
  ``with`` and ``async with`` statements;

* it would look confusing and would require some implicit magic behind
  the scenes in the interpreter;

* one of the main points of this proposal is to make coroutines as
  simple and foolproof as possible.
Comprehensions
--------------
To restrict the broadness of this PEP there is no new syntax for
asynchronous comprehensions. This should be considered in a separate
PEP, if there is a strong demand for this feature.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is
the output of Python's official set of benchmarks [4]_:

::

    python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show them:
    django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
    fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown of parsing Python files with the
modified tokenizer: parsing of one 12Mb file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.
async/await
-----------
The following micro-benchmark was used to determine the performance
difference between "async" functions and generators::

    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1-t0))
The result is that there is no observable performance difference.
Minimum timing of 3 runs

::

    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s

Note that a depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await``
   keyword.

2. New ``__await__`` method for Future-like objects.

3. New syntax for asynchronous context managers: ``async with``. And
   associated protocol with ``__aenter__`` and ``__aexit__`` methods.

4. New syntax for asynchronous iteration: ``async for``. And
   associated protocol with ``__aiter__``, ``__anext__`` and new
   built-in exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFor``, ``AsyncWith``, ``Await``; the
   ``FunctionDef`` AST node got a new argument ``is_async``.

6. New functions: ``sys.set_async_wrapper(callback)`` and
   ``types.async_def(gen)``.

7. New ``CO_ASYNC`` bit flag for code objects.
While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide
users with convenient and unambiguous APIs with ``async def``,
``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be tested.
::
    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection, 'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-func…
.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
Yury, thank you for writing PEP 492!
Victor Stinner reviewed the asyncio chapter in my book, Fluent Python
[1], and he saw how much I complained about the overloading of `def`
to define objects as different as functions, generators and
coroutines, and about `yield from` which doesn't say anything to me or
to many others to whom I've talked.
With `async def` and `await` you solve both these problems! Also,
`async with` and `async for` are brilliant. For the first time ever
coroutines will be real first class citizens in Python!
[1] http://shop.oreilly.com/product/0636920032519.do
Alas, the book [1] is entering final production so it won't cover
Python 3.5, but I do hope to see what you propose in Python 3.5, and
if there is a second edition, it will be a pleasure to explain `async`
and `await`!
I did have time to add a paragraph about PEP 492 in a Further reading
section, citing your name as the author.
Cheers,
Luciano
--
Luciano Ramalho
| Author of Fluent Python (O'Reilly, 2015)
| http://shop.oreilly.com/product/0636920032519.do
| Professor em: http://python.pro.br
| Twitter: @ramalhoorg
Hello,
I had an issue today with the `callable` builtin, because it checks
for `__call__` on the type rather than on the instance. Effectively
what `callable(a)` does is `hasattr(type(a), '__call__')`, but that's
not very straightforward. A more straightforward implementation would
do something like `hasattr(a, '__call__')`.
For example:
Python 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600 64
> bit (AMD64)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
> >>> callable
> <built-in function callable>
> >>> class A:
> ... @property
> ... def __call__(self):
> ... raise AttributeError('go away')
> ...
> >>> a = A()
> >>> a
> <__main__.A object at 0x000000000365B5C0>
> >>> a.__call__
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> File "<stdin>", line 4, in __call__
> AttributeError: go away
> >>> callable(a)
> True
> >>> # it should be False :(
>
So it boils down to this:
> >>> hasattr(a, "__call__")
> False
> >>> hasattr(type(a), "__call__")
> True
My issue is that I didn't call `callable(type(a))` but just `callable(a)`.
Clearly mismatching what happens when you do hasattr(a, "__call__").
To put this in contrast, the following is legal and clearly indicates
that descriptors are being used as expected:
>>> class B:
> ... @property
> ... def __call__(self):
> ... return lambda: 1
> ...
> >>> b = B()
> >>> b()
> 1
>
There's some more discussion in issue 23990
<http://bugs.python.org/issue23990>, where I get slightly angry, sorry.
So where is this change actually useful? Proxies! Since the introduction
of new-style classes in Python, you cannot really proxy the callable
aspect of objects, because `callable` just checks that a field is set in
a C struct.
This is fairly inconvenient because you have to know upfront whether your
target is going to be callable or not.
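To illustrate the mismatch, here is a minimal sketch of a helper that looks `__call__` up on the instance rather than on the type (`is_callable_instance` is my own hypothetical name, not anything in the stdlib):

```python
def is_callable_instance(obj):
    # Hypothetical helper: unlike the callable() builtin, which only
    # checks that a slot is set on type(obj), this looks up __call__
    # on the instance, which triggers descriptors such as properties.
    try:
        obj.__call__
    except AttributeError:
        return False
    return True


class A:
    @property
    def __call__(self):
        raise AttributeError('go away')


a = A()
print(callable(a))              # True  -- only type(a) is inspected
print(is_callable_instance(a))  # False -- the property raised AttributeError
```

This mirrors the `hasattr(a, "__call__")` behaviour shown above, which is what a proxy that decides callability dynamically would need.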
Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
Hello everyone,
I'd like to propose a little bit of syntactic sugar: allowing with-blocks to be
followed by except- and finally-blocks just like try-blocks. For example:
with open('spam.txt') as file:
    print(file.read())
except IOError:
    print('No spam here...')
finally:
    print('done.')
This proposed syntax is semantically equivalent to wrapping a with-block within
a try-block like so:
try:
    with open('spam.txt') as file:
        print(file.read())
except IOError:
    print('No spam here...')
finally:
    print('done.')
I see two advantages to the proposed syntax. First and most obviously, it saves
an extra line and an extra indentation level. One line may not be a big deal,
but one indentation level can really affect readability. Second and more
conceptually, it makes sense to think about exception handling in the context of
a with-block. More often than not, if you're using a with-block, you're
expecting that something in that block could throw an exception. Usually
with-blocks are used to make sure resources (e.g. files, database sessions,
mutexes, etc.) are properly closed before the exception is propagated. But I
very often want to do some custom clean-up as well (alert the user, etc.).
Currently that requires wrapping the whole thing in a try-block, but allowing
with-blocks to behave as try-blocks is a more direct way to express what is meant.
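For comparison, something close to the proposed semantics can be emulated today with a small custom context manager (`handling` is a hypothetical helper of my own, not stdlib):

```python
import contextlib


@contextlib.contextmanager
def handling(exc_type, handler, cleanup=None):
    # Hypothetical helper approximating the proposed syntax: run the
    # body, pass any exc_type exception to `handler`, and always run
    # `cleanup` afterwards (the would-be finally-block).
    try:
        yield
    except exc_type as exc:
        handler(exc)
    finally:
        if cleanup is not None:
            cleanup()


# Usage roughly mirroring the spam.txt example:
with handling(IOError, lambda e: print('No spam here...'),
              cleanup=lambda: print('done.')):
    with open('spam.txt') as file:
        print(file.read())
```

Of course this still costs the extra nesting level that the proposal is trying to eliminate, which is rather the point.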
I was curious how often with-blocks are actually wrapped in try-blocks for no
other purpose than catching exceptions raised in the with-block. So I searched
through a number of open source projects looking (roughly) for that pattern:
Project with [1] try-with [2]
============== ======== ============
django 230 17
ipython 541 8
matplotlib 112 3
moinmoin 10 0
numpy 166 1
pillow/pil 1 0
pypy 254 4
scipy 163 2
sqlalchemy 36 0
twisted 72 1
============== ======== ============
total 1585 36 (2.27%)
[1]: grep -Po '^\s*with .*:' **/*.py
[2]: grep -Poz 'try:\s*with .*:' **/*.py
Assuming these projects are representative, about 2% of the with-blocks are
directly wrapped by try-blocks. That's not a huge proportion, but it clearly
shows that this pattern is being used "in the wild". Whether or not it's worth
changing the language for the benefit of 2% of with-blocks is something to
debate though.
What do people think of this idea?
-Kale Kundert
1. Overall I like the proposal very much. However, I have one semantic
remark. You propose `async for` as the syntax for asynchronous
iterators:
async for row in Cursor():
    print(row)
Wouldn't it be more semantically correct to use `await for` instead of
`async for`?
await for row in Cursor():
    print(row)
For me the word 'await' is an indicator that I am awaiting some value
being returned. For example, with a simple `await` expression I am
awaiting data being fetched from the db:
data = await db.fetch('SELECT ...')
When I use an asynchronous iterator I am awaiting a value being
returned as well. For example, I am awaiting (in each iteration) a
row from a cursor. Therefore, it seems natural to me to use the word
'await' instead of 'async'. Furthermore, the syntax 'await for row in
cursor' resembles natural English.
On the other hand, when I use a context manager, I am not awaiting any
value, so the syntax `async with` seems to be proper in that case:
async with session.transaction():
    ...
    await session.update(data)
Dart, for example, goes that way. They use the `await` expression for
awaiting a single Future and the `await for` statement for asynchronous
iterators:
await for (variable declaration in expression) {
    // Executes each time the stream emits a value.
}
2. I would like to go a little beyond this proposal and think about the
composition of async coroutines (aka waiting for multiple coroutines).
For example, C# has the helper functions WhenAll and WhenAny for that:
await Task.WhenAll(tasks_list);
await Task.WhenAny(tasks_list);
In the asyncio module there is a function, asyncio.wait(), which can be
used to achieve a similar result:
asyncio.wait(fs, timeout=None, return_when=ALL_COMPLETED)
asyncio.wait(fs, timeout=None, return_when=FIRST_COMPLETED)
However, after the introduction of `await` its name becomes problematic.
First, it resembles `await` too much and can cause confusion. Second,
its usage would result in an awkward 'await wait':
done, pending = await asyncio.wait(coroutines_list)
results = []
for task in done:
    results.append(task.result())
Another problem with asyncio.wait() is that it returns Tasks, not their
results directly, so the user has to unpack them. There is a function,
asyncio.gather(*coros_or_futures), which returns a list of results
directly, however it can only be used for the ALL_COMPLETED case. There
is also a function asyncio.wait_for() which (unlike asyncio.wait())
unpacks the result, but can only be used for one coroutine (so what is
the difference from the `await` expression?). Finally, there is
asyncio.as_completed(), which returns an iterator for iterating over
coroutine results as they complete (but I don't know exactly how this
iterator relates to the async iterators proposed here).
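For what it's worth, the "iterate over results as they complete" case can already be written with asyncio.as_completed() today, albeit through a plain for-loop over futures rather than an async iterator (`gather_as_done` and `make` are my own illustrative names):

```python
import asyncio


async def gather_as_done(coros):
    # Collect results in completion order using asyncio.as_completed(),
    # today's closest equivalent of the proposed as_done() iterator.
    results = []
    for fut in asyncio.as_completed(coros):
        results.append(await fut)
    return results


async def make(value, delay):
    await asyncio.sleep(delay)
    return value


loop = asyncio.new_event_loop()
out = loop.run_until_complete(
    gather_as_done([make('slow', 0.1), make('fast', 0.01)]))
loop.close()
print(out)  # completion order: ['fast', 'slow']
```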
I can imagine a set of three functions being exposed to the user to
control waiting for multiple coroutines:
asynctools.as_done() # returns an asynchronous iterator for iterating
over the results of coroutines as they complete
asynctools.all_done() # returns a future aggregating results from the
given coroutine objects, which when awaited returns a list of results
(like asyncio.gather())
asynctools.any_done() # returns a future which, when awaited, returns
the result of the first completed coroutine
Example:
from asynctools import as_done, all_done, any_done
corobj0 = async_sql_query("SELECT...")
corobj1 = async_memcached_get("someid")
corobj2 = async_http_get("http://python.org")
# ------------------------------------------------
# Iterate over results as coroutines complete
# using async iterator
await for result in as_done([corobj0, corobj1, corobj2]):
    print(result)
# ------------------------------------------------
# Await for results of all coroutines
# using async iterator
results = []
await for result in as_done([corobj0, corobj1, corobj2]):
    results.append(result)
# or using shorthand coroutine all_done()
results = await all_done([corobj0, corobj1, corobj2])
# ------------------------------------------------
# Await for a result of first completed coroutine
# using async iterator
await for result in as_done([corobj0, corobj1, corobj2]):
    first_result = result
    break
# or using shorthand coroutine any_done()
first_result = await any_done([corobj0, corobj1, corobj2])
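For illustration, the proposed any_done() could probably be sketched on top of today's asyncio.wait() along these lines (asynctools and any_done are the proposal's names, not an existing module; this is a rough sketch, not a polished implementation):

```python
import asyncio


async def any_done(coros):
    # Sketch of the proposed asynctools.any_done(): wait for the first
    # coroutine to complete, cancel the rest, and return the winner's
    # result directly instead of a set of Tasks.
    tasks = [asyncio.ensure_future(c) for c in coros]
    done, pending = await asyncio.wait(
        tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()
    if pending:
        # Let the cancellations propagate before returning.
        await asyncio.wait(pending)
    return next(iter(done)).result()


async def fast():
    return 'fast'


async def slow():
    await asyncio.sleep(1)
    return 'slow'


loop = asyncio.new_event_loop()
print(loop.run_until_complete(any_done([slow(), fast()])))  # fast
loop.close()
```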
I deliberately placed these functions in a new asynctools module, not in
the asyncio module. I find the asyncio module too complicated to expose
to an ordinary user. There are four very similar concepts used in it:
Coroutine (function), Coroutine (object), Future and Task. In
addition, many functions accept both coroutines and Futures in the same
argument, and Task is a subclass of Future -- it makes people very
confused. It is difficult to grasp what the differences between them are
and how they relate to each other. For comparison, in JavaScript there
are only two concepts: async functions and Promises.
(Furthermore, after this PEP is accepted there will be a fifth concept:
old-style coroutines. And there are also concurrent.futures.Futures...)
Personally, I think that asyncio module should be refactored and broken
into two separate modules, named for example:
- asyncloop # containing low-level loop-related things, mostly not
intended to be used by the average user (apart from get_event_loop() and
run_until_xxx())
- asynctools # containing high-level helper functions, like those
described above
Since with this PEP async/await will become a first-class member of the
Python environment, all the remaining high-level functions should, in my
opinion, be moved from asyncio to the appropriate modules, like socket
or subprocess. These are the places where users will be looking for
them. For example:
socket.socket.recv()
socket.socket.recv_async()
socket.socket.sendall()
socket.socket.sendall_async()
socket.getaddrinfo()
socket.getaddrinfo_async()
Finally, concurrent.futures should either be renamed to avoid the usage
of the word 'future', or be made compatible with async/await.
I know that I went far beyond scope of this PEP, but I think that these
are the issues which will pop up after acceptance of this PEP sooner or
later.
Finally, I repeat my proposal from the beginning of this email: to use
`await for` instead of `async for` for asynchronous iterators.
What's your opinion on that?
Piotr
I've posted a new draft of PEP 484, but in order to reach the right
audience from now on I'm posting to python-dev. See you there! Please don't
reply here, it's best to have all discussion in one place.
--
--Guido van Rossum (python.org/~guido)