async/await in Python; v2

Hi python-dev,

I'm moving the discussion from python-ideas to here. The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.

Updates:

1. CO_ASYNC flag was renamed to CO_COROUTINE;

2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper();

3. New function: sys.get_coroutine_wrapper();

4. types.async_def() renamed to types.coroutine();

5. New section highlighting differences from PEP 3152;

6. New AST node - AsyncFunctionDef; the proposal is now 100% backwards compatible;

7. A new section clarifying that coroutine-generators are not part of the current proposal;

8. Various small edits/typo fixes.

There is a bug tracker issue to track the code review of the reference implementation (Victor Stinner is doing the review): http://bugs.python.org/issue24017

While the PEP isn't accepted yet, we want to make sure that the reference implementation is ready when such a decision is made.

Let's discuss some open questions:

1. Victor raised the question of whether we should move the coroutine() function from the 'types' module to 'functools'. My opinion is that 'types' is a better place for 'coroutine()', since it adjusts the type of the passed generator; 'functools' is about 'partial', 'lru_cache' and 'wraps' kinds of things.

2. I propose to disallow using 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc. on coroutines. It's possible by modifying PyObject_GetIter to raise an exception if it receives a coroutine object. 'yield from' can also be modified to only accept coroutine objects if it is called from a generator with the CO_COROUTINE flag. This will further separate coroutines from generators, making it harder to screw something up by accident. I have a branch of the reference implementation where this is implemented: https://github.com/1st1/cpython/tree/await_noiter  I did not observe any performance drop.
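As a sketch of what point 2 means in practice (this is how released CPython eventually behaved; the snippet below is illustrative and not taken from the await_noiter branch):

```python
# Once coroutine objects are separated from generators, passing one to
# iter(), next() or a for-loop raises a TypeError instead of silently
# iterating.  Runnable on Python 3.5+.
async def coro():
    return 'spam'

c = coro()
try:
    iter(c)
except TypeError as exc:
    message = str(exc)

print(message)   # "'coroutine' object is not iterable"
c.close()        # close it to avoid a "never awaited" RuntimeWarning
```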
There is just one possible backwards compatibility issue here: there will be an exception if some user of asyncio actually used to iterate over generators decorated with @coroutine. But I can't imagine why someone would do that, and even if they did -- it's probably a bug or wrong usage of asyncio.

That's it! I'd be happy to hear some feedback!

Thanks,
Yury


PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov@sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015

Abstract
========

This PEP introduces new syntax for coroutines, asynchronous ``with`` statements and ``for`` loops. The main motivation behind this proposal is to streamline writing and maintaining asynchronous code, as well as to simplify previously hard-to-implement code patterns.

Rationale and Goals
===================

Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the ``yield from`` syntax introduced in PEP 380. This approach has a number of shortcomings:

* it is easy to confuse coroutines with regular generators, since they share the same syntax; async libraries often attempt to alleviate this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);

* it is not possible to natively define a coroutine which has no ``yield`` or ``yield from`` statements, again requiring the use of decorators to fix potential refactoring issues;

* support for asynchronous calls is limited to expressions where ``yield`` is allowed syntactically, limiting the usefulness of syntactic features, such as ``with`` and ``for`` statements.

This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library.
This also enables linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new ``async with`` statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new ``async for`` statement makes it possible to perform asynchronous calls in iterators.

Specification
=============

This proposal introduces new syntax and semantics to enhance coroutine support in Python; it does not change the internal implementation of coroutines, which are still based on generators.

It is strongly suggested that the reader understands how coroutines are implemented in Python (PEP 342 and PEP 380). It is also recommended to read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions).

From this point in this document we use the word *coroutine* to refer to functions declared using the new syntax. *Generator-based coroutine* is used where necessary to refer to coroutines that are based on generator syntax.

New Coroutine Declaration Syntax
--------------------------------

The following new syntax is used to declare a coroutine::

    async def read_data(db):
        pass

Key properties of coroutines:

* ``async def`` functions are always coroutines, even if they do not contain ``await`` expressions.

* It is a ``SyntaxError`` to have ``yield`` or ``yield from`` expressions in an ``async`` function.

* Internally, a new code object flag - ``CO_COROUTINE`` - is introduced to enable runtime detection of coroutines (and migrating existing code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` flags set.

* Regular generators, when called, return a *generator object*; similarly, coroutines return a *coroutine object*.

* ``StopIteration`` exceptions are not propagated out of coroutines, and are replaced with a ``RuntimeError``.
  For regular generators such behavior requires a future import (see PEP 479).

types.coroutine()
-----------------

A new function ``coroutine(gen)`` is added to the ``types`` module. It applies the ``CO_COROUTINE`` flag to the passed generator function's code object, making it return a *coroutine object* when called.

This feature enables an easy upgrade path for existing libraries.

Await Expression
----------------

The following new ``await`` expression is used to obtain the result of a coroutine's execution::

    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...

``await``, similarly to ``yield from``, suspends the execution of the ``read_data`` coroutine until the ``db.fetch`` *awaitable* completes and returns the result data.

It uses the ``yield from`` implementation with an extra step of validating its argument. ``await`` only accepts an *awaitable*, which can be one of:

* A *coroutine object* returned from a *coroutine* or a generator decorated with ``types.coroutine()``.

* An object with an ``__await__`` method returning an iterator.

  Any ``yield from`` chain of calls ends with a ``yield``. This is a fundamental mechanism of how *Futures* are implemented. Since, internally, coroutines are a special kind of generator, every ``await`` is suspended by a ``yield`` somewhere down the chain of ``await`` calls (please refer to PEP 3156 for a detailed explanation).

  To enable this behavior for coroutines, a new magic method called ``__await__`` is added. In asyncio, for instance, to enable Future objects in ``await`` statements, the only change is to add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class.

  Objects with an ``__await__`` method are called *Future-like* objects in the rest of this PEP.

  Also, please note that the ``__aiter__`` method (see its definition below) cannot be used for this purpose. It is a different protocol, and would be like using ``__iter__`` instead of ``__call__`` for regular callables.

It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
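To make the ``__await__`` mechanics above concrete, here is a minimal, hypothetical Future-like object driven by hand the way an event loop would drive it (the ``Result`` class and the variable names are illustrative, not part of the PEP):

```python
class Result:
    """A minimal Future-like object: __await__ returns an iterator."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        yield              # suspend the awaiting coroutine once
        return self.value  # then resume it with this value

async def read_data():
    data = await Result(41)
    return data + 1

# Drive the coroutine manually, as an event loop would.
coro = read_data()
coro.send(None)            # runs until the yield inside __await__
try:
    coro.send(None)        # resumes; the coroutine finishes
except StopIteration as exc:
    result = exc.value     # a coroutine's return value travels
                           # in StopIteration.value

print(result)              # -> 42
```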
Asynchronous Context Managers and "async with"
----------------------------------------------

An *asynchronous context manager* is a context manager that is able to suspend execution in its *enter* and *exit* methods.

To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: ``__aenter__`` and ``__aexit__``. Both must return an *awaitable*.

An example of an asynchronous context manager::

    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')

New Syntax
''''''''''

A new statement for asynchronous context managers is proposed::

    async with EXPR as VAR:
        BLOCK

which is semantically equivalent to::

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise

    finally:
        if exc:
            await aexit(mgr, None, None, None)

As with regular ``with`` statements, it is possible to specify multiple context managers in a single ``async with`` statement.

It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.

Example
'''''''

With asynchronous context managers it is easy to implement proper database transaction managers for coroutines::

    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...

Code that needs locking also looks lighter::

    async with lock:
        ...

instead of::

    with (yield from lock):
        ...

Asynchronous Iterators and "async for"
--------------------------------------

An *asynchronous iterable* is able to call asynchronous code in its *iter* implementation, and an *asynchronous iterator* can call asynchronous code in its *next* method. To support asynchronous iteration:

1. An object must implement an ``__aiter__`` method returning an *awaitable* resulting in an *asynchronous iterator object*.

2. An *asynchronous iterator object* must implement an ``__anext__`` method returning an *awaitable*.

3. To stop iteration, ``__anext__`` must raise a ``StopAsyncIteration`` exception.

An example of an asynchronous iterable::

    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...

New Syntax
''''''''''

A new statement for iterating through asynchronous iterators is proposed::

    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2

which is semantically equivalent to::

    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2

It is an error to pass a regular iterable without an ``__aiter__`` method to ``async for``. It is a ``SyntaxError`` to use ``async for`` outside of a coroutine.

As with the regular ``for`` statement, ``async for`` has an optional ``else`` clause.

Example 1
'''''''''

With the asynchronous iteration protocol it is possible to asynchronously buffer data during iteration::

    async for data in cursor:
        ...

Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows of data from a database after every ``N`` iterations.

The following code illustrates the new asynchronous iteration protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()

then the ``Cursor`` class can be used as follows::

    async for row in Cursor():
        print(row)

which would be equivalent to the following code::

    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)

Example 2
'''''''''

The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators.

::

    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for item in AsyncIteratorWrapper("abc"):
        print(item)

Why StopAsyncIteration?
'''''''''''''''''''''''

Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'

and

::

    def g2():
        yield from fut
        raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its ``StopIteration`` wrapped into a ``RuntimeError``

::

    async def a1():
        await fut
        raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is to raise something other than ``StopIteration``. Therefore, a new built-in exception class ``StopAsyncIteration`` was added.

Moreover, with the semantics from PEP 479, all ``StopIteration`` exceptions raised in coroutines are wrapped in ``RuntimeError``.
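The wrapper from Example 2 can be exercised end-to-end with a small runnable sketch. Note one divergence from the draft above: in the implementation that eventually shipped (Python 3.5.2+), ``__aiter__`` was changed to a plain method returning the iterator directly, which is what this snippet uses so it runs on modern Pythons:

```python
import asyncio

class AsyncIteratorWrapper:
    """Adapts a regular iterable to the asynchronous iteration protocol."""
    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):            # plain method in released Pythons
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            # Translate the end-of-iteration signal, per the PEP.
            raise StopAsyncIteration

async def collect():
    items = []
    async for item in AsyncIteratorWrapper('abc'):
        items.append(item)
    return items

items = asyncio.run(collect())
print(items)                        # -> ['a', 'b', 'c']
```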
Debugging Features
------------------

One of the most frequent mistakes that people make when using generators as coroutines is forgetting to use ``yield from``::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1) # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in asyncio, in which the ``@coroutine`` decorator wraps all functions with a special object with a destructor logging a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, a stack trace of where it was collected, etc. The wrapper object also provides a convenient ``__repr__`` function with detailed information about the generator.

The only problem is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, the ``@coroutine`` decorator makes the decision of whether to wrap or not to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is possible to run asyncio programs with asyncio's own functions instrumented. ``EventLoop.set_debug``, a different debug facility, has no impact on the ``@coroutine`` decorator's behavior.

With this proposal, coroutines are a native concept, distinct from generators. New functions ``set_coroutine_wrapper`` and ``get_coroutine_wrapper`` are added to the ``sys`` module, with which frameworks can provide advanced debugging facilities.

It is also important to make coroutines as fast and efficient as possible, therefore there are no debug features enabled by default.

Example::

    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.AsyncDebugWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me() # <- this line will likely GC the coroutine object and
               # trigger AsyncDebugWrapper's code.

    assert isinstance(debug_me(), AsyncDebugWrapper)

    sys.set_coroutine_wrapper(None) # <- this unsets any
                                    #    previously set wrapper

    assert not isinstance(debug_me(), AsyncDebugWrapper)

If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper replaces the previous one. ``sys.set_coroutine_wrapper(None)`` unsets the wrapper.

Glossary
========

:Coroutine:
    A coroutine function, or just "coroutine", is declared with ``async def``. It uses ``await`` and ``return value``; see `New Coroutine Declaration Syntax`_ for details.

:Coroutine object:
    Returned from a coroutine function. See `Await Expression`_ for details.

:Future-like object:
    An object with an ``__await__`` method. Can be consumed by an ``await`` expression in a coroutine. A coroutine waiting for a Future-like object is suspended until the Future-like object's ``__await__`` completes, and returns the result. See `Await Expression`_ for details.

:Awaitable:
    A *Future-like* object or a *coroutine object*. See `Await Expression`_ for details.

:Generator-based coroutine:
    A coroutine based on generator syntax. The most common example is ``@asyncio.coroutine``.

:Asynchronous context manager:
    An asynchronous context manager has ``__aenter__`` and ``__aexit__`` methods and can be used with ``async with``. See `Asynchronous Context Managers and "async with"`_ for details.

:Asynchronous iterable:
    An object with an ``__aiter__`` method, which must return an *asynchronous iterator* object. Can be used with ``async for``. See `Asynchronous Iterators and "async for"`_ for details.

:Asynchronous iterator:
    An asynchronous iterator has an ``__anext__`` method. See `Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================

================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================

Where:

* "async def func": coroutine;

* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined with the ``async`` keyword;

* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined without the ``async`` keyword, must return an *awaitable*;

* "def __await__": ``__await__`` method to implement *Future-like* objects;

* generator: a "regular" generator, a function defined with ``def`` that contains at least one ``yield`` or ``yield from`` expression.

Transition Plan
===============

To avoid backwards compatibility issues with the ``async`` and ``await`` keywords, it was decided to modify ``tokenizer.c`` in such a way that it:

* recognizes the ``async def`` name token combination (start of a coroutine);

* keeps track of regular functions and coroutines;

* replaces the ``'async'`` token with ``ASYNC`` and the ``'await'`` token with ``AWAIT`` when in the process of yielding tokens for coroutines.

This approach allows for a seamless combination of new syntax features (all of them available only in ``async`` functions) with any existing code.

An example of having "async def" and an "async" attribute in one piece of code::

    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'

Backwards Compatibility
-----------------------

This proposal preserves 100% backwards compatibility.
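The "Can't contain" column of the table above can be checked empirically: for example, ``yield from`` inside ``async def`` is rejected at compile time. (A small sketch; note that later Python versions, 3.6+, do allow bare ``yield`` in ``async def`` to create asynchronous generators, but ``yield from`` remains a ``SyntaxError`` there.)

```python
# Compiling an async function containing 'yield from' must fail.
src = '''
async def f():
    yield from range(3)
'''

try:
    compile(src, '<pep492-check>', 'exec')
    raised = False
except SyntaxError:
    raised = True

print(raised)   # -> True
```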
Grammar Updates
---------------

Grammar changes are also fairly minimal::

    await_expr: AWAIT test
    await_stmt: await_expr

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                    | funcdef | classdef | decorated | async_stmt)

    atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
           '[' [testlist_comp] ']' |
           '{' [dictorsetmaker] '}' |
           NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')

    expr_stmt: testlist_star_expr (augassign (yield_expr|await_expr|testlist) |
                       ('=' (yield_expr|await_expr|testlist_star_expr))*)

Transition Period Shortcomings
------------------------------

There is just one. Until ``async`` and ``await`` become proper keywords, it is not possible (or at least very hard) to fix ``tokenizer.c`` to recognize them on the **same line** with the ``def`` keyword::

    # async and await will always be parsed as variables

    async def outer():                             # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)            # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME`` tokens, a ``SyntaxError`` will be raised.

To work around these issues, the above examples can be easily rewritten to a more readable form::

    async def outer():                             # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                               # 2
        return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are proper keywords, or if it's decided to use a future import for this PEP.

Deprecation Plans
-----------------

``async`` and ``await`` names will be softly deprecated in CPython 3.5 and 3.6. In 3.7 we will transform them into proper keywords. Making ``async`` and ``await`` proper keywords before 3.7 might make it harder for people to port their code to Python 3.

asyncio
-------

The ``asyncio`` module was adapted and tested to work with coroutines and the new statements. Backwards compatibility is 100% preserved.
The required changes are mainly:

1. Modify the ``@asyncio.coroutine`` decorator to use the new ``types.coroutine()`` function.

2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class.

3. Add ``ensure_task()`` as an alias for the ``async()`` function. Deprecate the ``async()`` function.

Design Considerations
=====================

PEP 3152
--------

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines (called "cofunctions"). Some key points:

1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is always a generator, even if there is no ``cocall`` expression inside it. Maps to ``async def`` in this proposal.

2. A new keyword ``cocall`` to call a *cofunction*. Can only be used inside a *cofunction*. Maps to ``await`` in this proposal (with some differences, see below).

3. It is not possible to call a *cofunction* without the ``cocall`` keyword.

4. ``cocall`` grammatically requires parentheses after it::

    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME

5. ``cocall f(*args, **kwds)`` is semantically equivalent to ``yield from f.__cocall__(*args, **kwds)``.

Differences from this proposal:

1. There is no equivalent of ``__cocall__`` in this PEP, which is called and its result passed to ``yield from`` in the ``cocall`` expression. The ``await`` keyword expects an *awaitable* object, validates the type, and executes ``yield from`` on it. The ``__await__`` method is similar to ``__cocall__``, but it is only used to define *Future-like* objects.

2. ``await`` is defined in almost the same way as ``yield from`` in the grammar (it is later enforced that ``await`` can only be inside ``async def``). It is possible to simply write ``await future``, whereas ``cocall`` always requires parentheses.

3. To make asyncio work with PEP 3152 it would be required to modify the ``@asyncio.coroutine`` decorator to wrap all functions in an object with a ``__cocall__`` method.
   To call *cofunctions* from existing generator-based coroutines it would be required to use the ``costart`` built-in. In this proposal ``@asyncio.coroutine`` simply sets ``CO_COROUTINE`` on the wrapped function's code object and everything works automatically.

4. Since it is impossible to call a *cofunction* without the ``cocall`` keyword, it automatically prevents the common mistake of forgetting to use ``yield from`` on generator-based coroutines. This proposal addresses this problem with a different approach, see `Debugging Features`_.

5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine is that if it is decided to implement coroutine-generators -- coroutines with ``yield`` or ``async yield`` expressions -- we wouldn't need a ``cocall`` keyword to call them. So we'll end up having ``__cocall__`` and no ``__call__`` for regular coroutines, and having ``__call__`` and no ``__cocall__`` for coroutine-generators.

6. There are no equivalents of ``async for`` and ``async with`` in PEP 3152.

Coroutine-generators
--------------------

With the ``async for`` keyword it is desirable to have a concept of a *coroutine-generator* -- a coroutine with ``yield`` and ``yield from`` expressions. To avoid any ambiguity with regular generators, we would likely require an ``async`` keyword before ``yield``, and ``async yield from`` would raise a ``StopAsyncIteration`` exception.

While it is possible to implement coroutine-generators, we believe that they are out of scope of this proposal. It is an advanced concept that should be carefully considered and balanced, with non-trivial changes in the implementation of current generator objects. This is a matter for a separate PEP.

No implicit wrapping in Futures
-------------------------------

There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A key difference is that JavaScript "async functions" always return a Promise.
While this approach has some advantages, it also implies that a new Promise object is created on each "async function" invocation.

We could implement similar functionality in Python by wrapping all coroutines in a Future object, but this has the following disadvantages:

1. Performance. A new Future object would be instantiated on each coroutine call. Moreover, this makes the implementation of ``await`` expressions slower (disabling optimizations of ``yield from``).

2. A new built-in ``Future`` object would need to be added.

3. Coming up with a generic ``Future`` interface that is usable for any use case in any framework is a very hard problem to solve.

4. It is not a feature that is used frequently when most of the code is coroutines.

Why "async" and "await" keywords
--------------------------------

async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* proposal to add async/await in ECMAScript 7 [2]_; see also the Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with async/await, and because it makes working with many languages in one project easier (Python with ECMAScript 7, for instance).

Why "__aiter__" is a coroutine
------------------------------

In principle, ``__aiter__`` could be a regular function. There are several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` methods are coroutines, users would often make the mistake of defining it as ``async`` anyway;

* there might be a need to run some asynchronous operations in ``__aiter__``, for instance to prepare DB queries or do some file operations.
Importance of "async" keyword
-----------------------------

While it is possible to just implement the ``await`` expression and treat all functions with at least one ``await`` as coroutines, this approach makes API design, code refactoring, and long-term support harder.

Let's pretend that Python only has the ``await`` keyword::

    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()

If the ``useful()`` function is refactored and someone removes all ``await`` expressions from it, it would become a regular Python function, and all code that depends on it, including ``important()``, would be broken. To mitigate this issue a decorator similar to ``@asyncio.coroutine`` has to be introduced.

Why "async def"
---------------

To some people the bare ``async name(): pass`` syntax might look more appealing than ``async def name(): pass``. It is certainly easier to type. But on the other hand, it breaks the symmetry between ``async def``, ``async with`` and ``async for``, where ``async`` is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar.

Why not a __future__ import
---------------------------

``__future__`` imports are inconvenient and easy to forget to add. Also, they are enabled for the whole source file. Consider that there is a big project with a popular module named "async.py". With future imports it would be required to either import it using ``__import__()`` or ``importlib.import_module()`` calls, or to rename the module. The proposed approach makes it possible to continue using old code and modules without a hassle, while coming up with a migration plan for future Python versions.

Why magic methods start with "a"
--------------------------------

New asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". An alternative proposal is to use the "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``.
However, to align the new magic methods with the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided to use a shorter version.

Why not reuse existing magic names
----------------------------------

An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations::

    class CM:
        async def __enter__(self): # instead of __aenter__
            ...

This approach has the following downsides:

* it would not be possible to create an object that works in both ``with`` and ``async with`` statements;

* it would look confusing and would require some implicit magic behind the scenes in the interpreter;

* one of the main points of this proposal is to make coroutines as simple and foolproof as possible.

Comprehensions
--------------

For the sake of restricting the broadness of this PEP there is no new syntax for asynchronous comprehensions. This should be considered in a separate PEP, if there is a strong demand for this feature.

Async lambdas
-------------

Lambda coroutines are not part of this proposal. In this proposal they would look like ``async lambda(parameters): expression``. Unless there is a strong demand to have them as part of this proposal, it is recommended to consider them later in a separate PEP.

Performance
===========

Overall Impact
--------------

This proposal introduces no observable performance impact.
Here is the output of python's official set of benchmarks [4]_:

::

    python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show them:
    django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
    fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.

Tokenizer modifications
-----------------------

There is no observable slowdown of parsing python files with the modified tokenizer: parsing of one 12Mb file (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount of time.

async/await
-----------

The following micro-benchmark was used to determine the performance difference between "async" functions and generators::

    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1-t0))

The result is that there is no observable performance difference. Minimum timing of 3 runs:

::

    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s

Note that a depth of 19 means 1,048,575 calls.

Reference Implementation
========================

The reference implementation can be found here: [3]_.

List of high-level changes and new protocols
--------------------------------------------

1. New syntax for defining coroutines: ``async def`` and the new ``await`` keyword.

2. New ``__await__`` method for Future-like objects.

3. New syntax for asynchronous context managers: ``async with``. And an associated protocol with ``__aenter__`` and ``__aexit__`` methods.

4. New syntax for asynchronous iteration: ``async for``. And an associated protocol with ``__aiter__``, ``__anext__`` and a new built-in exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, ``Await``.

6. New functions: ``sys.set_coroutine_wrapper(callback)``, ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``.

7. New ``CO_COROUTINE`` bit flag for code objects.

While the list of changes and new things is not short, it is important to understand that most users will not use these features directly. They are intended to be used in frameworks and libraries to provide users with convenient-to-use and unambiguous APIs with ``async def``, ``await``, ``async for`` and ``async with`` syntax.

Working example
---------------

All concepts proposed in this PEP are implemented [3]_ and can be tested.

::

    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()

References
==========

.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine

.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions

.. [3] https://github.com/1st1/cpython/tree/await

.. [4] https://hg.python.org/benchmarks

.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx

.. [6] http://docs.hhvm.com/manual/en/hack.async.php

.. [7] https://www.dartlang.org/articles/await-async/

.. [8] http://docs.scala-lang.org/sips/pending/async.html

.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-funct...

.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)

Acknowledgments
===============

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew Svetlov, and Łukasz Langa for their initial feedback.

Copyright
=========

This document has been placed in the public domain.

Yury Selivanov wrote:
1. CO_ASYNC flag was renamed to CO_COROUTINE;
2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper();
3. New function: sys.get_coroutine_wrapper();
4. types.async_def() renamed to types.coroutine();
I still don't like the idea of hijacking the generic term "coroutine" and using it to mean this particular type of object.
2. I propose to disallow using 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc. on coroutines.
PEP 3152 takes care of this automatically from the fact that you can't make an ordinary call to a cofunction, and cocall combines a call and a yield-from. You have to go out of your way to get hold of the underlying iterator to use in a for-loop, etc. -- Greg
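The restriction discussed above can be sketched in pure Python. The real change would live in CPython's PyObject_GetIter; the wrapper class below is an invented illustration and only models the intended user-visible behavior: send()/throw() keep working while any attempt to iterate fails.

```python
# Pure-Python model of the proposed restriction. The real implementation
# would modify PyObject_GetIter in CPython; this wrapper class is invented
# here only to show the intended user-visible behavior.

class CoroutineObject:
    """Wraps a generator-iterator, exposing send/throw but not iteration."""

    def __init__(self, gen):
        self._gen = gen

    def send(self, value):
        return self._gen.send(value)

    def throw(self, *exc_info):
        return self._gen.throw(*exc_info)

    def close(self):
        return self._gen.close()

    def __iter__(self):
        # What PyObject_GetIter would do for an object with CO_COROUTINE set:
        raise TypeError('coroutine objects are not iterable')


def coro():
    yield 1
    yield 2

c = CoroutineObject(coro())
print(c.send(None))      # 1 -- the coroutine protocol still works

try:
    list(c)              # for-loops, list(), tuple(), iter() all fail here
except TypeError as exc:
    print(exc)           # coroutine objects are not iterable
```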

Hi Greg, On 2015-04-22 2:05 AM, Greg Ewing wrote:
Yury Selivanov wrote:
1. CO_ASYNC flag was renamed to CO_COROUTINE;
2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper();
3. New function: sys.get_coroutine_wrapper();
4. types.async_def() renamed to types.coroutine();
I still don't like the idea of hijacking the generic term "coroutine" and using it to mean this particular type of object.
In my opinion it's OK. We propose a new dedicated syntax to define coroutines. The old approach of using generators to implement coroutines will eventually become obsolete. That's why, in PEP 492, we call 'async def' functions "coroutines", and the ones that are defined with generators "generator-based coroutines". You can also have "greenlets-based coroutines" and "stackless coroutines", but those aren't native Python concepts. I'm not sure if the term "cofunctions" can be applied to coroutines in PEP 492. I guess we can call them "async functions".
2. I propose to disallow using 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc. on coroutines.
PEP 3152 takes care of this automatically from the fact that you can't make an ordinary call to a cofunction, and cocall combines a call and a yield-from. You have to go out of your way to get hold of the underlying iterator to use in a for-loop, etc.
On the one hand I like your idea to disallow calling coroutines without a special keyword (await in the case of PEP 492). It has downsides, but there is some elegance in it. On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me. I'd be happy to hear others' opinions on this topic. Thanks! Yury

On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
On the one hand I like your idea to disallow calling coroutines without a special keyword (await in the case of PEP 492). It has downsides, but there is some elegance in it. On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me.
I'd be happy to hear others' opinions on this topic.
I'm slowly warming up to Greg's notion that you can't call a coroutine (or whatever it's called) without a special keyword. This makes a whole class of bugs obvious the moment the code is executed. OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function (perhaps to be renamed to ensure_task() to make way for the async keyword). -- --Guido van Rossum (python.org/~guido)

Hi Guido, On 2015-04-22 11:50 AM, Guido van Rossum wrote:
On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
On the one hand I like your idea to disallow calling coroutines without a special keyword (await in the case of PEP 492). It has downsides, but there is some elegance in it. On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me.
I'd be happy to hear others' opinions on this topic.
I'm slowly warming up to Greg's notion that you can't call a coroutine (or whatever it's called) without a special keyword. This makes a whole class of bugs obvious the moment the code is executed.
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function (perhaps to be renamed to ensure_task() to make way for the async keyword).
If we apply Greg's ideas to PEP 492 we will have the following (Greg, correct me if I'm wrong):

1. The '_typeobject' struct will get a new field 'tp_await'. We can probably reuse 'tp_reserved' for that.

2. We'll hack Gen(/ceval.c?) objects to raise an error if they are called directly and have a 'CO_COROUTINE' flag.

3. Task(), create_task() and async() will be modified to call 'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag.

4. 'await' will require parentheses grammatically. That will make it different from the 'yield' expression. For instance, I still don't know what 'await coro(123)()' would mean.

5. 'await foo(*a, **k)' will be equivalent to 'yield from type(coro).__await__(coro, *a, **k)'.

6. If we ever decide to implement coroutine-generators -- async def functions with 'await' *and* some form of 'yield' -- we'll need to reverse the rule: allow __call__ and disallow __await__ on such objects (so that you'll be able to write 'async for item in coro_gen()' instead of 'async for item in await coro_gen()').

To be honest, that's a lot of steps and hacks to make this concept work. I think that 'set_coroutine_wrapper()' solves all these problems while keeping the grammar and implementation simpler. Moreover, it allows adding some additional features to the wrapped coroutines, such as a nicer repr() in debug mode (CoroWrapper in asyncio already does that) and other runtime checks.

Thanks, Yury

I'd like to suggest another way around some of the issues here, with apologies if this has already been discussed sometime in the past.
From the viewpoint of a Python programmer, there are two distinct reasons for wanting to suspend execution in a block of code:
1. To yield a value from an iterator, as Python generators do today.

2. To cede control to the event loop while waiting for an asynchronous task to make progress in a coroutine.

As of today both of these reasons to suspend are supported by the same underlying mechanism, i.e. a "yield" at the end of the chain of "yield from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom of it all there's effectively still a yield as I understand it.

I think that the fact that these two concepts use the same mechanism is what leads to the issues with coroutine-generators that Greg and Yury have raised.

With that in mind, would it be possible to introduce a second form of suspension to Python to specifically handle the case of ceding to the event loop? I don't know what the implementation complexity of this would be, or if it's even feasible. But roughly speaking, the syntax for this could use "await", and code would look just like it does in the PEP. The semantics of "await <Task>" would be analogous to "yield from <Task>" today, with the difference that the Task would go up the chain of "await"s to the outermost caller, which would typically be asyncio, with some modifications from its form today. Progress would be made via __anext__ instead of __next__.

Again, this might be impossible to do, but the mental model for the Python programmer becomes cleaner, I think. Most of the issues around combining generators and coroutines would go away - you could freely use "await" inside a generator since it cedes control to the event loop, not the caller of the generator. All of the "async def"/"await" examples in PEP 492 would work as is. It might also make it easier in the future to add support for async calls inside __getattr__ etc.

Thanks for reading! Rajiv
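The two kinds of suspension described above can be seen side by side with today's machinery. In the sketch below, the Sleep class and the hand-rolled driver are illustrative stand-ins for asyncio's Future and event loop, not real asyncio APIs:

```python
# Reason 1: a generator yields values *to its caller*.
def counter(n):
    for i in range(n):
        yield i

# Reason 2: a coroutine yields *through* its callers to whatever drives it.
# Sleep is an illustrative stand-in for asyncio's Future: an object that
# travels up the whole chain of "yield from"s to the event loop.
class Sleep:
    def __init__(self, seconds):
        self.seconds = seconds

    def __iter__(self):
        result = yield self       # only the outermost driver sees this object
        return result             # value the driver sends back

def task():
    got = yield from Sleep(0)     # suspension reaches the outermost driver
    return got

print(list(counter(3)))           # [0, 1, 2] -- consumed directly

# A toy "event loop" driving the task:
t = task()
request = t.send(None)            # the Sleep object surfaces here, not in task
assert isinstance(request, Sleep)
try:
    t.send('woke up')             # the driver resumes the suspended task
except StopIteration as e:
    print(e.value)                # woke up
```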

Hi Rajiv, On 2015-04-22 12:53 PM, Rajiv Kumar wrote:
I'd like to suggest another way around some of the issues here, with apologies if this has already been discussed sometime in the past.
From the viewpoint of a Python programmer, there are two distinct reasons for wanting to suspend execution in a block of code:
1. To yield a value from an iterator, as Python generators do today.
2. To cede control to the event loop while waiting for an asynchronous task to make progress in a coroutine.
As of today both of these reasons to suspend are supported by the same underlying mechanism, i.e. a "yield" at the end of the chain of "yield from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom of it all there's effectively still a yield as I understand it.
I think that the fact that these two concepts use the same mechanism is what leads to the issues with coroutine-generators that Greg and Yury have raised.
With that in mind, would it be possible to introduce a second form of suspension to Python to specifically handle the case of ceding to the event loop? I don't know what the implementation complexity of this would be, or if it's even feasible. But roughly speaking, the syntax for this could use "await", and code would look just like it does in the PEP. The semantics of "await <Task>" would be analogous to "yield from <Task>" today, with the difference that the Task would go up the chain of "await"s to the outermost caller, which would typically be asyncio, with some modifications from its form today. Progress would be made via __anext__ instead of __next__.
I think that what you propose is a great idea. However, its implementation will be far more invasive than what PEP 492 proposes. I doubt that we'll be able to make it in 3.5 if we choose this route.

BUT: With my latest proposal to disallow for..in loops and iter()/list()-like builtins, the fact that coroutines are based internally on generators is just an implementation detail. There is no way users can exploit the underlying generator object. Coroutine-objects only provide 'send()' and 'throw()' methods, which they would also have with your implementation idea.

This gives us freedom to consider your approach in 3.6 if we decide to add coroutine-generators. To make this work we might want to patch inspect.py to make the isgenerator() family of functions return False for coroutines/coroutine-objects.

Thanks a lot for the feedback! Yury

For now I can mix asyncio.coroutines and `async def` functions: I can write `await f()` inside an async def to call asyncio.coroutine `f`, and vice versa I can use `yield from g()` inside an asyncio.coroutine to call `async def g(): ...`.

If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.

On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Rajiv,
On 2015-04-22 12:53 PM, Rajiv Kumar wrote:
I'd like to suggest another way around some of the issues here, with apologies if this has already been discussed sometime in the past.
From the viewpoint of a Python programmer, there are two distinct reasons for wanting to suspend execution in a block of code:
1. To yield a value from an iterator, as Python generators do today.
2. To cede control to the event loop while waiting for an asynchronous task to make progress in a coroutine.
As of today both of these reasons to suspend are supported by the same underlying mechanism, i.e. a "yield" at the end of the chain of "yield from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom of it all there's effectively still a yield as I understand it.
I think that the fact that these two concepts use the same mechanism is what leads to the issues with coroutine-generators that Greg and Yury have raised.
With that in mind, would it be possible to introduce a second form of suspension to Python to specifically handle the case of ceding to the event loop? I don't know what the implementation complexity of this would be, or if it's even feasible. But roughly speaking, the syntax for this could use "await", and code would look just like it does in the PEP. The semantics of "await <Task>" would be analogous to "yield from <Task>" today, with the difference that the Task would go up the chain of "await"s to the outermost caller, which would typically be asyncio, with some modifications from its form today. Progress would be made via __anext__ instead of __next__.
I think that what you propose is a great idea. However, its implementation will be far more invasive than what PEP 492 proposes. I doubt that we'll be able to make it in 3.5 if we choose this route.
BUT: With my latest proposal to disallow for..in loops and iter()/list()-like builtins, the fact that coroutines are based internally on generators is just an implementation detail.
There is no way users can exploit the underlying generator object. Coroutine-objects only provide 'send()' and 'throw()' methods, which they would also have with your implementation idea.
This gives us freedom to consider your approach in 3.6 if we decide to add coroutine-generators. To make this work we might want to patch inspect.py to make isgenerator() family of functions to return False for coroutines/coroutine-objects.
Thanks a lot for the feedback!
Yury
-- Thanks, Andrew Svetlov

Andrew, On 2015-04-22 2:32 PM, Andrew Svetlov wrote:
For now I can mix asyncio.coroutines and `async def` functions: I can write `await f()` inside an async def to call asyncio.coroutine `f`, and vice versa I can use `yield from g()` inside an asyncio.coroutine to call `async def g(): ...`.
That's another good point that I forgot to add to the list. Thanks for bringing this up.
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.
You'll have to use a wrapper that will do the following:

    async def foo():
        return 'spam'

    @asyncio.coroutine
    def bar():
        what = yield from foo.__await__(foo, *args, **kwargs)
        # OR:
        what = yield from await_call(foo, *args, **kwargs)

Thanks, Yury
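A runnable sketch of what such an `await_call()` helper might look like, written with plain generators since the proposed `__await__` protocol does not exist yet; the helper name, the stand-in coroutine functions, and the minimal driver are all illustrative, not asyncio's API:

```python
# Hypothetical await_call(): call a coroutine function on behalf of
# old-style code and hand back something that can be 'yield from'-ed.
# With generator-based coroutines the coroutine object is its own iterator;
# under the proposal this would route through type(coro).__await__ instead.

def await_call(func, *args, **kwargs):
    return func(*args, **kwargs)

def foo():                # stands in for "async def foo(): return 'spam'"
    return 'spam'
    yield                 # unreachable; makes this a generator function

def bar():                # an old-style, generator-based coroutine
    what = yield from await_call(foo)
    return what

def run(coro):            # minimal driver standing in for the event loop
    try:
        coro.send(None)
    except StopIteration as e:
        return e.value

print(run(bar()))         # spam
```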

On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Andrew,
On 2015-04-22 2:32 PM, Andrew Svetlov wrote:
For now I can mix asyncio.coroutines and `async def` functions: I can write `await f()` inside an async def to call asyncio.coroutine `f`, and vice versa I can use `yield from g()` inside an asyncio.coroutine to call `async def g(): ...`.
That's another good point that I forgot to add to the list. Thanks for bringing this up.
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.
You'll have to use a wrapper that will do the following:
    async def foo():
        return 'spam'

    @asyncio.coroutine
    def bar():
        what = yield from foo.__await__(foo, *args, **kwargs)
        # OR:
        what = yield from await_call(foo, *args, **kwargs)
If I cannot directly use `yield from f()` with `async def f():` then almost every `yield from` inside asyncio library should be wrapped in `await_call()`. Every third-party asyncio-based library should do the same. Also I expect a performance degradation on `await_call()` calls.
Thanks, Yury
-- Thanks, Andrew Svetlov

On 2015-04-22 2:53 PM, Andrew Svetlov wrote:
On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote: [...]
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.
You'll have to use a wrapper that will do the following:
    async def foo():
        return 'spam'

    @asyncio.coroutine
    def bar():
        what = yield from foo.__await__(foo, *args, **kwargs)
        # OR:
        what = yield from await_call(foo, *args, **kwargs)
If I cannot directly use `yield from f()` with `async def f():` then almost every `yield from` inside asyncio library should be wrapped in `await_call()`. Every third-party asyncio-based library should do the same.
Also I expect a performance degradation on `await_call()` calls.
I think there is another way... instead of pushing

    GET_ITER ... YIELD_FROM

opcodes, we'll need to replace GET_ITER with another one:

    GET_ITER_SPECIAL ... YIELD_FROM

Where "GET_ITER_SPECIAL (obj)" (just a working name) would check whether the current code object has CO_COROUTINE and the object that you will yield from has it as well; if so, it would push the result of (obj.__await__()) to the stack.

Yury
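The check the hypothetical opcode would perform can be expressed in Python; the flag value and the fake code/coroutine objects below are illustrative only:

```python
# Python model of the check the hypothetical GET_ITER_SPECIAL opcode
# would perform before YIELD_FROM. The flag value is illustrative.

CO_COROUTINE = 0x0080

def get_iter_special(caller_code, obj):
    """What GET_ITER_SPECIAL would push on the stack."""
    gi_code = getattr(obj, 'gi_code', None)
    if (caller_code.co_flags & CO_COROUTINE
            and gi_code is not None
            and gi_code.co_flags & CO_COROUTINE):
        # Both the calling code object and the target are coroutines:
        # route through the __await__ protocol.
        return obj.__await__()
    # Plain 'yield from': fall back to normal iteration.
    return iter(obj)

class FakeCode:
    def __init__(self, flags):
        self.co_flags = flags

class FakeCoro:
    gi_code = FakeCode(CO_COROUTINE)
    def __await__(self):
        return iter(['awaited'])

coro_caller = FakeCode(CO_COROUTINE)
print(list(get_iter_special(coro_caller, FakeCoro())))  # ['awaited']
print(list(get_iter_special(coro_caller, [1, 2])))      # [1, 2]
```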

On Wed, Apr 22, 2015 at 10:24 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
On 2015-04-22 2:53 PM, Andrew Svetlov wrote:
On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
[...]
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.
You'll have to use a wrapper that will do the following:
    async def foo():
        return 'spam'

    @asyncio.coroutine
    def bar():
        what = yield from foo.__await__(foo, *args, **kwargs)
        # OR:
        what = yield from await_call(foo, *args, **kwargs)
If I cannot directly use `yield from f()` with `async def f():` then almost every `yield from` inside asyncio library should be wrapped in `await_call()`. Every third-party asyncio-based library should do the same.
Also I expect a performance degradation on `await_call()` calls.
I think there is another way... instead of pushing
GET_ITER ... YIELD_FROM
opcodes, we'll need to replace GET_ITER with another one:
GET_ITER_SPECIAL ... YIELD_FROM
Where "GET_ITER_SPECIAL (obj)" (just a working name) would check that if the current code object has CO_COROUTINE and the object that you will yield-from has it as well, it would push to the stack the result of (obj.__await__())
GET_ITER_SPECIAL sounds better than wrapper for `coro.__await__()` call.
Yury
-- Thanks, Andrew Svetlov

+1 to Andrew Svetlov's proposition: please help us migrate as smoothly as possible to async/await. -- Ludovic Gasc (GMLudo) http://www.gmludo.eu/ 2015-04-22 20:32 GMT+02:00 Andrew Svetlov <andrew.svetlov@gmail.com>:
For now I can mix asyncio.coroutines and `async def` functions: I can write `await f()` inside an async def to call asyncio.coroutine `f`, and vice versa I can use `yield from g()` inside an asyncio.coroutine to call `async def g(): ...`.
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.

On 23/04/2015 6:32 a.m., Andrew Svetlov wrote:
If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required.
As I suggested earlier, a way could be provided to mark a function as callable using either yield from f() or await f(). That would water down the error catching ability a bit, but it would allow interoperability with existing asyncio code. -- Greg
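One shape such a dual-callable marker could take is a wrapper object supporting both the legacy iteration protocol and a new `__await__` method. The decorator and class names below are invented for illustration, not part of any proposal:

```python
# Illustrative sketch: a decorator marking a coroutine as usable with
# both 'yield from f()' (legacy asyncio style) and 'await f()'.
# 'dual' and '_Dual' are made-up names.

def dual(func):
    def wrapper(*args, **kwargs):
        return _Dual(func(*args, **kwargs))
    return wrapper

class _Dual:
    def __init__(self, gen):
        self._gen = gen

    def __iter__(self):           # legacy path: 'yield from f()'
        return self._gen

    def __await__(self):          # new path: 'await f()'
        return self._gen

    def send(self, value):
        return self._gen.send(value)

    def throw(self, *exc_info):
        return self._gen.throw(*exc_info)

@dual
def f():
    return 42
    yield                         # unreachable; keeps f a generator function

def legacy_user():
    result = yield from f()       # old-style code keeps working
    return result

try:
    legacy_user().send(None)
except StopIteration as e:
    print(e.value)                # 42
```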

Hello, On Wed, 22 Apr 2015 09:53:39 -0700 Rajiv Kumar <rajiv.kumar@gmail.com> wrote:
I'd like to suggest another way around some of the issues here, with apologies if this has already been discussed sometime in the past.
From the viewpoint of a Python programmer, there are two distinct reasons for wanting to suspend execution in a block of code:
1. To yield a value from an iterator, as Python generators do today.
2. To cede control to the event loop while waiting for an asynchronous task to make progress in a coroutine.
As of today both of these reasons to suspend are supported by the same underlying mechanism, i.e. a "yield" at the end of the chain of "yield from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom of it all there's effectively still a yield as I understand it.
I think that the fact that these two concepts use the same mechanism is what leads to the issues with coroutine-generators that Greg and Yury have raised.
With that in mind, would it be possible to introduce a second form of suspension to Python to specifically handle the case of ceding to the event loop?
Barring adding an ad-hoc statement "yield_to_a_main_loop", there's a generic programming device to do it: symmetric coroutines. But it's unlikely to help with your sentiment that the same device is used for different purposes. At least with asymmetric coroutines as currently in Python, you have next() to "call" a coroutine and yield to "return" from it. With symmetric coroutines, you don't have a place to return to - you can only "call" another coroutine, and then you have the freedom to call any of them (including a main loop), but you need to always know whom you want to call.

But I guess that already sounds confusing enough for folks who didn't hear about symmetric coroutines, whereas the call/return paradigm is much more familiar and understandable. That's certainly why Python implements the asymmetric model. And having both asymmetric and symmetric would be quite confusing, especially since symmetric coroutines are more powerful, and asymmetric ones can be easily implemented in terms of symmetric ones using continuation-passing style. At the last occurrence of "easily", mere users of course start to run away, shouting that if they wanted to use Scheme, they'd have taken classes on it and used it long ago.

So, the real problem with the dichotomy you describe above is not technical, but rather educational/documentational. And the current approach asyncio takes is "you should not care if a coroutine yields to the main loop, or how it is done". Actually, the current approach is to forbid and deny you knowledge of how it is done, quoting Victor Stinner from another mail: "(A) asyncio coroutine in Python 3.4: use yield from, yield denied". So, just pretend that there's no yield, only yield from, problem solved.

But people know there's yield - they knew it for a long time before "yield from". And there are valid usages for yield in a coroutine, like implementing your own application-level generation. Currently, any generation ability is usurped by asyncio's main loop.
A much better approach IMHO is given in David Beazley's presentations on generators and coroutines, http://www.dabeaz.com/generators/ . He says that coroutines provided by a framework are essentially "system calls". And that's why you don't want to know how they work, and shouldn't care - because users usually don't care how the OS kernel implements system calls while sitting in the web browser. But if you want, you can, and you will discover that they're implemented by yield'ing objects of a special class.

That's why you're *advised* not to use yields in coroutines - because if you want to catch your own application-level yields, you may also get a system yield object at any time. You would need to expect such a possibility, filter such yields, and pass them up (re-yield). But there's no forbidden magic in all that, and understanding it helps a lot IMHO. -- Best regards, Paul mailto:pmiscml@gmail.com

Paul Sokolovsky wrote:
And having both asymmetric and symmetric would be quite confusing, especially since symmetric coroutines are more powerful and asymmetric ones can be easily implemented in terms of symmetric ones using continuation-passing style.
You can also use a trampoline of some kind to relay values back and forth between generators, to get something symmetric. -- Greg

Hello, On Thu, 23 Apr 2015 20:39:51 +1200 Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Paul Sokolovsky wrote:
And having both asymmetric and symmetric would be quite confusing, especially since symmetric coroutines are more powerful and asymmetric ones can be easily implemented in terms of symmetric ones using continuation-passing style.
You can also use a trampoline of some kind to relay values back and forth between generators, to get something symmetric.
Yes, that's of course how coroutine frameworks were done long before "yield from" appeared, and how Trollius works now. But this just proves the point given to the original subtopic starter - that Python already has machinery powerful enough to achieve the functionality needed for "yield to asyncio main loop", and adding something specifically for that will only make the situation more complicated (because what is the asyncio main loop? Just a random user-level function/coroutine; if you need to support yielding "directly" to it, you need to support yielding directly to an arbitrary coroutine).
-- Greg
-- Best regards, Paul mailto:pmiscml@gmail.com

Paul Sokolovsky wrote:
Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
You can also use a trampoline of some kind to relay values back and forth between generators, to get something symmetric.
Yes, that's of course how coroutine frameworks were done long before "yield from" appeared and how Trollius works now.
No, what I mean is that if you want to send stuff back and forth between two particular coroutines in a symmetric way, you can write a specialised scheduler that just handles those coroutines. If you want to do that at the same time that other things are going on, I think you're better off not trying to do it using yield. Use a general scheduler such as asyncio, and some traditional IPC mechanism such as a queue for communication. -- Greg
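The specialised two-coroutine scheduler described above can be tiny. The trampoline below (all names illustrative) bounces values between exactly two generators, giving them a symmetric conversation without any general event loop:

```python
# A minimal trampoline relaying values between exactly two generators,
# giving them a symmetric conversation without a general event loop.

log = []

def ping():
    for _ in range(2):
        msg = yield 'ping'            # the peer's reply comes back here
        log.append('ping got ' + msg)

def pong():
    while True:
        msg = yield 'pong'
        log.append('pong got ' + msg)

def trampoline(a, b):
    value = a.send(None)              # prime both coroutines
    b.send(None)
    current = b
    try:
        while True:
            value = current.send(value)   # relay the last value across
            current = a if current is b else b
    except StopIteration:             # one side finished: conversation over
        pass

trampoline(ping(), pong())
print(log)   # ['pong got ping', 'ping got pong', 'pong got ping', 'ping got pong']
```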

On 04/23/2015 04:18 AM, Yury Selivanov wrote:
2. We'll hack Gen(/ceval.c?) objects to raise an error if they are called directly and have a 'CO_COROUTINE' flag.
By "Gen", do you mean the generator-function or the generator-iterator? That flag has to be on the generator-function, not the generator-iterator, otherwise by the time ceval sees it, the call that should have been forbidden has already been made. To make this work without flagging the function, it would be necessary to check the result of every function call that wasn't immediately awaited and raise an exception if it were awaitable. But that would mean awaitable objects not being fully first-class citizens, since there would be some perfectly reasonable things that you can't do with them. I suspect it would make the kernel of a coroutine-scheduling system such as asyncio very awkward, perhaps impossible, to write in pure Python.
3. Task(), create_task() and async() will be modified to call 'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag.
Or, as I pointed out earlier, the caller can wrap the argument in something equivalent to costart().
4. 'await' will require parentheses grammatically. That will make it different from 'yield' expression. For instance, I still don't know what would 'await coro(123)()' mean.
In PEP 3152, cocall binds to the nearest set of function-calling parens, so 'cocall f()()' is parsed as '(cocall f())()'. If you want it the other way, you have to write it as 'cocall (f())()'. I know that's a somewhat arbitrary thing to remember, and it makes chained function calls a bit harder to write and read. But chaining calls like that is a fairly rare thing to do, in contrast with using a call expression as an argument to another call, which is very common.

That's not the only case, either. Just about any unparenthesised use of yield-from other than the sole contents of the RHS of an assignment seems to be disallowed. All of these are currently syntax errors, for example:

    yield from f(x) + yield from g(x)
    x + yield from g(x)
    [yield from f(x)]
5. 'await foo(*a, **k)' will be an equivalent to 'yield from type(coro).__await__(coro, *a, **k)'
Again, I'm not sure whether you're proposing to make the functions the await-able objects rather than the iterators (which would effectively be PEP 3152 with __cocall__ renamed to __await__) or something else. I won't comment further on this point until that's clearer.
6. If we ever decide to implement coroutine-generators -- async def functions with 'await' *and* some form of 'yield' -- we'll need to reverse the rule -- allow __call__ and disallow __await__ on such objects (so that you'll be able to write 'async for item in coro_gen()' instead of 'async for item in await coro_gen()'.
Maybe. I haven't thought that idea through properly yet. Possibly the answer is that you define such a function using an ordinary "def", to match the way it's called. The fact that it's an async generator is then indicated by the fact that it contains "async yield". -- Greg

Hi, Guido van Rossum <guido <at> python.org> writes:
I'm slowly warming up to Greg's notion that you can't call a coroutine (or whatever it's called) without a special keyword.
A huge part of the asyncio module is based on "yield from fut" where fut is a Future object. How do you write this using PEP 3152? Do you need to call an artificial method like "cocall fut.return_self()", where the return_self() method simply returns fut?

When I discovered Python for the first time, I fell into the trap of trying to call a function without parentheses: "hello_world" instead of "hello_world()". I was very surprised that the language didn't protect me against such an obvious bug. But later, I used this feature in almost all my applications: passing a callback is just a must-have feature, and Python's syntax for this is great! (just pass the function without parentheses)

Would it be possible that creating a coroutine object by calling a coroutine function is a feature, and not a bug? I mean that it may be used in some cases. I worked a lot with asyncio and I saw a lot of hacks to solve some corner-case issues, to be able to propose a nice API at the end.

@asyncio.coroutine currently calls a function and *then* checks if it should yield from it or not:

    res = func(*args, **kw)
    if isinstance(res, futures.Future) or inspect.isgenerator(res):
        res = yield from res

With PEP 3152, it's no longer possible to write such code. I fear that we miss cases where it would be needed.

maybeDeferred() is an important function in Twisted. As expected, users ask for a similar function in asyncio: http://stackoverflow.com/questions/20730248/maybedeferred-analog-with-asynci... Currently, it's possible to implement it using yield from.
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function (perhaps to be renamed to ensure_task() to make way for the async keyword).
Logging a warning when a coroutine object is not "consumed" ("yielded from"?) is only one of the asyncio.CoroWrapper features. It's now also used to remember where the coroutine object was created: that's very useful for rebuilding the chain of function calls/tasks/coroutines to show where a bug comes from. (I still have a project to enhance debugging to create the full stack of where a task was created. Currently, the stack stops at the last "Task._step()", but it's technically possible to go further (I have a PoC somewhere). I already introduced BaseEventLoop._current_handle as a first step.) Oh, and CoroWrapper also provides a better representation. But we might enhance repr(coroutine_object) directly in Python. Yury proposed to store the source (filename, line number) of the most recent frame where a coroutine object was created. But a single frame is not enough (usually, the interesting frame is at least the 3rd frame, not the most recent one). Storing more frames would kill performance in debug mode (and/or create reference cycles if we keep frame objects, not only filename+line number). For all these reasons, I'm in favor of keeping the ability to wrap coroutine objects. It has a negligible impact in release mode, and you can do whatever you want in debug mode, which is very convenient. Victor
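Victor's maybeDeferred-style dispatch is easy to sketch in today's asyncio. This is a hypothetical helper (the name maybe_async is invented, not part of asyncio), written with await rather than the yield-from form quoted above so it runs as-is:

```python
import asyncio
import inspect

# Hypothetical helper (name invented): call func, and if the result is
# awaitable (a coroutine object or a Future), wait for it; otherwise
# return the plain value as-is -- the same dispatch @asyncio.coroutine
# performs internally.
async def maybe_async(func, *args, **kw):
    res = func(*args, **kw)
    if inspect.isawaitable(res):
        res = await res
    return res

def plain():
    return 1

async def coro():
    return 2
```

With PEP 3152's cocall this dispatch is impossible to express, because a cofunction cannot be called without immediately suspending on it.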

Victor Stinner wrote:
A huge part of the asyncio module is based on "yield from fut" where fut is a Future object.
How do you write this using PEP 3152? Do you need to call an artificial method like "cocall fut.return_self()" where the return_self() method simply returns fut?
In a PEP 3152 world, Future objects and the like would be expected to implement __cocall__, just as in a PEP 492 world they would be expected to implement __await__.
@asyncio.coroutine currently calls a function and *then* checks whether it should yield from the result or not:
    res = func(*args, **kw)
    if isinstance(res, futures.Future) or inspect.isgenerator(res):
        res = yield from res
To accommodate the possibility of func being a cofunction, you would need to add something like:

    if is_cofunction(func):
        res = yield from costart(func, *args, **kw)
    else:
        ...  # as above

-- Greg

Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.:

    codef my_task_func(arg):
        ...

    my_task = Task(costart(my_task_func, arg))

If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed. -- Greg

On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.
codef my_task_func(arg): ...
my_task = Task(costart(my_task_func, arg))
If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed.
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg))), and also a step backwards in elegance (having to pass the args separately). OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def. So I'm still torn. :-) Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many places would require adding costart() as in the above example). -- --Guido van Rossum (python.org/~guido)

On 2015-04-22 8:35 PM, Guido van Rossum wrote:
On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.
codef my_task_func(arg): ...
my_task = Task(costart(my_task_func, arg))
If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed.
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg))), and also a step backwards in elegance (having to pass the args separately).
OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def.
So I'm still torn. :-)
Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many place would require adding costart() as in the above example).
Somewhere in this thread Victor Stinner wrote: """A huge part of the asyncio module is based on "yield from fut" where fut is a Future object.""" So how would we do "await fut" if await requires parentheses? I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since. Yury

On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
On 2015-04-22 8:35 PM, Guido van Rossum wrote:
On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.
codef my_task_func(arg): ...
my_task = Task(costart(my_task_func, arg))
If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed.
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg))), and also a step backwards in elegance (having to pass the args separately).
OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def.
So I'm still torn. :-)
Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many place would require adding costart() as in the above example).
Somewhere in this thread Victor Stinner wrote:
"""A huge part of the asyncio module is based on "yield from fut" where fut is a Future object."""
So how would we do "await fut" if await requires parentheses?
We could make Future a valid co-callable object.
I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since.
Maybe, but it *is* a part of everybody's learning curve. -- --Guido van Rossum (python.org/~guido)

On 2015-04-22 9:04 PM, Guido van Rossum wrote:
On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
On 2015-04-22 8:35 PM, Guido van Rossum wrote:
On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.
codef my_task_func(arg): ...
my_task = Task(costart(my_task_func, arg))
If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed.
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg))), and also a step backwards in elegance (having to pass the args separately).
OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def.
So I'm still torn. :-)
Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many place would require adding costart() as in the above example).
Somewhere in this thread Victor Stinner wrote:
"""A huge part of the asyncio module is based on "yield from fut" where fut is a Future object."""
So how would we do "await fut" if await requires parentheses?
We could make Future a valid co-callable object.
So you would have to write 'await fut()'? This is non-intuitive. To make Greg's proposal work it'd be a *requirement* for 'await' (enforced by the grammar!) to have '()' after it. Yury

Yury Selivanov wrote:
So you would have to write 'await fut()'? This is non-intuitive.
That's because PEP 492 and its terminology encourage you to think of 'await f()' as a two-step process: evaluate f(), and then wait for the thing it returns to produce a result. PEP 3152 has a different philosophy. There, 'cocall f()' is a one-step process: call f and get back a result (while being prepared to get suspended in the meantime).

The two-step approach has the advantage that you can get hold of the intermediate object and manipulate it. But I don't see much utility in being able to do that. Keep in mind that you can treat cofunctions themselves as objects to be manipulated, just like you can with ordinary functions, and all the usual techniques such as closures, * and ** parameters, etc. are available if you want to encapsulate one with some arguments.

About the only thing you gain from being able to pass generator-iterators around instead of the functions that produce them is that you get to write

    t = Task(func(args))

instead of

    t = Task(func, args)

which seems like a very minor thing to me. I would even argue that the latter is clearer, because it makes it very obvious that the body of func is *not* executed before the Task is constructed. The former makes it look as though the *result* of executing func with args is being passed to Task, rather than func itself. -- Greg

Yury Selivanov wrote:
So how would we do "await fut" if await requires parentheses?
I've answered this with respect to PEP 3152 -- futures would implement __cocall__, so you would write 'cocall fut()'. I'm not sure what to say about PEP 492 here, because it depends on exactly what a version of await that "requires parentheses" would mean. It's not clear to me what you have in mind for that from what you've said so far. I'm not really in favour of just tweaking the existing PEP 492 notion of await so that it superficially resembles a PEP 3152 cocall. That misses the point, which is that a cocall is a special kind of function call, not a special kind of yield-from. -- Greg

Yury Selivanov wrote:
I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since.
I think it's more likely to happen when you start with an ordinary function, then discover that it needs to be suspendable, so you need to track down all the places that call it, and all the places that call those, etc. PEP 3152 ensures that you get clear diagnostics if you miss any. -- Greg

Greg, how should waiting for multiple cocalls look and work? In asyncio, when I need to wait for two or more coroutines/futures, I use `asyncio.gather()`:

    yield from gather(coro1(a1, a2), coro2(), fut3)
From my understanding, to use cofunctions I must wrap them in a costart() call:
    yield from gather(costart(coro1, a1, a2), costart(coro2), fut3)

That looks weird. There are other places in the asyncio API that accept coroutines or futures as parameters, not only Task() and async(). On Thu, Apr 23, 2015 at 12:25 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Yury Selivanov wrote:
I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since.
I think it's more likely to happen when you start with an ordinary function, then discover that it needs to be suspendable, so you need to track down all the places that call it, and all the places that call those, etc. PEP 3152 ensures that you get clear diagnostics if you miss any.
-- Greg
_______________________________________________ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.co...
-- Thanks, Andrew Svetlov
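For reference, Andrew's gather() pattern in current asyncio can be run end to end. This sketch uses invented coroutine bodies and is written in async/await form so it executes on today's interpreters; the point is that gather() accepts coroutine objects and futures alike:

```python
import asyncio

async def coro1(a1, a2):
    return a1 + a2

async def coro2():
    return 'two'

async def main():
    fut3 = asyncio.get_running_loop().create_future()
    fut3.set_result(3)
    # gather() takes a mix of coroutine objects and futures,
    # returning their results in argument order
    return await asyncio.gather(coro1(1, 2), coro2(), fut3)
```

Under PEP 3152, each coroutine argument would instead need the costart() wrapping Andrew objects to.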

Andrew Svetlov wrote:
From my understanding, to use cofunctions I must wrap them in a costart() call:
yield from gather(costart(coro1, a1, a2), costart(coro2), fut3)
There are other places in the asyncio API that accept coroutines or futures as parameters, not only Task() and async().
In a PEP 3152 aware version of asyncio, they would all know about cofunctions and what to do with them. -- Greg

On Thu, Apr 23, 2015 at 3:10 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Andrew Svetlov wrote:
From my understanding, to use cofunctions I must wrap them in a costart() call:
yield from gather(costart(coro1, a1, a2), costart(coro2), fut3)
There are other places in the asyncio API that accept coroutines or futures as parameters, not only Task() and async().
In a PEP 3152 aware version of asyncio, they would all know about cofunctions and what to do with them.
But we already have asyncio and code based on asyncio coroutines. To make it work I would always have to use costart() in places where asyncio requires a coroutine. Maybe your proposal is better than current asyncio practice. But right now asyncio is built on top of a two-step process, as you have mentioned: building a coroutine and waiting for its result. That's why I prefer `await` as a replacement for the well-known `yield from`.
-- Greg
-- Thanks, Andrew Svetlov

Andrew Svetlov wrote:
But we already have asyncio and code based on asyncio coroutines. To make it work I would always have to use costart() in places where asyncio requires a coroutine.
As I understand it, asyncio would require changes to make it work seamlessly with PEP 492 as well, since an object needs to have either a special flag or an __await__ method before it can have 'await' applied to it. -- Greg

On Fri, Apr 24, 2015 at 3:14 AM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Andrew Svetlov wrote:
But we already have asyncio and code based on asyncio coroutines. To make it work I would always have to use costart() in places where asyncio requires a coroutine.
As I understand it, asyncio would require changes to make it work seamlessly with PEP 492 as well, since an object needs to have either a special flag or an __await__ method before it can have 'await' applied to it.
PEP 492 requires a change to asyncio.Future only. PEP 3152 requires a change in every asyncio-based library; this is the difference.
-- Greg
-- Thanks, Andrew Svetlov

On 2015-04-23 8:10 AM, Greg Ewing wrote:
Andrew Svetlov wrote:
From my understanding, to use cofunctions I must wrap them in a costart() call:
yield from gather(costart(coro1, a1, a2), costart(coro2), fut3)
There are other places in the asyncio API that accept coroutines or futures as parameters, not only Task() and async().
In a PEP 3152 aware version of asyncio, they would all know about cofunctions and what to do with them.
What do you mean by that? In a PEP 3152 aware version of asyncio, it's just *not possible to write*

    cocall gather(coro1(1, 2), coro(2, 3))

you just have to use your 'costart' built-in:

    cocall gather(costart(coro1, 1, 2), costart(coro, 2, 3))

That's all. That's the PEP 3152-aware world. Somehow you think that it's OK to write 'cocall fut()' instead of just awaiting fut. *But it's not*. A huge amount of existing code won't work. You won't be able to migrate it to the new syntax easily. If you have a future object 'fut', it's not intuitive or pythonic to write 'cocall fut()'. PEP 3152 was created in the pre-asyncio era, and it shows. It's just not gonna work. I know because I designed PEP 492 with a reference implementation at hand, tuning the proposal to make it backwards compatible and, on the other hand, to actually improve things. Your idea of syntactically forcing the use of 'cocall' with parens is cute, but it breaks so many things and habits that it just isn't worth it. Yury

Yury Selivanov wrote:
If you have a future object 'fut', it's not intuitive or pythonic to write 'cocall fut()'.
Another way to approach that would be to provide a cofunction await() used like this: cocall await(fut) That would read more naturally and wouldn't require modifying fut at all. -- Greg

Yury Selivanov wrote:
In a PEP 3152 aware version of asyncio, it's just *not possible to write*
cocall gather(coro1(1,2), coro(2,3))
you just have to use your 'costart' built-in:
cocall gather(costart(coro1, 1, 2), costart(coro, 2,3)).
Another way to write that would be:

    cocall gather(Task(coro1, 1, 2), Task(coro, 2, 3))

I think that actually reads quite nicely, and makes it very clear that parallel tasks are being spawned, rather than invoked sequentially. With the current way, that's not clear at all.

It's not quite as convenient, because you don't get currying for free the way you do with generators. But I feel that such implicit currying is detrimental to readability. It looks like you're passing the results returned by coro1 and coro2 to gather, rather than coro1 and coro2 themselves.

Yes, it will require some code to be changed, but if you're turning all your coroutines into cofunctions or async defs, you're changing quite a lot of things already.
PEP 3152 was created in pre-asyncio era, and it shows.
I would say that asyncio was created in a pre-PEP-3152 world. Or at least it was developed without allowing for the possibility of adopting something like PEP 3152 in the future. Asyncio was based on generators and yield-from because it was the best thing we had at the time. I'll be disappointed if we've raced so far ahead with those ideas that it's now impossible to replace them with anything better.

PEP 3152 is designed to present a *simpler* model of coroutine programming, by having only one concept, the suspendable function, instead of two -- generator functions on the one hand, and iterators/futures/awaitables/whatever you want to call them on the other. PEP 492 doesn't do that. It adds some things and changes some things, but it doesn't simplify anything.
Your idea of syntactically forcing the use of 'cocall' with parens is cute,
You say that as though "forcing" the use of parens were a goal in itself. It's not -- it's a *consequence* of what a cocall is. -- Greg

On Thu, Apr 23, 2015 at 3:35 AM, Guido van Rossum <guido@python.org> wrote:
On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function
That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.
codef my_task_func(arg): ...
my_task = Task(costart(my_task_func, arg))
If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed.
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg))), and also a step backwards in elegance (having to pass the args separately).
OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def.
So I'm still torn. :-)
Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many place would require adding costart() as in the above example).
I have not found a fresh patch for PEP 3152 to play with, but at least the aiohttp [1] library very often creates new tasks by an `async(coro(...))` call. The same goes for aiozmq, aioredis, sockjs (an aiohttp-based library for sock.js), aiokafka etc. The applications I created for my job also have `async(...)` calls or direct `Task(f(arg))` creations -- the numbers are between 3 and 10 usage lines per application. Not a big deal to fix them all, but it's a backward incompatibility.

In contrast, I've finished an experimental branch [2] of the aiomysql library (an asyncio driver for MySQL) with support for `async for` and `async with`. The main problems with the publicly released version are the impossibility of handling transactions (requires an async context manager) and of iterating with async fetching of data from a cursor (required for server-side cursors, for example). Now both problems are solved while keeping full backward compatibility. The library can be used with Python 3.3+, but obviously no new features are available on old Pythons. I use asyncio coroutines, not async functions, e.g.:

    class Cursor:
        # ...

        @asyncio.coroutine
        def __aiter__(self):
            return self

        @asyncio.coroutine
        def __anext__(self):
            ret = yield from self.fetchone()
            if ret is not None:
                return ret
            else:
                raise StopAsyncIteration

The whole aiomysql code is correct from a Python 3.3+ perspective. For testing the new features I use the new syntax in separate test files; the test runner will skip test modules with syntax errors on old Pythons but run those modules on Python from the PEP 492 branch. Usage example (table 'tbl' is pre-filled, DB engine is connected to the server):

    async def go(engine):
        async with engine.connect() as conn:
            async with (await conn.begin()) as tr:
                await conn.execute("DELETE FROM tbl WHERE (id % 2) = 0")
                async for row in conn.execute("SELECT * FROM tbl"):
                    print(row['id'], row['name'])

[1] https://github.com/KeepSafe/aiohttp
[2] https://github.com/aio-libs/aiomysql/tree/await
-- --Guido van Rossum (python.org/~guido)
-- Thanks, Andrew Svetlov

Yury Selivanov wrote:
On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me.
How is it any different from grammatically requiring parens in an ordinary function call? Nobody ever complained about that.

In the PEP 3152 way of thinking, a cocall is just a function call that happens to be suspendable. The fact that there is an iterator object involved behind the scenes is an implementation detail. You don't have to think about it or even know about it in order to write or understand suspendable code.

It's possible to think about "yield from f(x)" or "await f(x)" that way, but only by exploiting a kind of pun in the code, where you think of f(x) as doing all the work and the rest as a syntactic marker indicating that the call is suspendable. PEP 3152 removes the pun by making this the *actual* interpretation of "cocall f(x)". -- Greg

Greg, On 2015-04-22 7:47 PM, Greg Ewing wrote:
Yury Selivanov wrote:
On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me.
How is it any different from grammatically requiring parens in an ordinary function call? Nobody ever complained about that.
It is different.

1. Because the 'await' keyword might be at a great distance from the object you're really calling:

    await foo.bar.baz['spam']()
          +-----------------------+

Can I chain the calls: await foo()() ? or await foo().bar()?

2. Because there is no other keyword in Python with similar behaviour.

3. Moreover: unless I can write 'await future', your proposal *won't* work with a lot of existing code and patterns. It's going to be radically different from all other languages that implement 'await' too. Yury
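Yury's point 1 can be demonstrated: under PEP 492, 'await' applies to the value produced by the entire postfix expression to its right, however long the subscript/call chain. The class and registry names here are invented for the sketch:

```python
import asyncio

# Hypothetical objects mirroring the shape of foo.bar.baz['spam']():
# a registry mapping names to bound coroutine methods.
class Spam:
    async def ping(self):
        return 'pong'

async def main():
    registry = {'spam': Spam().ping}
    # 'await' waits on the result of the whole call expression,
    # no matter how far the keyword sits from the final ()
    return await registry['spam']()
```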

My suggestion is to raise an exception in the destructor of an async generator that was never unwound, even in non-debug mode. Debug mode may carry more complex info with source_traceback included, as Victor Stinner does for CoroWrapper. On Thu, Apr 23, 2015 at 4:27 AM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Greg,
On 2015-04-22 7:47 PM, Greg Ewing wrote:
Yury Selivanov wrote:
On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me.
How is it any different from grammatically requiring parens in an ordinary function call? Nobody ever complained about that.
It is different.
1. Because 'await' keyword might be at a great distance from the object you're really calling:
    await foo.bar.baz['spam']()
          +-----------------------+
Can I chain the calls:
await foo()() ?
or await foo().bar()?
2. Because there is no other keyword in Python with similar behaviour.
3. Moreover: unless I can write 'await future' - your proposal *won't* work with a lot of existing code and patterns. It's going to be radically different from all other languages that implement 'await' too.
Yury
-- Thanks, Andrew Svetlov

Greg Ewing <greg.ewing@canterbury.ac.nz> writes:
I still don't like the idea of hijacking the generic term "coroutine" and using it to mean this particular type of object.
There are only two hard things in Computer Science: cache invalidation and naming things. -- Phil Karlton :-)

When reviewing Yury's PEP, I read Wikipedia's article on coroutines because I didn't know whether a "coroutine" is something new in Python, nor whether it was well defined. https://en.wikipedia.org/wiki/Coroutine Answer: it's not new, it's implemented in many languages, and it's well defined. But coroutines are not always directly called "coroutines" in other programming languages. Using a custom name like "cofunction" may confuse users coming from other programming languages. I prefer to keep "coroutine", but I agree that we should make some effort to define the different categories of "Python coroutines". Well, there are two kinds of coroutines:

(A) asyncio coroutine in Python 3.4: uses yield from, yield denied, decorated with @asyncio.coroutine
(B) PEP 492 coroutine in Python 3.5: uses await, yield & yield from denied, function definition prefixed with "async"

Yury proposed "generator-based coroutine" for kind (A). Maybe not a great name, since we learn in PEP 492 that kind (B) is also (internally) based on generators. I don't think that we should use distinct names for the two kinds in common cases. But when we need to clearly use distinct names, I propose the following:

Kind (A):
- "yield-from coroutines" or "coroutines based on yield-from"
- maybe "asyncio coroutines"
- "legacy coroutines"?

Kind (B):
- "awaitable coroutines" or "coroutines based on await"
- "asynchronous coroutine", to recall the "async" keyword, even if it sounds wrong to repeat that a coroutine can be interrupted (it's almost the definition of a coroutine, no?)
- or just "asynchronous function" (coroutine function) & "asynchronous object" (coroutine object)

Victor
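The two kinds Victor distinguishes can be put side by side in a small runnable sketch. Since @asyncio.coroutine is the 3.4 spelling, the sketch marks the kind (A) generator with types.coroutine, which sets the flag that lets 'await' and the event loop accept it (close to what @asyncio.coroutine did internally); the names fetch_a and fetch_b are invented:

```python
import asyncio
import types

# Kind (A): generator-based coroutine -- a generator function marked so
# that native coroutines can await it and it can yield-from them.
@types.coroutine
def fetch_a():
    yield from asyncio.sleep(0)
    return 'a'

# Kind (B): PEP 492 native coroutine.
async def fetch_b():
    await asyncio.sleep(0)
    return 'b'

async def main():
    # both kinds are awaited with the same syntax
    return (await fetch_a(), await fetch_b())
```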

2015-04-22 22:46 GMT+02:00 Victor Stinner <victor.stinner@gmail.com>:
Kind (A):
- "yield-from coroutines" or "coroutines based on yield-from"
- maybe "asyncio coroutines"
- "legacy coroutines"?
The name "legacy coroutines" has the advantage of making it immediately clear that it isn't a good idea to write new source code with them.
Kind (B):
- "awaitable coroutines" or "coroutines based on await"
- "asynchronous coroutine", to recall the "async" keyword, even if it sounds wrong to repeat that a coroutine can be interrupted (it's almost the definition of a coroutine, no?)
- or just "asynchronous function" (coroutine function) & "asynchronous object" (coroutine object)
Personally, if I have a right to vote, "async coroutine" is just enough, even if it's a repetition. Or just "coroutine"? I'm not a fan of a name like "new-style coroutines". By the way, I hope you won't change how async code is written in Python a third time, because it will be harder to define a new name. Not related, but one of my coworkers asked me whether, with the new syntax, it will be possible to write an async decorator for coroutines. If I understand the new grammar in the PEP correctly, it seems the answer is yes, but could you confirm?

Ludovic, On 2015-04-22 5:00 PM, Ludovic Gasc wrote:
Not related, but one of my coworkers asked me whether, with the new syntax, it will be possible to write an async decorator for coroutines. If I understand the new grammar in the PEP correctly, it seems the answer is yes, but could you confirm?
There shouldn't be any problems with writing a decorator. Yury
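To make Yury's answer concrete: a decorator over an 'async def' can be an ordinary function returning a new coroutine function. This hedged sketch (the traced decorator and its calls counter are invented for illustration) shows one way it could look:

```python
import asyncio
import functools

# Hypothetical decorator (invented for illustration): counts each call
# before delegating to the wrapped coroutine function.
def traced(func):
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return await func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@traced
async def add(a, b):
    return a + b
```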

Ludovic Gasc wrote:
Not related, but one of my coworkers asked me if with the new syntax it will be possible to write an async decorator for coroutines.
This is certainly possible with PEP 3152. The decorator just needs to be an ordinary function whose return value is a cofunction. -- Greg

Victor Stinner wrote:
Using a custom name like "cofunction" may confuse users coming from other programming languages. I prefer to keep "coroutine", but I agree that we should make some effort to define the different categories of "Python coroutines".
I should perhaps point out that "cofunction" is not just an arbitrary word I made up to replace "coroutine". It is literally a kind of function, and is meant to be thought of that way. As for confusing new users, I would think that, as an unfamiliar word, it would point out that there is something they need to look up and learn about. Whereas they may think they already know what a "coroutine" is and not bother to look further. -- Greg

On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?

Not only would this simplify the syntax, it would also allow dropping the need for `async` to be a true keyword, since functions could be defined via "def async foo():" rather than "async def foo():"

...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously (as with "async with" and "async for"), rather than defining an asynchronous function. ISTM it should be "def async bar():" or even "def bar() async:".

Also, even that seems suspect to me: if `await` looks for an __await__ method and simply returns the same object (synchronously) if the object doesn't have one, then your code sample that supposedly will fail if a function ceases to be a coroutine *will not actually fail*.

In my experience working with coroutine systems, making a system polymorphic (do something appropriate with what's given) and idempotent (don't do anything if what's wanted is already done) makes it more robust. In particular, it eliminates the issue of mixing coroutines and non-coroutines.

To sum up: I can see the use case for a new `await` distinguished from `yield`, but I don't see the need to create new syntax for everything; ISTM that adding the new asynchronous protocols and using them on demand is sufficient. Marking a function asynchronous so it can use asynchronous iteration and context management seems reasonably useful, but I don't think it's terribly important for the type of the function's result.
Indeed, ISTM that the built-in `object` class could just implement `__await__` as a no-op returning self, and then *all* results are trivially asynchronous results and can be awaited idempotently, so that awaiting something that has already been waited for is a no-op. (Prior art: the Javascript Promise.resolve() method, which takes either a promise or a plain value and returns a promise, so that you can write code which is always-async in the presence of values that may already be known.)

Finally, if the async for and with operations have to be distinguished by syntax at the point of use (vs. just always being used in coroutines), then ISTM that they should be `with async foo:` and `for async x in bar:`, since the asynchronousness is just an aspect of how the main keyword is executed.

tl;dr: I like the overall ideas but hate the syntax and type segregation involved: declaring a function async at the top is OK to enable async with/for semantics and await expressions, but the rest seems unnecessary and bad for writing robust code. (e.g. note that requiring different syntax means a function must either duplicate code or restrict its input types more, and type changes in remote parts of the program will propagate syntax changes throughout.)

On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby <pje@telecommunity.com> wrote:
On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?
IIRC Guido has always liked having different syntax for calling regular functions and coroutines. That's why we need explicit syntax for asynchronous context managers and iterators.
Not only would this simplify the syntax, it would also allow dropping the need for `async` to be a true keyword, since functions could be defined via "def async foo():" rather than "async def foo():"
...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously (as with "async with" and "async for"), rather than defining an asynchronous function. ISTM it should be "def async bar():" or even "def bar() async:".
Also, even that seems suspect to me: if `await` looks for an __await__ method and simply returns the same object (synchronously) if the object doesn't have an await method, then your code sample that supposedly will fail if a function ceases to be a coroutine *will not actually fail*.
In my experience working with coroutine systems, making a system polymorphic (do something appropriate with what's given) and idempotent (don't do anything if what's wanted is already done) makes it more robust. In particular, it eliminates the issue of mixing coroutines and non-coroutines.
To sum up: I can see the use case for a new `await` distinguished from `yield`, but I don't see the need to create new syntax for everything; ISTM that adding the new asynchronous protocols and using them on demand is sufficient. Marking a function asynchronous so it can use asynchronous iteration and context management seems reasonably useful, but I don't think it's terribly important for the type of function result. Indeed, ISTM that the built-in `object` class could just implement `__await__` as a no-op returning self, and then *all* results are trivially asynchronous results and can be awaited idempotently, so that awaiting something that has already been waited for is a no-op. (Prior art: the Javascript Promise.resolve() method, which takes either a promise or a plain value and returns a promise, so that you can write code which is always-async in the presence of values that may already be known.)
Finally, if the async for and with operations have to be distinguished by syntax at the point of use (vs. just always being used in coroutines), then ISTM that they should be `with async foo:` and `for async x in bar:`, since the asynchronousness is just an aspect of how the main keyword is executed.
tl;dr: I like the overall ideas but hate the syntax and type segregation involved: declaring a function async at the top is OK to enable async with/for semantics and await expressions, but the rest seems unnecessary and bad for writing robust code. (e.g. note that requiring different syntax means a function must either duplicate code or restrict its input types more, and type changes in remote parts of the program will propagate syntax changes throughout.)
-- Thanks, Andrew Svetlov

On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby <pje@telecommunity.com> wrote:
On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?
IIRC Guido has always liked having different syntax for calling regular functions and coroutines. That's why we need explicit syntax for asynchronous context managers and iterators.
To clarify: the philosophy behind asyncio coroutines is that you should be able to tell statically where a task may be suspended simply by looking for `yield from`. This means that *no* implicit suspend points may exist, and it rules out gevent, stackless and similar microthreading frameworks. In the new PEP this would become `await`, plus specific points dictated by `async for` and `async with` -- `async for` can suspend (block) at each iteration step, and `async with` can suspend at the enter and exit points. The use case for both is database drivers: `async for` may block for the next record to become available from the query, and `async with` may block in the implied `finally` clause in order to wait for a commit. (Both may also suspend at the top, but that's less important.) -- --Guido van Rossum (python.org/~guido)

Hello, On Wed, 22 Apr 2015 13:31:18 -0700 Guido van Rossum <guido@python.org> wrote:
On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby <pje@telecommunity.com> wrote:
On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?
IIRC Guido has always liked having different syntax for calling regular functions and coroutines. That's why we need explicit syntax for asynchronous context managers and iterators.
To clarify: the philosophy behind asyncio coroutines is that you should be able to tell statically where a task may be suspended simply by looking for `yield from`. This means that *no* implicit suspend points may exist, and it rules out gevent, stackless and similar microthreading frameworks.
I always wanted to ask - does that mean that Python could have symmetric coroutines (in the sense that it would be a Pythonic feature), as long as the call syntax is different from a function call? E.g.:

sym def coro1(val):
    while True:
        val = coro2.corocall(val)

sym def coro2(val):
    while True:
        val = coro1.corocall(val)

coro1.call(1)

-- Best regards, Paul mailto:pmiscml@gmail.com

On 04/22, PJ Eby wrote:
tl;dr: I like the overall ideas but hate the syntax and type segregation involved: declaring a function async at the top is OK to enable async with/for semantics and await expressions, but the rest seems unnecessary and bad for writing robust code. (e.g. note that requiring different syntax means a function must either duplicate code or restrict its input types more, and type changes in remote parts of the program will propagate syntax changes throughout.)
Agreed. -- ~Ethan~

Hi PJ, On 2015-04-22 3:44 PM, PJ Eby wrote:
On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine. I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?
One of the things that we try to avoid is having implicit places where code execution might be suspended. For that we use 'yield from' right now, and want to use 'await' with PEP 492. For implicit context switches there are Stackless Python and greenlets; however, it's harder to reason about code written that way. Having an explicit 'yield from'/'await' is the selling point of asyncio and other frameworks that use generator-based coroutines. Hence, we want to stress that 'async with' and 'async for' do suspend the execution in their protocols.

I don't want to lose control over what kind of iteration or context manager I'm using. I don't want to iterate through a cursor that doesn't do prefetching; I want to make sure that it does. This problem is solved by the PEP.
Not only would this simplify the syntax, it would also allow dropping the need for `async` to be a true keyword, since functions could be defined via "def async foo():" rather than "async def foo():"
...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously (as with "async with" and "async for"), rather than defining an asynchronous function. ISTM it should be "def async bar():" or even "def bar() async:".
If we keep 'async with', then we'll have to keep 'async def' to make it symmetric and easier to remember. But, in theory, I'd be OK with 'def async'. 'def name() async' is something that will be extremely hard to notice in the code.
Also, even that seems suspect to me: if `await` looks for an __await__ method and simply returns the same object (synchronously) if the object doesn't have an await method, then your code sample that supposedly will fail if a function ceases to be a coroutine *will not actually fail*.
It doesn't just do that. In the reference implementation, a single 'await o' compiles to:

(o)  # await arg on top of the stack
GET_AWAITABLE
LOAD_CONST None
YIELD_FROM

Where GET_AWAITABLE does the following:

- If it's a coroutine-object -- return it
- If it's an object with __await__, return iter(object.__await__())
- Raise a TypeError if the two above steps don't apply

If you had code like this:

await coro()

where coro is

async def coro():
    pass

you then can certainly refactor coro to:

def coro():
    return future  # or some awaitable, please refer to PEP 492

And it won't break anything. So I'm not sure I understand your remark about "*will not actually fail*".
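For illustration, the GET_AWAITABLE step described above can be sketched in pure Python (the function name is hypothetical; the real implementation is a C-level bytecode operation, and this sketch omits the iterator check it performs on the result of __await__):

```python
import inspect

def get_awaitable(obj):
    """Pure-Python sketch of the GET_AWAITABLE step."""
    # A coroutine object is already awaitable: return it unchanged.
    if inspect.iscoroutine(obj):
        return obj
    # Otherwise look up the __await__ protocol method on the type.
    meth = getattr(type(obj), '__await__', None)
    if meth is not None:
        return meth(obj)  # expected to return an iterator
    raise TypeError("object %s can't be used in 'await' expression"
                    % type(obj).__name__)
```

A plain value such as an integer is rejected, while a coroutine object passes through untouched.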
In my experience working with coroutine systems, making a system polymorphic (do something appropriate with what's given) and idempotent (don't do anything if what's wanted is already done) makes it more robust. In particular, it eliminates the issue of mixing coroutines and non-coroutines.
Unfortunately, to completely eliminate the issue of reusing existing "non-coroutine" code, or of writing "coroutine" code that can be used with "non-coroutine" code, you have to use gevent-kind of libraries.
To sum up: I can see the use case for a new `await` distinguished from `yield`, but I don't see the need to create new syntax for everything; ISTM that adding the new asynchronous protocols and using them on demand is sufficient. Marking a function asynchronous so it can use asynchronous iteration and context management seems reasonably useful, but I don't think it's terribly important for the type of function result. Indeed, ISTM that the built-in `object` class could just implement `__await__` as a no-op returning self, and then *all* results are trivially asynchronous results and can be awaited idempotently, so that awaiting something that has already been waited for is a no-op.
I see all objects implementing __await__ returning "self" as a very error-prone approach. It's totally OK to write code like this:

async def coro():
    return fut

future = await coro()

In the above example, if coro ceases to be a coroutine, 'future' will be the result of 'fut', not 'fut' itself.
(Prior art: the Javascript Promise.resolve() method, which takes either a promise or a plain value and returns a promise, so that you can write code which is always-async in the presence of values that may already be known.)
Finally, if the async for and with operations have to be distinguished by syntax at the point of use (vs. just always being used in coroutines), then ISTM that they should be `with async foo:` and `for async x in bar:`, since the asynchronousness is just an aspect of how the main keyword is executed.
tl;dr: I like the overall ideas but hate the syntax and type segregation involved: declaring a function async at the top is OK to enable async with/for semantics and await expressions, but the rest seems unnecessary and bad for writing robust code. (e.g. note that requiring different syntax means a function must either duplicate code or restrict its input types more, and type changes in remote parts of the program will propagate syntax changes throughout.)
Thanks, Yury

On 2015-04-23 3:03 AM, Greg Ewing wrote:
Yury Selivanov wrote:
- If it's an object with __await__, return iter(object.__await__())
Is the iter() really needed? Couldn't the contract of __await__ be that it always returns an iterator?
I wrote it the wrong way. iter() isn't needed, you're right. This is a quote from the ref implementation:

if (!PyIter_Check(await_obj)) {
    PyErr_Format(PyExc_TypeError,
                 "__await__ must return an iterator, %.100s",
                 Py_TYPE(await_obj)->tp_name);

Yury

PJ Eby wrote:
I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference?
It depends on whether you think it's important to have a syntactic marker for points where the code can potentially be suspended. In my original vision for PEP 3152, there was no "cocall" syntax -- you just wrote an ordinary call, and whether to make a cocall or not was determined at run time. But Guido and others felt that it would be better for suspension points to be explicit, so I ended up with cocall. The same reasoning presumably applies to asynchronous 'for' and 'with'. If you think that it's important to make suspendable calls explicit, you probably want to mark them as well.
...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously
That bothers me a bit, too, but my main problem with it is the way it displaces the function name. "def f() async:" would solve both of those problems. -- Greg

Hi, most of the time I am a silent reader, but in this discussion I must step in. I use twisted and async stuff a lot, have for years, and followed the development of asyncio closely.

First, it is good to differentiate async coroutines from generators. So everyone can see it, keep it in mind, and not mix up both. It is also easier to explain for new users. Sometimes generators blow their mind and it takes a while to get used to them. Async stuff is even harder.

1. I am fine with using something special instead of "yield" or "yield from" for this. C#'s "await" is ok. Everything else suggested complicates the language and makes it harder to read.

2. "async def f():" is harder to read and something special; also it breaks the symmetry in front (def indent). Also every existing tool must be changed to support it. Same for "def async" and "def f() async:". I think a decorator is enough here:

@coroutine
def f():

is the best solution to mark something as a coroutine.

3. async with and async for: Bad idea. We clutter the language even more, and it is one more thing every newbie could get wrong.

for x in y:
    result = await f()

is enough; every 'async' framework has lived without it for years. Same for the with statement. The main use case suggested was for database stuff, and that is also where most are best served by deferring the work to a thread and keeping it non-async.

All together it is very late in the development cycle for 3.5 to incorporate such a big change. Best is to give all this some more time, defer it to 3.6, and use some alpha releases to experiment with it. Regards, Wolfgang

On Tue, Apr 21, 2015 at 7:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi python-dev,
I'm moving the discussion from python-ideas to here.
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.
Updates:
1. CO_ASYNC flag was renamed to CO_COROUTINE;
2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper();
3. New function: sys.get_coroutine_wrapper();
4. types.async_def() renamed to types.coroutine();
5. New section highlighting differences from PEP 3152.
6. New AST node - AsyncFunctionDef; the proposal now is 100% backwards compatible;
7. A new section clarifying that coroutine-generators are not part of the current proposal;
8. Various small edits/typos fixes.
There is a bug tracker issue to track code review of the reference implementation (Victor Stinner is doing the review): http://bugs.python.org/issue24017 While the PEP isn't accepted yet, we want to make sure that the reference implementation is ready for when such a decision is made.
Let's discuss some open questions:
1. Victor raised the question of whether we should move the coroutine() function from the 'types' module to 'functools'.
My opinion is that the 'types' module is a better place for 'coroutine()', since it adjusts the type of the passed generator. 'functools' is about 'partial', 'lru_cache' and 'wraps' kinds of things.
2. I propose to disallow the use of 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc. on coroutines.
It's possible by modifying PyObject_GetIter to raise an exception if it receives a coroutine-object.
'yield from' can also be modified to only accept coroutine objects if it is called from a generator with CO_COROUTINE flag.
This will further separate coroutines from generators, making it harder to screw something up by accident.
I have a branch of reference implementation https://github.com/1st1/cpython/tree/await_noiter where this is implemented. I did not observe any performance drop.
There is just one possible backwards compatibility issue here: there will be an exception if some user of asyncio actually used to iterate over generators decorated with @coroutine. But I can't imagine why someone would do that, and even if they did -- it's probably a bug or wrong usage of asyncio.
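For reference, this is the behavior one can observe on an interpreter implementing the proposal: passing a coroutine object to iter() raises TypeError, so list(), tuple(), and 'for..in' all fail on it as well:

```python
async def coro():
    pass

c = coro()
try:
    iter(c)  # PyObject_GetIter rejects coroutine objects
    iterable = True
except TypeError:
    iterable = False
finally:
    c.close()  # suppress the "coroutine was never awaited" warning

assert not iterable
```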
That's it! I'd be happy to hear some feedback!
Thanks, Yury
PEP: 492 Title: Coroutines with async and await syntax Version: $Revision$ Last-Modified: $Date$ Author: Yury Selivanov <yselivanov@sprymix.com> Status: Draft Type: Standards Track Content-Type: text/x-rst Created: 09-Apr-2015 Python-Version: 3.5 Post-History: 17-Apr-2015, 21-Apr-2015
Abstract ========
This PEP introduces new syntax for coroutines, asynchronous ``with`` statements and ``for`` loops. The main motivation behind this proposal is to streamline writing and maintaining asynchronous code, as well as to simplify previously hard to implement code patterns.
Rationale and Goals ===================
Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the ``yield from`` syntax introduced in PEP 380. This approach has a number of shortcomings:
* it is easy to confuse coroutines with regular generators, since they share the same syntax; async libraries often attempt to alleviate this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
* it is not possible to natively define a coroutine which has no ``yield`` or ``yield from`` statements, again requiring the use of decorators to fix potential refactoring issues;
* support for asynchronous calls is limited to expressions where ``yield`` is allowed syntactically, limiting the usefulness of syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new ``async with`` statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new ``async for`` statement makes it possible to perform asynchronous calls in iterators.
Specification =============
This proposal introduces new syntax and semantics to enhance coroutine support in Python; it does not change the internal implementation of coroutines, which are still based on generators.
It is strongly suggested that the reader understands how coroutines are implemented in Python (PEP 342 and PEP 380). It is also recommended to read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions).
From this point in this document we use the word *coroutine* to refer to functions declared using the new syntax. *generator-based coroutine* is used where necessary to refer to coroutines that are based on generator syntax.
New Coroutine Declaration Syntax --------------------------------
The following new syntax is used to declare a coroutine::
    async def read_data(db):
        pass
Key properties of coroutines:
* ``async def`` functions are always coroutines, even if they do not contain ``await`` expressions.
* It is a ``SyntaxError`` to have ``yield`` or ``yield from`` expressions in an ``async`` function.
* Internally, a new code object flag - ``CO_COROUTINE`` - is introduced to enable runtime detection of coroutines (and migrating existing code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` flags set.
* Regular generators, when called, return a *generator object*; similarly, coroutines return a *coroutine object*.
* ``StopIteration`` exceptions are not propagated out of coroutines, and are replaced with a ``RuntimeError``. For regular generators such behavior requires a future import (see PEP 479).
types.coroutine() -----------------
A new function ``coroutine(gen)`` is added to the ``types`` module. It applies the ``CO_COROUTINE`` flag to the passed generator-function's code object, making it return a *coroutine object* when called.
This feature enables an easy upgrade path for existing libraries.
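On interpreters that ship this function, the upgrade path can be sketched as follows; the coroutine is driven by hand here, without an event loop, so the example is self-contained:

```python
import types

@types.coroutine
def old_style(result):
    # A generator-based coroutine: the 'yield' is its suspension point.
    yield
    return result

async def new_style():
    # Thanks to types.coroutine(), old_style() returns an object
    # that is a valid target for 'await'.
    return await old_style('spam')

# Drive the coroutine manually.
c = new_style()
c.send(None)               # advance to the 'yield' inside old_style
try:
    c.send(None)           # resume; the coroutine finishes
except StopIteration as e:
    value = e.value

assert value == 'spam'
```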
Await Expression ----------------
The following new ``await`` expression is used to obtain a result of coroutine execution::
    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...
``await``, similarly to ``yield from``, suspends the execution of the ``read_data`` coroutine until the ``db.fetch`` *awaitable* completes and returns the result data.
It uses the ``yield from`` implementation with an extra step of validating its argument. ``await`` only accepts an *awaitable*, which can be one of:
* A *coroutine object* returned from a *coroutine* or a generator decorated with ``types.coroutine()``.
* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a fundamental mechanism of how *Futures* are implemented. Since, internally, coroutines are a special kind of generators, every ``await`` is suspended by a ``yield`` somewhere down the chain of ``await`` calls (please refer to PEP 3156 for a detailed explanation.)
To enable this behavior for coroutines, a new magic method called ``__await__`` is added. In asyncio, for instance, to enable Future objects in ``await`` statements, the only change is to add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
Objects with ``__await__`` method are called *Future-like* objects in the rest of this PEP.
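A minimal Future-like object (a hypothetical example, not part of the proposal) can be written by making ``__await__`` a generator function, so that calling it returns an iterator; the value it returns becomes the result of the ``await`` expression:

```python
class Ready:
    """A trivially completed Future-like object."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        # A generator function: calling it returns an iterator, which
        # satisfies the Future-like protocol. It never suspends; the
        # return value becomes the result of 'await'.
        return self.value
        yield  # unreachable, but makes __await__ a generator

async def use_it():
    return await Ready(42)

# Drive the coroutine by hand: it completes on the first send().
c = use_it()
try:
    c.send(None)
except StopIteration as e:
    result = e.value

assert result == 42
```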
Also, please note that ``__aiter__`` method (see its definition below) cannot be used for this purpose. It is a different protocol, and would be like using ``__iter__`` instead of ``__call__`` for regular callables.
It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
Asynchronous Context Managers and "async with" ----------------------------------------------
An *asynchronous context manager* is a context manager that is able to suspend execution in its *enter* and *exit* methods.
To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: ``__aenter__`` and ``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')
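A self-contained variant of the example above (class name hypothetical) can be driven without an event loop, since neither method actually suspends; it records the order in which the protocol methods run:

```python
class Tracker:
    """Asynchronous context manager that records its protocol calls."""
    def __init__(self):
        self.events = []

    async def __aenter__(self):
        self.events.append('enter')
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.events.append('exit')
        return False  # do not suppress exceptions

async def demo():
    async with Tracker() as t:
        t.events.append('body')
    return t.events

# Nothing suspends, so a single send() runs the coroutine to completion.
c = demo()
try:
    c.send(None)
except StopIteration as e:
    events = e.value

assert events == ['enter', 'body', 'exit']
```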
New Syntax ''''''''''
A new statement for asynchronous context managers is proposed::
async with EXPR as VAR: BLOCK
which is semantically equivalent to::
    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise
    finally:
        if exc:
            await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify multiple context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
Example '''''''
With asynchronous context managers it is easy to implement proper database transaction managers for coroutines::
    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...
Code that needs locking also looks lighter::
    async with lock:
        ...
instead of::
    with (yield from lock):
        ...
Asynchronous Iterators and "async for" --------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its *iter* implementation, and an *asynchronous iterator* can call asynchronous code in its *next* method. To support asynchronous iteration:
1. An object must implement an ``__aiter__`` method returning an *awaitable* resulting in an *asynchronous iterator object*.
2. An *asynchronous iterator object* must implement an ``__anext__`` method returning an *awaitable*.
3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` exception.
An example of asynchronous iterable::
    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...
New Syntax ''''''''''
A new statement for iterating through asynchronous iterators is proposed::
    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2
which is semantically equivalent to::
    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2
It is an error to pass a regular iterable without ``__aiter__`` method to ``async for``. It is a ``SyntaxError`` to use ``async for`` outside of a coroutine.
As with the regular ``for`` statement, ``async for`` has an optional ``else`` clause.
Example 1 '''''''''
With asynchronous iteration protocol it is possible to asynchronously buffer data during iteration::
    async for data in cursor:
        ...
Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows of data from a database after every ``N`` iterations.
The following code illustrates new asynchronous iteration protocol::
    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
            if not self.buffer:
                raise StopAsyncIteration
            return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
    async for row in Cursor():
        print(row)
which would be equivalent to the following code::
    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)
Example 2 '''''''''
The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators.
::
    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for item in AsyncIteratorWrapper("abc"):
        print(item)
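A version of this wrapper runnable on released CPython differs slightly from the draft above: as ultimately shipped, ``__aiter__`` returns the asynchronous iterator directly rather than an awaitable. Since nothing here actually suspends, the coroutine can be driven by hand:

```python
class AsyncIteratorWrapper:
    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):
        # As released, __aiter__ returns the iterator directly
        # (the awaitable-returning form from the draft was dropped).
        return self

    async def __anext__(self):
        try:
            value = next(self._it)
        except StopIteration:
            raise StopAsyncIteration
        return value

async def collect():
    result = []
    async for item in AsyncIteratorWrapper("abc"):
        result.append(item)
    return result

# No suspension points, so a single send() runs to completion.
c = collect()
try:
    c.send(None)
except StopIteration as e:
    items = e.value

assert items == ['a', 'b', 'c']
```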
Why StopAsyncIteration? '''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between
::
    def g1():
        yield from fut
        return 'spam'
and
::
    def g2():
        yield from fut
        raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its ``StopIteration`` wrapped into a ``RuntimeError``
::
    async def a1():
        await fut
        raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is to raise something other than ``StopIteration``. Therefore, a new built-in exception class ``StopAsyncIteration`` was added.
Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions raised in coroutines are wrapped in ``RuntimeError``.
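The wrapping described above is easy to observe by driving a coroutine by hand on an interpreter with these semantics:

```python
async def a1():
    raise StopIteration('spam')

c = a1()
try:
    c.send(None)
    outcome = 'returned'
except RuntimeError:
    # PEP 479 semantics: a StopIteration leaking out of a coroutine
    # is replaced with a RuntimeError.
    outcome = 'runtime-error'
except StopIteration:
    outcome = 'stop-iteration'

assert outcome == 'runtime-error'
```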
Debugging Features ------------------
One of the most frequent mistakes that people make when using generators as coroutines is forgetting to use ``yield from``::
    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'
For debugging these kinds of mistakes there is a special debug mode in asyncio, in which the ``@coroutine`` decorator wraps all functions with a special object with a destructor logging a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, the stack trace of where it was collected, etc. The wrapper object also provides a convenient ``__repr__`` function with detailed information about the generator.
The only problem is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, ``@coroutine`` decorator makes the decision of whether to wrap or not to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is possible to run asyncio programs with asyncio's own functions instrumented. ``EventLoop.set_debug``, a different debug facility, has no impact on ``@coroutine`` decorator's behavior.
With this proposal, coroutines become a native concept, distinct from generators. New methods ``set_coroutine_wrapper`` and ``get_coroutine_wrapper`` are added to the ``sys`` module, with which frameworks can provide advanced debugging facilities.
It is also important to make coroutines as fast and efficient as possible, therefore there are no debug features enabled by default.
Example::
    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.AsyncDebugWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me()  # <- this line will likely GC the coroutine object and
                # trigger AsyncDebugWrapper's code.

    assert isinstance(debug_me(), AsyncDebugWrapper)

    sys.set_coroutine_wrapper(None)  # <- this unsets any
                                     # previously set wrapper

    assert not isinstance(debug_me(), AsyncDebugWrapper)
If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)`` unsets the wrapper.
Glossary
========
:Coroutine: A coroutine function, or just "coroutine", is declared with ``async def``. It uses ``await`` and ``return value``; see `New Coroutine Declaration Syntax`_ for details.
:Coroutine object: Returned from a coroutine function. See `Await Expression`_ for details.
:Future-like object: An object with an ``__await__`` method. Can be consumed by an ``await`` expression in a coroutine. A coroutine waiting for a Future-like object is suspended until the Future-like object's ``__await__`` completes, and returns the result. See `Await Expression`_ for details.
:Awaitable: A *Future-like* object or a *coroutine object*. See `Await Expression`_ for details.
:Generator-based coroutine: Coroutines based on generator syntax. The most common example is a function decorated with ``@asyncio.coroutine``.
:Asynchronous context manager: An asynchronous context manager has ``__aenter__`` and ``__aexit__`` methods and can be used with ``async with``. See `Asynchronous Context Managers and "async with"`_ for details.
:Asynchronous iterable: An object with an ``__aiter__`` method, which must return an *asynchronous iterator* object. Can be used with ``async for``. See `Asynchronous Iterators and "async for"`_ for details.
:Asynchronous iterator: An asynchronous iterator has an ``__anext__`` method. See `Asynchronous Iterators and "async for"`_ for details.
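To make the last three glossary terms concrete, here is a minimal runnable sketch of an asynchronous iterable consumed by ``async for`` (the ``Countdown`` class is illustrative; note that in the Python that eventually shipped, ``__aiter__`` is a plain method returning the iterator):

```python
import asyncio

class Countdown:
    """Illustrative asynchronous iterable: counts n down to 1."""
    def __init__(self, n):
        self.n = n

    def __aiter__(self):
        # The asynchronous iterable returns an asynchronous iterator;
        # here the object is its own iterator.
        return self

    async def __anext__(self):
        if self.n <= 0:
            # Signals the end of the 'async for' loop.
            raise StopAsyncIteration
        await asyncio.sleep(0)   # asynchronous work may happen here
        self.n -= 1
        return self.n + 1

async def collect():
    result = []
    async for item in Countdown(3):
        result.append(item)
    return result

print(asyncio.run(collect()))   # [3, 2, 1]
```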
List of functions and methods
=============================
================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================
Where:
* "async def func": coroutine;
* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined with the ``async`` keyword;
* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined without the ``async`` keyword, must return an *awaitable*;
* "def __await__": ``__await__`` method to implement *Future-like* objects;
* generator: a "regular" generator, a function defined with ``def`` that contains at least one ``yield`` or ``yield from`` expression.
Transition Plan
===============
To avoid backwards compatibility issues with the ``async`` and ``await`` keywords, it was decided to modify ``tokenizer.c`` in such a way that it:
* recognizes ``async def`` name tokens combination (start of a coroutine);
* keeps track of regular functions and coroutines;
* replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with ``AWAIT`` when in the process of yielding tokens for coroutines.
This approach allows for seamless combination of new syntax features (all of them available only in ``async`` functions) with any existing code.
An example of having ``async def`` and an ``async`` attribute in one piece of code::
    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))
# The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
This proposal preserves 100% backwards compatibility.
Grammar Updates
---------------
Grammar changes are also fairly minimal::
    await_expr: AWAIT test
    await_stmt: await_expr

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt |
                    with_stmt | funcdef | classdef | decorated |
                    async_stmt)

    atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
           '[' [testlist_comp] ']' |
           '{' [dictorsetmaker] '}' |
           NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')

    expr_stmt: testlist_star_expr
               (augassign (yield_expr|await_expr|testlist) |
                ('=' (yield_expr|await_expr|testlist_star_expr))*)
Transition Period Shortcomings
------------------------------
There is just one.
Until ``async`` and ``await`` become proper keywords, it is not possible (or at least very hard) to fix ``tokenizer.c`` to recognize them on the **same line** as the ``def`` keyword::
    # async and await will always be parsed as variables

    async def outer():                           # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)         # 2
Since ``await`` and ``async`` in such cases are parsed as ``NAME`` tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can be easily rewritten to a more readable form::
    async def outer():                           # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                             # 2
        return (await fut)
This limitation will go away as soon as ``async`` and ``await`` are proper keywords, or if it is decided to use a future import for this PEP.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5 and 3.6. In 3.7 we will transform them to proper keywords. Making ``async`` and ``await`` proper keywords before 3.7 might make it harder for people to port their code to Python 3.
asyncio
-------
The ``asyncio`` module was adapted and tested to work with coroutines and the new statements. Backwards compatibility is 100% preserved.
The required changes are mainly:
1. Modify ``@asyncio.coroutine`` decorator to use new ``types.coroutine()`` function.
2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
3. Add ``ensure_task()`` as an alias for ``async()`` function. Deprecate ``async()`` function.
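The second change can be observed in current asyncio: a ``Future`` is awaitable precisely because it implements ``__await__``. A minimal sketch (``asyncio.run`` and ``get_running_loop`` are from later Python versions, used here only as a driver):

```python
import asyncio

async def main():
    fut = asyncio.get_running_loop().create_future()
    fut.set_result('spam')
    # Awaiting works because Future implements __await__:
    return await fut

print(asyncio.run(main()))   # spam
```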
Design Considerations
=====================
PEP 3152
--------
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines (called "cofunctions"). Some key points:
1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is always a generator, even if there are no ``cocall`` expressions inside it. Maps to ``async def`` in this proposal.
2. A new keyword ``cocall`` to call a *cofunction*. Can only be used inside a *cofunction*. Maps to ``await`` in this proposal (with some differences, see below.)
3. It is not possible to call a *cofunction* without a ``cocall`` keyword.
4. ``cocall`` grammatically requires parentheses after it::
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
5. ``cocall f(*args, **kwds)`` is semantically equivalent to ``yield from f.__cocall__(*args, **kwds)``.
Differences from this proposal:
1. There is no equivalent of ``__cocall__`` in this PEP. In PEP 3152, ``__cocall__`` is called and its result is passed to ``yield from`` in the ``cocall`` expression. In this proposal, the ``await`` keyword expects an *awaitable* object, validates its type, and executes ``yield from`` on it. The ``__await__`` method is similar to ``__cocall__``, but is only used to define *Future-like* objects.
2. ``await`` is defined in almost the same way as ``yield from`` in the grammar (it is later enforced that ``await`` can only be inside ``async def``). It is possible to simply write ``await future``, whereas ``cocall`` always requires parentheses.
3. To make asyncio work with PEP 3152 it would be required to modify ``@asyncio.coroutine`` decorator to wrap all functions in an object with a ``__cocall__`` method. To call *cofunctions* from existing generator-based coroutines it would be required to use ``costart`` built-in. In this proposal ``@asyncio.coroutine`` simply sets ``CO_COROUTINE`` on the wrapped function's code object and everything works automatically.
4. Since it is impossible to call a *cofunction* without a ``cocall`` keyword, it automatically prevents the common mistake of forgetting to use ``yield from`` on generator-based coroutines. This proposal addresses this problem with a different approach, see `Debugging Features`_.
5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine is that if it is decided to implement coroutine-generators -- coroutines with ``yield`` or ``async yield`` expressions -- we wouldn't need a ``cocall`` keyword to call them. So we would end up having ``__cocall__`` and no ``__call__`` for regular coroutines, and having ``__call__`` and no ``__cocall__`` for coroutine-generators.
6. There are no equivalents of ``async for`` and ``async with`` in PEP 3152.
Coroutine-generators
--------------------
With the ``async for`` keyword it is desirable to have a concept of a *coroutine-generator* -- a coroutine with ``yield`` and ``yield from`` expressions. To avoid any ambiguity with regular generators, we would likely require an ``async`` keyword before ``yield``, and ``async yield from`` would raise a ``StopAsyncIteration`` exception.
While it is possible to implement coroutine-generators, we believe that they are out of scope of this proposal. It is an advanced concept that should be carefully considered and balanced, with non-trivial changes in the implementation of the current generator objects. This is a matter for a separate PEP.
No implicit wrapping in Futures
-------------------------------
There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A key difference is that JavaScript "async functions" always return a Promise. While this approach has some advantages, it also implies that a new Promise object is created on each "async function" invocation.
We could implement a similar functionality in Python, by wrapping all coroutines in a Future object, but this has the following disadvantages:
1. Performance. A new Future object would be instantiated on each coroutine call. Moreover, this makes implementation of ``await`` expressions slower (disabling optimizations of ``yield from``).
2. A new built-in ``Future`` object would need to be added.
3. Coming up with a generic ``Future`` interface that is usable for any use case in any framework is a very hard problem to solve.
4. It is not a feature that is used frequently, when most of the code is coroutines.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:
* C# has had it for a long time [5]_;
* proposal to add async/await in ECMAScript 7 [2]_; see also Traceur project [9]_;
* Facebook's Hack/HHVM [6]_;
* Google's Dart language [7]_;
* Scala [8]_;
* proposal to add async/await to C++ [10]_;
* and many other less popular languages.
This is a huge benefit, as some users already have experience with async/await, and because it makes working with many languages in one project easier (Python with ECMAScript 7 for instance).
Why "__aiter__" is a coroutine
------------------------------
In principle, ``__aiter__`` could be a regular function. There are several good reasons to make it a coroutine:
* since ``__anext__``, ``__aenter__``, and ``__aexit__`` are usually coroutines, users would often mistakenly define ``__aiter__`` as ``async`` anyway;
* there might be a need to run some asynchronous operations in ``__aiter__``, for instance to prepare DB queries or do some file operation.
Importance of "async" keyword
-----------------------------
While it is possible to just implement the ``await`` expression and treat all functions with at least one ``await`` as coroutines, this approach makes API design, code refactoring, and long-term maintenance harder.
Let's pretend that Python only has ``await`` keyword::
    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()
If the ``useful()`` function is refactored and someone removes all ``await`` expressions from it, it would become a regular Python function, and all code that depends on it, including ``important()``, would be broken. To mitigate this issue a decorator similar to ``@asyncio.coroutine`` has to be introduced.
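With ``async def`` the same refactoring is safe: a coroutine remains a coroutine even after its last ``await`` is removed. A minimal sketch (function names are illustrative; ``asyncio.run`` is used only as a driver):

```python
import asyncio

async def useful():
    # All 'await' expressions were refactored away, but thanks to
    # 'async def' this is still a coroutine, not a plain function.
    return 'done'

async def important():
    # Callers keep working unchanged.
    return await useful()

print(asyncio.run(important()))   # done
```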
Why "async def"
---------------
For some people the bare ``async name(): pass`` syntax might look more appealing than ``async def name(): pass``. It is certainly easier to type. But on the other hand, it breaks the symmetry between ``async def``, ``async with`` and ``async for``, where ``async`` is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar.
Why not a __future__ import
---------------------------
``__future__`` imports are inconvenient and easy to forget to add. Also, they are enabled for the whole source file. Consider that there is a big project with a popular module named "async.py". With future imports it is required to either import it using ``__import__()`` or ``importlib.import_module()`` calls, or to rename the module. The proposed approach makes it possible to continue using old code and modules without a hassle, while coming up with a migration plan for future python versions.
Why magic methods start with "a"
--------------------------------
The new asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". An alternative proposal is to use the "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``. However, to align the new magic methods with the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided to use a shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations::
    class CM:
        async def __enter__(self): # instead of __aenter__
            ...
This approach has the following downsides:
* it would not be possible to create an object that works in both ``with`` and ``async with`` statements;
* it would look confusing and would require some implicit magic behind the scenes in the interpreter;
* one of the main points of this proposal is to make coroutines as simple and foolproof as possible.
Comprehensions
--------------
For the sake of restricting the broadness of this PEP there is no new syntax for asynchronous comprehensions. This should be considered in a separate PEP, if there is a strong demand for this feature.
Async lambdas
-------------
Lambda coroutines are not part of this proposal. In this proposal they would look like ``async lambda(parameters): expression``. Unless there is a strong demand to have them as part of this proposal, it is recommended to consider them later in a separate PEP.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is an output of python's official set of benchmarks [4]_:
::
python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe
[skipped]
Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64 i386
Total CPU cores: 8
    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger
The following not significant results are hidden, use -v to show them: django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle, fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown of parsing python files with the modified tokenizer: parsing of one 12Mb file (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount of time.
async/await
-----------
The following micro-benchmark was used to determine performance difference between "async" functions and generators::
    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1 - t0))
The result is that there is no observable performance difference. Minimum timing of 3 runs
::
    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await`` keyword.
2. New ``__await__`` method for Future-like objects.
3. New syntax for asynchronous context managers: ``async with``. And associated protocol with ``__aenter__`` and ``__aexit__`` methods.
4. New syntax for asynchronous iteration: ``async for``. And associated protocol with ``__aiter__``, ``__anext__`` and new built-in exception ``StopAsyncIteration``.
5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, ``Await``.
6. New functions: ``sys.set_coroutine_wrapper(callback)``, ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``.
7. New ``CO_COROUTINE`` bit flag for code objects.
While the list of changes and new things is not short, it is important to understand that most users will not use these features directly. They are intended to be used in frameworks and libraries to provide users with convenient and unambiguous APIs with ``async def``, ``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be tested.
::
    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-funct...
.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
_______________________________________________ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/tds333%2Bpydev%40gmail.co...
-- bye by Wolfgang

Hi, I agree with most of Wolfgang's points below. As a data point, I haven't used asyncio for anything real (despite having approved the PEP!), but I have some extensive prior experience with Twisted and Tornado :-) Regards Antoine. On Thu, 23 Apr 2015 09:30:30 +0200 Wolfgang Langner <tds333+pydev@gmail.com> wrote:
Hi,
Most of the time I am a silent reader, but in this discussion I must step in. I have used Twisted and async stuff a lot over the years and have followed the development of asyncio closely.
First, it is good to differentiate async coroutines from generators, so everyone can see it, keep it in mind, and not mix up the two. It is also easier to explain for new users. Sometimes generators blow their mind and it takes a while to get used to them. Async stuff is even harder.
1. I am fine with using something special instead of "yield" or "yield from" for this. C# "await" is ok.
Everything else suggested complicates the language and makes it harder to read.
2. async def f(): is harder to read and something special; it also breaks the symmetry in front (def indent). Also, all existing tooling must be changed to support it. Same for def async and def f() async:. I think a decorator is enough here; @coroutine def f(): is the best solution to mark something as a coroutine.
3. async with and async for: bad idea, we clutter the language even more and it is one more thing every newbie could get wrong.

for x in y:
    result = await f()

is enough; every 'async' framework has lived without it for years. Same for the with statement.
The main use case suggested was for database stuff, and that is also where most are best served by deferring the work to a thread and keeping it non-async.
Altogether, it is very late in the development cycle for 3.5 to incorporate such a big change. It would be best to give all this some more time and defer it to 3.6, with some alpha releases to experiment with.
Regards,
Wolfgang

On Thu, Apr 23, 2015 at 10:30 AM, Wolfgang Langner <tds333+pydev@gmail.com> wrote:
Hi,
Most of the time I am a silent reader, but in this discussion I must step in. I have used Twisted and async stuff a lot over the years and have followed the development of asyncio closely.
First, it is good to differentiate async coroutines from generators, so everyone can see it, keep it in mind, and not mix up the two. It is also easier to explain for new users. Sometimes generators blow their mind and it takes a while to get used to them. Async stuff is even harder.
1. I am fine with using something special instead of "yield" or "yield from" for this. C# "await" is ok.
Everything else suggested complicates the language and makes it harder to read.
2. async def f(): is harder to read and something special; it also breaks the symmetry in front (def indent). Also, all existing tooling must be changed to support it. Same for def async and def f() async:. I think a decorator is enough here; @coroutine def f(): is the best solution to mark something as a coroutine.
Sorry, the `@coroutine` decorator is part of Python semantics, not Python syntax. That means the Python compiler cannot rely on @coroutine during parsing. `await`, `async for` and `async with` are available only inside `async def`, not inside a regular `def`.
3. async with and async for: bad idea, we clutter the language even more and it is one more thing every newbie could get wrong.

for x in y:
    result = await f()

is enough; every 'async' framework has lived without it for years.
async for i in iterable:
    pass

is not equal to

for fut in iterable:
    i = yield from fut

`async for` is also suitable when the size of the iterable sequence is unknown at iteration start. Look, for example, at the Redis SCAN command (http://redis.io/commands/scan) for iterating over redis keys. It returns a bulk of keys and an opaque value for the next SCAN call. In current Python it must be written as

cur = 0
while True:
    bulk, cur = yield from redis.scan(cur)
    for key in bulk:
        process_key(key)
    if cur == 0:
        break

With the new syntax the iteration looks much more native:

async for key in redis.scan(cur):
    process_key(key)
Same for with statement.
The main use case suggested was for database stuff, and that is also where most are best served by deferring the work to a thread and keeping it non-async.
`async with` is not only for transactions in relational databases, just as plain `with` is used not only for databases but for many other things -- from files to decimal contexts. As a realistic example not related to databases, recall RabbitMQ. Every message processing may be acknowledged. The native API for that may be

while True:
    async with channel.consume("my queue") as msg:
        process_msg(msg)

with acknowledgement on successful message processing only. Acknowledgement requires network communication and must be an asynchronous operation.
Altogether, it is very late in the development cycle for 3.5 to incorporate such a big change. It would be best to give all this some more time and defer it to 3.6, with some alpha releases to experiment with.
Regards,
Wolfgang
On Tue, Apr 21, 2015 at 7:26 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov@sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015
Abstract
========
This PEP introduces new syntax for coroutines, asynchronous ``with`` statements and ``for`` loops. The main motivation behind this proposal is to streamline writing and maintaining asynchronous code, as well as to simplify previously hard to implement code patterns.
Rationale and Goals
===================
Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the ``yield from`` syntax introduced in PEP 380. This approach has a number of shortcomings:
* it is easy to confuse coroutines with regular generators, since they share the same syntax; async libraries often attempt to alleviate this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
* it is not possible to natively define a coroutine which has no ``yield`` or ``yield from`` statements, again requiring the use of decorators to fix potential refactoring issues;
* support for asynchronous calls is limited to expressions where ``yield`` is allowed syntactically, limiting the usefulness of syntactic features, such as ``with`` and ``for`` statements.
This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new ``async with`` statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new ``async for`` statement makes it possible to perform asynchronous calls in iterators.
Specification
=============
This proposal introduces new syntax and semantics to enhance coroutine support in Python; it does not change the internal implementation of coroutines, which are still based on generators.
It is strongly suggested that the reader understands how coroutines are implemented in Python (PEP 342 and PEP 380). It is also recommended to read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions).
From this point in this document we use the word *coroutine* to refer to functions declared using the new syntax. *generator-based coroutine* is used where necessary to refer to coroutines that are based on generator syntax.
New Coroutine Declaration Syntax
--------------------------------
The following new syntax is used to declare a coroutine::
    async def read_data(db):
        pass
Key properties of coroutines:
* ``async def`` functions are always coroutines, even if they do not contain ``await`` expressions.
* It is a ``SyntaxError`` to have ``yield`` or ``yield from`` expressions in an ``async`` function.
* Internally, a new code object flag - ``CO_COROUTINE`` - is introduced to enable runtime detection of coroutines (and migrating existing code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` flags set.
* Regular generators, when called, return a *generator object*; similarly, coroutines return a *coroutine object*.
* ``StopIteration`` exceptions are not propagated out of coroutines, and are replaced with a ``RuntimeError``. For regular generators such behavior requires a future import (see PEP 479).
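These properties are easy to verify in current Python; a minimal sketch (the function name is illustrative, and ``asyncio.run`` is used only as a driver):

```python
import asyncio

async def read_data():
    # No 'await' inside -- still a coroutine, because of 'async def'.
    return 'data'

coro = read_data()           # the body has not started executing yet
print(type(coro).__name__)   # coroutine
print(asyncio.run(coro))     # data
```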
types.coroutine()
-----------------
A new function ``coroutine(gen)`` is added to the ``types`` module. It applies the ``CO_COROUTINE`` flag to the passed generator function's code object, making it return a *coroutine object* when called.
This feature enables an easy upgrade path for existing libraries.
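A minimal sketch of that upgrade path in current Python (``types.coroutine`` still exists there; the bare ``yield`` stands in for whatever a framework yields to its event loop, and ``asyncio.run`` is only a driver):

```python
import asyncio
import types

@types.coroutine
def legacy_op(x):
    # A generator-based coroutine: after types.coroutine() it can be
    # awaited directly from an 'async def' coroutine.
    yield            # suspend once, as a Future normally would
    return x * 2

async def caller():
    return await legacy_op(21)

print(asyncio.run(caller()))   # 42
```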
Await Expression
----------------
The following new ``await`` expression is used to obtain a result of coroutine execution::
    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...
``await``, similarly to ``yield from``, suspends execution of the ``read_data`` coroutine until the ``db.fetch`` *awaitable* completes and returns the result data.
It uses the ``yield from`` implementation with an extra step of validating its argument. ``await`` only accepts an *awaitable*, which can be one of:
* A *coroutine object* returned from a *coroutine* or a generator decorated with ``types.coroutine()``.
* An object with an ``__await__`` method returning an iterator.
Any ``yield from`` chain of calls ends with a ``yield``. This is a fundamental mechanism of how *Futures* are implemented. Since, internally, coroutines are a special kind of generators, every ``await`` is suspended by a ``yield`` somewhere down the chain of ``await`` calls (please refer to PEP 3156 for a detailed explanation.)
To enable this behavior for coroutines, a new magic method called ``__await__`` is added. In asyncio, for instance, to enable Future objects in ``await`` expressions, the only change needed is to add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class.
Objects with an ``__await__`` method are called *Future-like* objects in the rest of this PEP.
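A minimal sketch of a *Future-like* object: the hypothetical ``Ready`` class below completes immediately, so awaiting it never suspends (the coroutine is again driven by hand):

```python
class Ready:
    """Hypothetical Future-like object that is already complete."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        # __await__ must return an iterator; here it is a generator
        # that finishes immediately, so 'await Ready(...)' never suspends
        return self.value
        yield  # unreachable; its presence makes __await__ a generator

async def demo():
    return (await Ready(5)) + 1

coro = demo()
try:
    coro.send(None)
except StopIteration as exc:
    result = exc.value
```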
Also, please note that ``__aiter__`` method (see its definition below) cannot be used for this purpose. It is a different protocol, and would be like using ``__iter__`` instead of ``__call__`` for regular callables.
It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
Asynchronous Context Managers and "async with"
----------------------------------------------
An *asynchronous context manager* is a context manager that is able to suspend execution in its *enter* and *exit* methods.
To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: ``__aenter__`` and ``__aexit__``. Both must return an *awaitable*.
An example of an asynchronous context manager::
    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')
New Syntax
''''''''''
A new statement for asynchronous context managers is proposed::
    async with EXPR as VAR:
        BLOCK
which is semantically equivalent to::
    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise
    finally:
        if exc:
            await aexit(mgr, None, None, None)
As with regular ``with`` statements, it is possible to specify multiple context managers in a single ``async with`` statement.
It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine.
Example
'''''''
With asynchronous context managers it is easy to implement proper database transaction managers for coroutines::
    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...
Code that needs locking also looks lighter::
    async with lock:
        ...
instead of::
    with (yield from lock):
        ...
Asynchronous Iterators and "async for"
--------------------------------------
An *asynchronous iterable* is able to call asynchronous code in its *iter* implementation, and an *asynchronous iterator* can call asynchronous code in its *next* method. To support asynchronous iteration:
1. An object must implement an ``__aiter__`` method returning an *awaitable* resulting in an *asynchronous iterator object*.
2. An *asynchronous iterator object* must implement an ``__anext__`` method returning an *awaitable*.
3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` exception.
An example of asynchronous iterable::
    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...
New Syntax
''''''''''
A new statement for iterating through asynchronous iterators is proposed::
    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2
which is semantically equivalent to::
    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2
It is an error to pass a regular iterable without ``__aiter__`` method to ``async for``. It is a ``SyntaxError`` to use ``async for`` outside of a coroutine.
As with the regular ``for`` statement, ``async for`` has an optional ``else`` clause.
Example 1
'''''''''
With asynchronous iteration protocol it is possible to asynchronously buffer data during iteration::
    async for data in cursor:
        ...
Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows of data from a database after every ``N`` iterations.
The following code illustrates the new asynchronous iteration protocol::
    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
            if not self.buffer:
                raise StopAsyncIteration
            return self.buffer.popleft()
then the ``Cursor`` class can be used as follows::
    async for row in Cursor():
        print(row)
which would be equivalent to the following code::
    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)
Example 2
'''''''''
The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators.
::
    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for item in AsyncIteratorWrapper("abc"):
        print(item)
Why StopAsyncIteration?
'''''''''''''''''''''''
Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between
::
    def g1():
        yield from fut
        return 'spam'
and
::
    def g2():
        yield from fut
        raise StopIteration('spam')
Since PEP 479 is accepted and enabled by default for coroutines, the following example will have its ``StopIteration`` wrapped in a ``RuntimeError``
::
    async def a1():
        await fut
        raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is to raise something other than ``StopIteration``. Therefore, a new built-in exception class ``StopAsyncIteration`` was added.
Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions raised in coroutines are wrapped in ``RuntimeError``.
Debugging Features
------------------
One of the most frequent mistakes that people make when using generators as coroutines is forgetting to use ``yield from``::
    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'
For debugging this kind of mistake there is a special debug mode in asyncio, in which the ``@coroutine`` decorator wraps all functions with a special object with a destructor logging a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, the stack trace of where it was collected, etc. The wrapper object also provides a convenient ``__repr__`` function with detailed information about the generator.
The only problem is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, ``@coroutine`` decorator makes the decision of whether to wrap or not to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is possible to run asyncio programs with asyncio's own functions instrumented. ``EventLoop.set_debug``, a different debug facility, has no impact on ``@coroutine`` decorator's behavior.
With this proposal, coroutines are a native concept, distinct from generators. New functions ``set_coroutine_wrapper`` and ``get_coroutine_wrapper`` are added to the ``sys`` module, with which frameworks can provide advanced debugging facilities.
It is also important to make coroutines as fast and efficient as possible, therefore there are no debug features enabled by default.
Example::
    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.AsyncDebugWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me()  # <- this line will likely GC the coroutine object and
                # trigger AsyncDebugWrapper's code.

    assert isinstance(debug_me(), AsyncDebugWrapper)

    sys.set_coroutine_wrapper(None)  # <- this unsets any previously
                                     #    set wrapper
    assert not isinstance(debug_me(), AsyncDebugWrapper)
If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)`` unsets the wrapper.
Glossary
========
:Coroutine: A coroutine function, or just "coroutine", is declared with ``async def``. It uses ``await`` and ``return value``; see `New Coroutine Declaration Syntax`_ for details.
:Coroutine object: Returned from a coroutine function. See `Await Expression`_ for details.
:Future-like object: An object with an ``__await__`` method. Can be consumed by an ``await`` expression in a coroutine. A coroutine waiting for a Future-like object is suspended until the Future-like object's ``__await__`` completes, and returns the result. See `Await Expression`_ for details.
:Awaitable: A *Future-like* object or a *coroutine object*. See `Await Expression`_ for details.
:Generator-based coroutine: Coroutines based on generator syntax. The most common example is ``@asyncio.coroutine``.
:Asynchronous context manager: An asynchronous context manager has ``__aenter__`` and ``__aexit__`` methods and can be used with ``async with``. See `Asynchronous Context Managers and "async with"`_ for details.
:Asynchronous iterable: An object with an ``__aiter__`` method, which must return an *asynchronous iterator* object. Can be used with ``async for``. See `Asynchronous Iterators and "async for"`_ for details.
:Asynchronous iterator: An asynchronous iterator has an ``__anext__`` method. See `Asynchronous Iterators and "async for"`_ for details.
List of functions and methods
=============================
================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================
Where:
* "async def func": coroutine;
* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined with the ``async`` keyword;
* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined without the ``async`` keyword, must return an *awaitable*;
* "def __await__": ``__await__`` method to implement *Future-like* objects;
* generator: a "regular" generator, a function defined with ``def`` that contains at least one ``yield`` or ``yield from`` expression.
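The "def __a*__" row above can be sketched concretely: a plain (non-``async``) ``__aenter__``/``__aexit__`` must *return* an awaitable rather than being a coroutine itself. The helper ``_Done`` class below is hypothetical, a minimal awaitable that completes immediately, and the coroutine is driven by hand:

```python
class _Done:
    """Hypothetical helper: an awaitable that completes immediately."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        return self.value
        yield  # unreachable; its presence makes __await__ a generator

class CM:
    # the "def __a*__" row of the table: plain methods that must
    # return an awaitable rather than being coroutines themselves
    def __aenter__(self):
        return _Done(self)

    def __aexit__(self, exc_type, exc, tb):
        return _Done(False)

async def use():
    async with CM() as cm:
        return type(cm).__name__

coro = use()
try:
    coro.send(None)
except StopIteration as exc:
    entered = exc.value
```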
Transition Plan
===============
To avoid backwards compatibility issues with the ``async`` and ``await`` keywords, it was decided to modify ``tokenizer.c`` in such a way that it:
* recognizes ``async def`` name tokens combination (start of a coroutine);
* keeps track of regular functions and coroutines;
* replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with ``AWAIT`` when in the process of yielding tokens for coroutines.
This approach allows for seamless combination of new syntax features (all of them available only in ``async`` functions) with any existing code.
An example of having "async def" and an "async" attribute in one piece of code::
    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'
Backwards Compatibility
-----------------------
This proposal preserves 100% backwards compatibility.
Grammar Updates
---------------
Grammar changes are also fairly minimal::
    await_expr: AWAIT test
    await_stmt: await_expr

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                    | funcdef | classdef | decorated | async_stmt)

    atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
           '[' [testlist_comp] ']' |
           '{' [dictorsetmaker] '}' |
           NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')

    expr_stmt: testlist_star_expr (augassign (yield_expr|await_expr|testlist) |
                         ('=' (yield_expr|await_expr|testlist_star_expr))*)
Transition Period Shortcomings
------------------------------
There is just one.
Until ``async`` and ``await`` become proper keywords, it is not possible (or at least very hard) to fix ``tokenizer.c`` to recognize them on the **same line** with the ``def`` keyword::
    # async and await will always be parsed as variables

    async def outer():                             # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)            # 2
Since ``await`` and ``async`` in such cases are parsed as ``NAME`` tokens, a ``SyntaxError`` will be raised.
To work around these issues, the above examples can be easily rewritten to a more readable form::
    async def outer():                             # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                               # 2
        return (await fut)
This limitation will go away as soon as ``async`` and ``await`` are proper keywords, or if it is decided to use a future import for this PEP.
Deprecation Plans
-----------------
``async`` and ``await`` names will be softly deprecated in CPython 3.5 and 3.6. In 3.7 we will transform them to proper keywords. Making ``async`` and ``await`` proper keywords before 3.7 might make it harder for people to port their code to Python 3.
asyncio
-------
The ``asyncio`` module was adapted and tested to work with coroutines and the new statements. Backwards compatibility is 100% preserved.
The required changes are mainly:
1. Modify ``@asyncio.coroutine`` decorator to use new ``types.coroutine()`` function.
2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
3. Add ``ensure_task()`` as an alias for the ``async()`` function. Deprecate the ``async()`` function.
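A sketch of how change (1) above might look in practice. The ``coroutine`` decorator and ``legacy`` function below are hypothetical simplifications, not asyncio's actual implementation, and the coroutine is driven by hand instead of through an event loop:

```python
import inspect
import types

def coroutine(func):
    # hypothetical sketch: for generator functions the decorator
    # reduces to types.coroutine(); everything else is left untouched
    if inspect.isgeneratorfunction(func):
        return types.coroutine(func)
    return func

@coroutine
def legacy(x):
    yield              # stands in for 'yield from some_future'
    return x * 2

async def modern():
    return await legacy(21)

coro = modern()
coro.send(None)        # advance past the suspension point
try:
    coro.send(None)
except StopIteration as exc:
    result = exc.value
```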
Design Considerations
=====================
PEP 3152
--------
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines (called "cofunctions"). Some key points:
1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is always a generator, even if there are no ``cocall`` expressions inside it. Maps to ``async def`` in this proposal.
2. A new keyword ``cocall`` to call a *cofunction*. Can only be used inside a *cofunction*. Maps to ``await`` in this proposal (with some differences, see below.)
3. It is not possible to call a *cofunction* without a ``cocall`` keyword.
4. ``cocall`` grammatically requires parentheses after it::
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
5. ``cocall f(*args, **kwds)`` is semantically equivalent to ``yield from f.__cocall__(*args, **kwds)``.
Differences from this proposal:
1. There is no equivalent of ``__cocall__`` in this PEP. In PEP 3152, ``__cocall__`` is called and its result is passed to ``yield from`` in the ``cocall`` expression. The ``await`` keyword expects an *awaitable* object, validates its type, and executes ``yield from`` on it. The ``__await__`` method is similar to ``__cocall__``, but is only used to define *Future-like* objects.
2. ``await`` is defined in almost the same way as ``yield from`` in the grammar (it is later enforced that ``await`` can only be inside ``async def``). It is possible to simply write ``await future``, whereas ``cocall`` always requires parentheses.
3. To make asyncio work with PEP 3152 it would be required to modify ``@asyncio.coroutine`` decorator to wrap all functions in an object with a ``__cocall__`` method. To call *cofunctions* from existing generator-based coroutines it would be required to use ``costart`` built-in. In this proposal ``@asyncio.coroutine`` simply sets ``CO_COROUTINE`` on the wrapped function's code object and everything works automatically.
4. Since it is impossible to call a *cofunction* without a ``cocall`` keyword, it automatically prevents the common mistake of forgetting to use ``yield from`` on generator-based coroutines. This proposal addresses this problem with a different approach, see `Debugging Features`_.
5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine is that if it is decided to implement coroutine-generators -- coroutines with ``yield`` or ``async yield`` expressions -- we wouldn't need a ``cocall`` keyword to call them. So we'll end up having ``__cocall__`` and no ``__call__`` for regular coroutines, and having ``__call__`` and no ``__cocall__`` for coroutine-generators.
6. There are no equivalents of ``async for`` and ``async with`` in PEP 3152.
Coroutine-generators
--------------------
With the ``async for`` keyword it is desirable to have a concept of a *coroutine-generator* -- a coroutine with ``yield`` and ``yield from`` expressions. To avoid any ambiguity with regular generators, we would likely require an ``async`` keyword before ``yield``, and ``async yield from`` would raise a ``StopAsyncIteration`` exception.
While it is possible to implement coroutine-generators, we believe that they are out of scope of this proposal. It is an advanced concept that should be carefully considered and balanced, and it requires non-trivial changes in the implementation of current generator objects. This is a matter for a separate PEP.
No implicit wrapping in Futures
-------------------------------
There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A key difference is that JavaScript "async functions" always return a Promise. While this approach has some advantages, it also implies that a new Promise object is created on each "async function" invocation.
We could implement a similar functionality in Python, by wrapping all coroutines in a Future object, but this has the following disadvantages:
1. Performance. A new Future object would be instantiated on each coroutine call. Moreover, this makes implementation of ``await`` expressions slower (disabling optimizations of ``yield from``).
2. A new built-in ``Future`` object would need to be added.
3. Coming up with a generic ``Future`` interface that is usable for any use case in any framework is a very hard problem to solve.
4. It is not a feature that is used frequently, when most of the code is coroutines.
Why "async" and "await" keywords
--------------------------------
async/await is not a new concept in programming languages:
* C# has had it for a long time [5]_;
* proposal to add async/await in ECMAScript 7 [2]_; see also Traceur project [9]_;
* Facebook's Hack/HHVM [6]_;
* Google's Dart language [7]_;
* Scala [8]_;
* proposal to add async/await to C++ [10]_;
* and many other less popular languages.
This is a huge benefit, as some users already have experience with async/await, and because it makes working with many languages in one project easier (Python with ECMAScript 7 for instance).
Why "__aiter__" is a coroutine
------------------------------
In principle, ``__aiter__`` could be a regular function. There are several good reasons to make it a coroutine:
* since most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` methods are coroutines, users would often mistakenly define ``__aiter__`` as ``async`` anyway;
* there might be a need to run some asynchronous operations in ``__aiter__``, for instance to prepare DB queries or do some file operation.
Importance of "async" keyword
-----------------------------
While it is possible to just implement ``await`` expressions and treat all functions with at least one ``await`` as coroutines, this approach makes API design, code refactoring, and long-term support harder.
Let's pretend that Python only has ``await`` keyword::
    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()
If the ``useful()`` function is refactored and someone removes all ``await`` expressions from it, it would become a regular Python function, and all code that depends on it, including ``important()``, would be broken. To mitigate this issue a decorator similar to ``@asyncio.coroutine`` has to be introduced.
Why "async def"
---------------
For some people bare ``async name(): pass`` syntax might look more appealing than ``async def name(): pass``. It is certainly easier to type. But on the other hand, it breaks the symmetry between ``async def``, ``async with`` and ``async for``, where ``async`` is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar.
Why not a __future__ import
---------------------------
``__future__`` imports are inconvenient and easy to forget to add. Also, they are enabled for the whole source file. Consider that there is a big project with a popular module named "async.py". With future imports it is required to either import it using ``__import__()`` or ``importlib.import_module()`` calls, or to rename the module. The proposed approach makes it possible to continue using old code and modules without a hassle, while coming up with a migration plan for future python versions.
Why magic methods start with "a"
--------------------------------
New asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". An alternative proposal is to use the "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``. However, to align new magic methods with the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided to use a shorter version.
Why not reuse existing magic names
----------------------------------
An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations::
    class CM:
        async def __enter__(self):  # instead of __aenter__
            ...
This approach has the following downsides:
* it would not be possible to create an object that works in both ``with`` and ``async with`` statements;
* it would look confusing and would require some implicit magic behind the scenes in the interpreter;
* one of the main points of this proposal is to make coroutines as simple and foolproof as possible.
Comprehensions
--------------
For the sake of restricting the broadness of this PEP there is no new syntax for asynchronous comprehensions. This should be considered in a separate PEP, if there is a strong demand for this feature.
Async lambdas
-------------
Lambda coroutines are not part of this proposal. In this proposal they would look like ``async lambda(parameters): expression``. Unless there is a strong demand to have them as part of this proposal, it is recommended to consider them later in a separate PEP.
Performance
===========
Overall Impact
--------------
This proposal introduces no observable performance impact. Here is an output of python's official set of benchmarks [4]_:
::
    python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015;
    root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show them:
    django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
    fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
-----------------------
There is no observable slowdown of parsing python files with the modified tokenizer: parsing of one 12Mb file (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount of time.
async/await
-----------
The following micro-benchmark was used to determine performance difference between "async" functions and generators::
    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1 - t0))
The result is that there is no observable performance difference. Minimum timing of 3 runs
::
    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
========================
The reference implementation can be found here: [3]_.
List of high-level changes and new protocols
--------------------------------------------
1. New syntax for defining coroutines: ``async def`` and new ``await`` keyword.
2. New ``__await__`` method for Future-like objects.
3. New syntax for asynchronous context managers: ``async with``. And associated protocol with ``__aenter__`` and ``__aexit__`` methods.
4. New syntax for asynchronous iteration: ``async for``. And associated protocol with ``__aiter__``, ``__anext__`` and a new built-in exception ``StopAsyncIteration``.
5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, ``Await``.
6. New functions: ``sys.set_coroutine_wrapper(callback)``, ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``.
7. New ``CO_COROUTINE`` bit flag for code objects.
While the list of changes and new things is not short, it is important to understand that most users will not use these features directly. They are intended to be used by frameworks and libraries to provide users with convenient, unambiguous APIs built on the ``async def``, ``await``, ``async for`` and ``async with`` syntax.
Working example
---------------
All concepts proposed in this PEP are implemented [3]_ and can be tested.
::
    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()
References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-funct...
.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
Acknowledgments
===============
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew Svetlov, and Łukasz Langa for their initial feedback.
Copyright
=========
This document has been placed in the public domain.
_______________________________________________ Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/tds333%2Bpydev%40gmail.co...
-- bye by Wolfgang
-- Thanks, Andrew Svetlov

Hello,

On Thu, 23 Apr 2015 12:18:51 +0300
Andrew Svetlov <andrew.svetlov@gmail.com> wrote:

[]
3. async with and async for

Bad idea, we clutter the language even more and it is one more thing every newbie could do wrong.

    for x in y:
        result = await f()

is enough, every 'async' framework lived without it over years.
async for i in iterable: pass
is not equal for
for fut in iterable: i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad. I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away. Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too. -- Best regards, Paul mailto:pmiscml@gmail.com

On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3. async with and async for Bad idea, we clutter the language even more and it is one more thing every newbie could do wrong. for x in y: result = await f() is enough, every 'async' framework lived without it over years.
async for i in iterable: pass
is not equal for
for fut in iterable: i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad nor not needed, but the syntax is not beautiful and for the 90% not doing async stuff irritating and one more thing to learn and do right/wrong. I had also a need for async loop. But there are other solutions like channels, not needing a new syntax. Also possible a function returning futures and yield in the loop with a sentinel. All this goes the road down to a producer consumer pattern. Nothing more.
I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away.
Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too.
This has nothing to do with people using twisted or other async frameworks like tornado. I think a coroutine should be first class. But all this should be done in a way a beginner can handle and not design this stuff for experts only. If we do this we scare away new people. This can be done step by step. No need to hurry. And finally we have stackless Python but explicit. ;-) -- bye by Wolfgang

On Thu, Apr 23, 2015 at 3:27 PM, Wolfgang Langner <tds333+pydev@gmail.com> wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3. async with and async for Bad idea, we clutter the language even more and it is one more thing every newbie could do wrong. for x in y: result = await f() is enough, every 'async' framework lived without it over years.
async for i in iterable: pass
is not equal for
for fut in iterable: i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad nor not needed, but the syntax is not beautiful and for the 90% not doing async stuff irritating and one more thing to learn and do right/wrong.
I also had a need for an async loop. But there are other solutions, like channels, that don't need new syntax.
By `channels` do you mean something like `asyncio.Queue`? It requires producer and consumer to be separate tasks. Often that works (with some performance cost), but creating two tasks is not always the obvious way to solve a problem.
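A minimal sketch of the channel approach being discussed, using `asyncio.Queue` in today's syntax (the function names here are illustrative, not from the thread): the producer and consumer must indeed run as two separate tasks.

```python
import asyncio

async def producer(queue):
    # Feed three items, then a sentinel so the consumer knows to stop.
    for i in range(3):
        await queue.put(i)
    await queue.put(None)

async def consumer(queue, results):
    # Pull items until the sentinel arrives.
    while True:
        item = await queue.get()
        if item is None:
            break
        results.append(item)

def run_channel_demo():
    results = []

    async def main():
        queue = asyncio.Queue()
        await asyncio.gather(producer(queue), consumer(queue, results))

    asyncio.run(main())
    return results
```

Note the extra ceremony: two coroutines plus a queue and a sentinel, just to express what is logically a single asynchronous loop.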
Also possible is a function returning futures, and yield in the loop with a sentinel. The proposal looks like an attempt to avoid the `for` loop and use `while` everywhere.
Just compare the `while` loop:
    it = iter(it)
    while True:
        try:
            i = next(it)
            process(i)
        except StopIteration:
            break
with the `for` alternative:
    for i in it:
        process(i)
All this goes down the road to a producer-consumer pattern. Nothing more.
I think one of the most convenient producer-consumer pattern implementations in Python is the `for` loop and the iterator concept. It's sometimes too limited, but it works pretty well in 95% of use cases.
-- Thanks, Andrew Svetlov

Hi Wolfgang, On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3. async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who have used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong. There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
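The protocol Yury describes can be sketched in what became the final syntax (a hedged sketch in today's Python; the exact protocol details were still being finalized at the time of this thread, and `Ticker` is a hypothetical example, not an asyncio API):

```python
import asyncio

class Ticker:
    """Asynchronously yields 0 .. n-1; each step may suspend."""
    def __init__(self, n):
        self.i = 0
        self.n = n

    def __aiter__(self):
        # async for rejects any object that lacks __aiter__.
        return self

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        await asyncio.sleep(0)  # a point where the loop may yield control
        self.i += 1
        return self.i - 1

async def collect():
    result = []
    async for value in Ticker(3):
        result.append(value)
    return result

def run_aiter_demo():
    return asyncio.run(collect())
```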
    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()
is certainly more irritating than
    async with connection.transaction():
        ...
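What `async with` buys over the manual try/except/rollback dance can be sketched with an object implementing `__aenter__`/`__aexit__`. The `Transaction` class below is illustrative, not a real database API:

```python
import asyncio

class Transaction:
    """Records begin/commit/rollback so the control flow is visible."""
    def __init__(self, log):
        self.log = log

    async def __aenter__(self):
        self.log.append('begin')
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Commit on success, roll back on error -- the same logic the
        # manual try/except/else dance spells out at every call site.
        self.log.append('commit' if exc_type is None else 'rollback')
        return False  # don't swallow exceptions

async def demo():
    log = []
    async with Transaction(log):
        pass  # the successful path
    try:
        async with Transaction(log):
            raise RuntimeError('boom')  # the failing path
    except RuntimeError:
        pass
    return log

def run_awith_demo():
    return asyncio.run(demo())
```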
I also had a need for an async loop. But there are other solutions, like channels, that don't need new syntax.
Also possible is a function returning futures, and yield in the loop with a sentinel.
All this goes down the road to a producer-consumer pattern. Nothing more.
I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away.
Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too.
This has nothing to do with people using twisted or other async frameworks like tornado. I think a coroutine should be first class. But all this should be done in a way a beginner can handle and not design this stuff for experts only.
I think that most of the async frameworks out there are for experts only. Precisely because of 'yield from', 'yield', inlineCallbacks, '@coroutine', channels and other stuff. PEP 492 will make it all easier. And Twisted can use its features too.
If we do this we scare away new people.
It doesn't scare away anyone. async/await was one of the most anticipated features in Dart and JavaScript, and one of the most popular features in C#. Yury

On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()
is certainly more irritating than
    async with connection.transaction():
        ...
I also had a need for an async loop. But there are other solutions, like channels, that don't need new syntax.
Also possible is a function returning futures, and yield in the loop with a sentinel.
All this goes down the road to a producer-consumer pattern. Nothing more.
I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away.
Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too.
This has nothing to do with people using twisted or other async frameworks like tornado. I think a coroutine should be first class. But all this should be done in a way a beginner can handle and not design this stuff for experts only.
I think that most of the async frameworks out there are for experts only. Precisely because of 'yield from', 'yield', inlineCallbacks, '@coroutine', channels and other stuff.
PEP 492 will make it all easier. And Twisted can use its features too.
Yes, and it is good to make it easier. But don't complicate it for others. Beginners will be confronted with all this new syntax and may feel lost: oh, I have two different for loops, one with async. Same for the with statement.
If we do this we
scare away new people.
It doesn't scare away anyone. async/await was one of the most anticipated features in Dart and JavaScript, and one of the most popular features in C#.
I like it in C#. I like await for Python, but I don't like async there and how it is specified. I still think a decorator is enough, and no special for and with syntax. async in JavaScript is for executing a whole script asynchronously, used in the script tag. Dart is for the Google universe, with little usage outside. -- bye by Wolfgang

Hi Wolfgang, On 2015-04-23 11:57 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()
is certainly more irritating than
    async with connection.transaction():
        ...
I also had a need for an async loop. But there are other solutions, like channels, that don't need new syntax.
Also possible is a function returning futures, and yield in the loop with a sentinel.
All this goes down the road to a producer-consumer pattern. Nothing more.
I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away.
Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too.
This has nothing to do with people using twisted or other async frameworks like tornado. I think a coroutine should be first class. But all this should be done in a way a beginner can handle and not design this stuff for experts only.
I think that most of the async frameworks out there are for experts only. Precisely because of 'yield from', 'yield', inlineCallbacks, '@coroutine', channels and other stuff.
PEP 492 will make it all easier. And Twisted can use its features too.
Yes, and it is good to make it easier. But don't complicate it for others. Beginners will be confronted with all this new syntax and may feel lost: oh, I have two different for loops, one with async. Same for the with statement.
Absolute beginners don't write async code and HTTP servers. And when they start doing things like that, they're not someone who can't understand 'async' and 'await'. It's not harder than '@coroutine' and 'yield from'. What is hard, even for experienced users, is to understand the difference between 'yield from' and 'yield', how asyncio works at its core, and why you should always use the former, etc. Unfortunately, I just can't agree with you here. Too much time has been spent trying to explain to people how to write asyncio code, and it's hard. PEP 492 will make it easier.
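The 'yield' vs 'yield from' confusion mentioned here can be shown with plain generators (a sketch, not asyncio code; the function names are illustrative):

```python
import types

def inner():
    yield 1
    yield 2

def delegating():
    # 'yield from' drives inner() to completion: the caller sees 1, 2.
    yield from inner()

def not_delegating():
    # A bare 'yield' hands the generator object itself to the caller --
    # in asyncio terms, the awaited operation would never actually run.
    yield inner()

def compare():
    first = next(not_delegating())
    return list(delegating()), isinstance(first, types.GeneratorType)
```

Both definitions look almost identical, yet only one delegates -- which is exactly the mistake that trips up asyncio newcomers.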
If we do this we
scare away new people.
It doesn't scare away anyone. async/await were the most awaited features in dart and javascript. One of the most popular features in c#.
I like it in C#. I like await for Python but I don't like async there and how to specify it. I still think a decorator is enough and no special for and with syntax.
Please read the "Debugging Features" section in the PEP. It explains why a decorator is not enough. The "Rationale" section also stresses some points. The decorator solution is "enough", but it's far from ideal. It is harder for IDEs and linters to support. What if you write "from asyncio import coroutine as coro"? You have to analyze the code statically to reason about what "@coroutine" is. Moreover, you need to enable good IDE support for other frameworks too.
async in JavaScript is for executing a whole script asynchronously, used in the script tag. Dart is for the Google universe, with little usage outside.
Please read the proposal to add async/await to JS (referenced in the PEP). It's very likely to be accepted, because JS is asynchronous to its very core, and it's a pain to program in it with callbacks. yields in JS will mitigate the problem, but not completely, so it's only a matter of time. The PEP has a good list of other languages with async/await. Thank you, Yury

On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
I don't mean it can be done wrong at the execution or syntax level. I mean that for a beginner it is not as easy any more, and some will try async in the wrong places; yes, they will get an error. But there is a new possibility to get such errors if async exists for with and for statements. And the next beginner will then implement __aiter__ instead of __iter__ because he or she doesn't get it. On the other side, I like "await" and the __await__ implementation. Symmetric, good, easy to explain, the same as "int" and "__int__" and all the others.
    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()
is certainly more irritating than
    async with connection.transaction():
        ...
Sorry, but until now I have used async stuff with database access by doing the database work in an extra thread in sync mode. No performance problems, and I can use all the well-maintained database libraries. Also, Twisted's RDBMS support (adbapi) is enough here. At first I thought it would not be enough, or too slow, but this was not the case. https://twistedmatrix.com/documents/current/core/howto/rdbms.html Here I am in line with Mike Bayer: http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/

Wolfgang, On 2015-04-23 12:12 PM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com> wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
I don't mean it can be done wrong at the execution or syntax level. I mean that for a beginner it is not as easy any more, and some will try async in the wrong places; yes, they will get an error. But there is a new possibility to get such errors if async exists for with and for statements. And the next beginner will then implement __aiter__ instead of __iter__ because he or she doesn't get it.
Sorry, Wolfgang, but I don't get your argument. Beginners shouldn't just randomly try to use statements. There is documentation for that. Plus, we can make exception messages better.
On the other side I like "await" and __await__ implementation. Symmetric good easy to explain, same with "int" and "__int__" and all others.
Glad to hear that! ;)
    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()
is certainly more irritating than
    async with connection.transaction():
        ...
Sorry, but until now I have used async stuff with database access by doing the database work in an extra thread in sync mode. No performance problems, and I can use all the well-maintained database libraries. Also, Twisted's RDBMS support (adbapi) is enough here. At first I thought it would not be enough, or too slow, but this was not the case. https://twistedmatrix.com/documents/current/core/howto/rdbms.html
Here I am in line with Mike Bayer: http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/
It's a good article; I read it. The PEP will help those who don't want to use threads and want to use coroutines. There are no fundamental reasons why coroutines are slower than threads, and they will only be improved. There is a fundamental reason to avoid threads in Python, though: they can harm performance drastically. And there are some fundamental reasons why Mike will always love threads: we won't ever have an asynchronous __getattr__, so SQLAlchemy won't be as elegant in async mode as it is in sync mode. Thanks! Yury

On Thu, Apr 23, 2015 at 6:22 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Wolfgang,
On 2015-04-23 12:12 PM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com>
wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
I don't mean it can be done wrong at the execution or syntax level. I mean that for a beginner it is not as easy any more, and some will try async in the wrong places; yes, they will get an error. But there is a new possibility to get such errors if async exists for with and for statements. And the next beginner will then implement __aiter__ instead of __iter__ because he or she doesn't get it.
Sorry, Wolfgang, but I don't get your argument. Beginners shouldn't just randomly try to use statements. There is documentation for that. Plus, we can make exception messages better.
I've had to coach a lot of new users to Python, and some to async stuff in Twisted. And beginners don't care about what they shouldn't do. ;-) They will do strange stuff and go other ways than you expect. I only want to make it as easy as possible for them, nothing more: fewer keywords, fewer ways to do something, ideally only one way to do it.
Don't get me wrong, I like the PEP; it is well written and covers a lot of areas of async programming. I know Python must improve in this area and has a lot of potential. But don't rush it; give people time to try it and let it mature. If all this is to be in 3.5, it is too early.
Also, we can avoid the async keyword completely and do the same as for generators: if there is an await, it is a coroutine. IDEs can detect generators and must only be extended to detect coroutines. A function with yield returns a generator; a function with await returns a coroutine.
Move async for and async with to another PEP and handle them later, or with a different syntax using the new await keyword. Only one new keyword would be a good improvement for async programming. Sorry, I still don't like the word async in a language, sprinkled everywhere. And I still have the feeling that if we provide an async for loop, we must next provide async generator expressions or list comprehensions or ... it never ends ;-)
The only downside of await converting a function to a coroutine is that it is not explicitly marked. But if we care about that, what about generators and yield? -- bye by Wolfgang

Wolfgang, On 2015-04-23 12:58 PM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 6:22 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Wolfgang,
On 2015-04-23 12:12 PM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi Wolfgang,
On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky <pmiscml@gmail.com>
wrote:
Hello,
On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov <andrew.svetlov@gmail.com> wrote:
[]
3.
async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
    async for i in iterable:
        pass
is not equal to
    for fut in iterable:
        i = yield from fut
But people who have used Twisted all their life don't know that! They just know that "async for" is not needed and bad.
I don't think it is bad or unneeded, but the syntax is not beautiful, and for the 90% who aren't doing async stuff it is irritating and one more thing to learn and get right or wrong.
There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.
I don't mean it can be done wrong at the execution or syntax level. I mean that for a beginner it is not as easy any more, and some will try async in the wrong places; yes, they will get an error. But there is a new possibility to get such errors if async exists for with and for statements. And the next beginner will then implement __aiter__ instead of __iter__ because he or she doesn't get it.
Sorry, Wolfgang, but I don't get your argument. Beginners shouldn't just randomly try to use statements. There is documentation for that. Plus, we can make exception messages better.
I've had to coach a lot of new users to Python, and some to async stuff in Twisted. And beginners don't care about what they shouldn't do. ;-) They will do strange stuff and go other ways than you expect. I only want to make it as easy as possible for them, nothing more: fewer keywords, fewer ways to do something, ideally only one way to do it.
Don't get me wrong, I like the PEP; it is well written and covers a lot of areas of async programming. I know Python must improve in this area and has a lot of potential. But don't rush it; give people time to try it and let it mature. If all this is to be in 3.5, it is too early.
The thing about this PEP is that it's built on existing concepts that were validated with asyncio. As for whether there is enough time to review it -- that's up to the BDFL to decide. I'm doing my best to get the reference implementation reviewed and to address all questions in the PEP, hoping that it will help. I can only say that if it doesn't land in 3.5, we'll have to wait another *1.5 years*. And it's not that people will download CPython 3.6.alpha0 and start rewriting their code and playing with it. In the meantime, people want more good reasons to migrate to Python 3, and I strongly believe that this PEP is a great reason. Moreover, it's several months before 3.5 is released. We will still be able to slightly alter the behaviour and gather feedback during the beta period.
Also, we can avoid the async keyword completely and do the same as for generators: if there is an await, it is a coroutine.
Unfortunately, there is a problem with this approach: refactoring code becomes much harder. That's one of the corner cases that the @coroutine decorator solves; see https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword Thanks, Yury
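The same hazard already exists today with generators, where the mere presence of `yield` decides a function's type. A sketch (the function names are hypothetical, for illustration only):

```python
import types

def plain():
    # An ordinary function: calling it runs the body and returns 1.
    return 1

def gen():
    # Identical body except for a dead 'yield'. Deleting that one line
    # during refactoring would silently turn gen() back into a plain
    # call -- the same trap an implicit "await makes it a coroutine"
    # rule would create.
    if False:
        yield
    return 1

def type_kinds():
    return (isinstance(plain(), int),
            isinstance(gen(), types.GeneratorType))
```

An explicit `async def` marker makes the function's type independent of whether its body currently happens to contain an `await`.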

Hi, On 2015-04-23 3:30 AM, Wolfgang Langner wrote:
Hi,
most of the time I am a silent reader, but in this discussion I must step in. I have used Twisted and async stuff a lot over the years and have followed the development of asyncio closely.
First, it is good to differentiate async coroutines from generators, so everyone can see it, keep it in mind, and not mix up the two. It is also easier to explain to new users. Sometimes generators blow their minds, and it takes a while to get used to them. Async stuff is even harder.
1. I am fine with using something special instead of "yield" or "yield from" for this. C#'s "await" is ok.
Everything else suggested complicates the language and makes it harder to read.
2. async def f(): is harder to read and something special; it also breaks the symmetry in front (the def indent). Also, all existing tooling must be changed to support it. The same goes for def async and def f() async:. I think a decorator is enough here:
    @coroutine
    def f():
is the best solution to mark something as a coroutine.
You can't combine a keyword (await) with a runtime decorator. Also, it's harder for tools to support @coroutine / @inlineCallbacks than "async".
3. async with and async for
Bad idea, we clutter the language even more, and it is one more thing every newbie could get wrong.
    for x in y:
        result = await f()
is enough; every 'async' framework has lived without it for years.
I only lived without it because I used greenlets for async for's & with's. There must be a native language concept to do these things.
Same for with statement.
The main use case suggested was database stuff, and this is also where most are best served by deferring the work to a thread and keeping it non-async.
All together it is very late in the development cycle for 3.5 to incorporate such a big change.
The PEP isn't the result of some quick brainstorming. It's the result of long experience using asyncio and working around many pain points of async programming.
Best is to give all this some more time and defer it to 3.6 and some alpha releases to experiment with.
There is a reference implementation. asyncio is fully ported; every package for asyncio should work. You can experiment right now and find a real issue why the PEP doesn't work. Yury

On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote:
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.
There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on the proposed new syntax. Why "async def" and not "def async"? My concern is about existing tools that already know that "def" as the first non-whitespace on the line starts a function/method definition. Think of a regexp in an IDE that searches backwards from the current line to find the function it's defined in. Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?
    def async useful():
seems okay to me. Probably the biggest impact on the PEP would be symmetry with asynchronous with and for. What about:
    with async lock:
and
    for async data in cursor:
That would also preserve at least some behavior of existing tools. Anyway, since the PEP doesn't explicitly describe this as an alternative, I want to bring it up. (I have mild concerns about __a*__ magic methods, since I think they'll be harder to visually separate, but here the PEP does describe the __async_*__ alternatives.) Cheers, -Barry

Hi Barry, On 2015-04-23 1:51 PM, Barry Warsaw wrote:
On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote:
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email. There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on the proposed new syntax. Thanks!
Why "async def" and not "def async"?
To my eye 'async def name()', 'async with', 'async for' look better than 'def async name()', 'with async' and 'for async'. But that's highly subjective. I also read "for async item in iter:" as "I'm iterating iter with async item".
My concern is about existing tools that already know that "def" as the first non-whitespace on the line starts a function/method definition. Think of a regexp in an IDE that searches backwards from the current line to find the function it's defined in. Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?
def async useful():
seems okay to me.
Probably the biggest impact on the PEP would be symmetry with asynchronous with and for. What about:
with async lock:
and
for async data in cursor:
That would also preserve at least some behavior of existing tools.
Anyway, since the PEP doesn't explicitly describe this as an alternative, I want to bring it up.
(I have mild concerns about __a*__ magic methods, since I think they'll be harder to visually separate, but here the PEP does describe the __async_*__ alternatives.)
Anyway, I'm open to changing the order of keywords if most people like it that way. Same for the __async_*__ naming. Thanks! Yury

On Apr 23, 2015, at 02:02 PM, Yury Selivanov wrote:
To my eye 'async def name()', 'async with', 'async for' look better than 'def async name()', 'with async' and 'for async'. But that's highly subjective.
Would you be willing to add this as an alternative to the PEP, under the "Why async def" section probably? As with all such bikesheds, Guido will pick the color and we'll ooh and ahh. :) Cheers, -Barry

Barry, On 2015-04-23 2:12 PM, Barry Warsaw wrote:
On Apr 23, 2015, at 02:02 PM, Yury Selivanov wrote:
To my eye 'async def name()', 'async with', 'async for' look better than 'def async name()', 'with async' and 'for async'. But that's highly subjective. Would you be willing to add this as an alternative to the PEP, under the "Why async def" section probably?
As with all such bikesheds, Guido will pick the color and we'll ooh and ahh. :)
Sure! I think it's a great idea: https://hg.python.org/peps/rev/8cb4c0ab0931 https://hg.python.org/peps/rev/ec319bf4c86e Thanks! Yury

Yury Selivanov writes:
To my eye 'async def name()', 'async with', 'async for' look better than 'def async name()', 'with async' and 'for async'. But that's highly subjective.
I'm with Barry on this one as far as looks go. (But count that as a +0, since I'm just a literary critic, I don't use coroutines in anger at present.)
I also read "for async item in iter:" as "I'm iterating iter with async item".
I thought that was precisely the intended semantics: item is available asynchronously. Again, count as a +0. FWIW, etc.

Stephen J. Turnbull wrote:
Yury Selivanov writes:
I also read "for async item in iter:" as "I'm iterating iter with async item".
I thought that was precisely the intended semantics: item is available asynchronously.
The async-at-the-end idea could be used here as well:
    for item in iter async:
        ...
    with something as x async:
        ...
-- Greg

On 2015-04-23 18:51, Barry Warsaw wrote:
On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote:
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.
There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on the proposed new syntax.
Why "async def" and not "def async"?
My concern is about existing tools that already know that "def" as the first non-whitespace on the line starts a function/method definition. Think of a regexp in an IDE that searches backwards from the current line to find the function it's defined in. Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?
def async useful():
seems okay to me.
Probably the biggest impact on the PEP would be symmetry with asynchronous with and for. What about:
with async lock:
and
for async data in cursor:
That would also preserve at least some behavior of existing tools.
Anyway, since the PEP doesn't explicitly describe this as an alternative, I want to bring it up.
(I have mild concerns about __a*__ magic methods, since I think they'll be harder to visually separate, but here the PEP does describe the __async_*__ alternatives.)
On the other hand, existing tools might be expecting "def" and "for" to be followed by a name.

On Apr 23, 2015, at 10:51 AM, Barry Warsaw <barry@python.org> wrote:
(I have mild concerns about __a*__ magic methods, since I think they'll be harder to visually separate, but here the PEP does describe the __async_*__ alternatives.)
Has it been historically a problem with __iadd__ vs __radd__ vs __add__, __ior__ vs __ror__ vs __or__, etc.? -- Best regards, Łukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev

Barry Warsaw wrote:
Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?
def async useful():
seems okay to me.
That will break any tool that assumes the word following 'def' is the name of the function being defined. Putting it at the end would seem least likely to cause breakage:

def useful() async:

-- Greg

On Thu, Apr 23, 2015 at 01:51:52PM -0400, Barry Warsaw wrote:
Why "async def" and not "def async"?
My concern is about existing tools that already know that "def" as the first non-whitespace on the line starts a function/method definition. Think of a regexp in an IDE that searches backwards from the current line to find the function it's defined in. Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?
Surely it's the other way? If I'm searching for the definition of a function manually, I search for "def spam". `async def spam` will still be found, while `def async spam` will not. It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are going to break whichever choice is made, while a less pedantic search like r"def\s+spam\s*\(" will work only if async comes first. -- Steve
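[For what it's worth, Steven's point can be checked directly. A quick sketch — the two patterns are the hypothetical examples from this thread, and `spam` is just the placeholder function name, not taken from any real tool:]

```python
import re

# An anchored, "pedantic" pattern and a looser one, as discussed above.
anchored = re.compile(r"^\s*def\s+spam\s*\(")
loose = re.compile(r"def\s+spam\s*\(")

print(bool(loose.search("async def spam():")))     # True: "def" still directly precedes the name
print(bool(loose.search("def async spam():")))     # False: "async" now splits "def" from the name
print(bool(anchored.search("async def spam():")))  # False: "def" is no longer at line start
print(bool(anchored.search("  def spam(x):")))     # True: the plain form still matches
```

So with 'async def', at least the loose style of search keeps working; with 'def async', both styles break.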

On Fri, Apr 24, 2015 at 09:32:51AM -0400, Barry Warsaw wrote:
On Apr 24, 2015, at 11:17 PM, Steven D'Aprano wrote:
It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are
They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which will hit "def async spam" but not "async def".
Unless somebody wants to do a survey of editors and IDEs and other tools, arguments about what regex they may or may not use to search for function definitions are an exercise in futility. They may use regexes anchored to the start of the line. They may not. They may deal with "def async" better than "async def", or the other way around. Either way, it's a pretty thin argument for breaking the invariant that the token following `def` is the name of the function. Whatever new syntax is added, something is going to break. -- Steve

Wild idea: Let "@" mean "async" when it's directly in front of a keyword. Then we would have:

@def f(): ...

@for x in iter: ...

@with context as thing: ...

-- Greg

I used to think in the same way but found the result looks like Perl (or Haskell), not Python. On Sat, Apr 25, 2015 at 7:47 AM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Wild idea:
Let "@" mean "async" when it's directly in front of a keyword.
Then we would have:
@def f(): ...
@for x in iter: ...
@with context as thing: ...
-- Greg
-- Thanks, Andrew Svetlov

On Apr 24, 2015, at 6:32 AM, Barry Warsaw <barry@python.org> wrote:
On Apr 24, 2015, at 11:17 PM, Steven D'Aprano wrote:
It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are
They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which will hit "def async spam" but not "async def".
Realistically that can’t be what they’re doing because of multiple string literals, internal-scope functions, etc. But I agree with Steven that guessing here is pointless. More importantly, consider: - if we optimize for some unproven backwards compatibility with tools, we’re sacrificing better readability of “async def foo()” - if that tool wants to work with Python 3.5, it’ll still have to support “await” so we’re going to be incompatible anyway; let alone “async for” and “async with” So all in all, I don’t buy this argument. -- Best regards, Łukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev

Hello, On Fri, 24 Apr 2015 12:04:27 -0700 Łukasz Langa <lukasz@langa.pl> wrote: []
They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which will hit "def async spam" but not "async def".
Realistically that can’t be what they’re doing because of multiple string literals, internal-scope functions, etc. But I agree with Steven that guessing here is pointless. More importantly, consider:
- if we optimize for some unproven backwards compatibility with tools, we’re sacrificing better readability of “async def foo()”
Yes, so hopefully another argument prevails: the sooner they break, the sooner they're fixed (no irony here, I really consider it strange to optimize language syntax based on background auxiliary utilities' features or misfeatures). [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Tue, 21 Apr 2015 at 18:27 Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi python-dev,
I'm moving the discussion from python-ideas to here.
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.
Hi Yury, Having read this very interesting PEP I would like to make two remarks. I apologise in advance if they are points which have already been discussed. 1. About the 'async for' construct. Each iteration will create a new coroutine object (the one returned by Cursor.__anext__()) and it seems to me that it can be wasteful. In the example given of an 'aiterable' Cursor class, probably a large number of rows will fill the cursor buffer in one call of cursor._prefetch(). However each row that is iterated over will involve the creation and execution of a new coroutine object. It seems to me that what is desirable in that case is that all the buffered rows will be iterated over as in a plain for loop. 2. I think the semantics of the new coroutine objects could be defined more clearly in the PEP. Of course they are pretty obvious when you know that the coroutines are meant to replace asyncio.coroutine as described in [1]. I understand that this PEP is mostly for the benefit of asyncio, hence mainly of interest to people who know it. However I think it would be good for it to be more self-contained. I have often read a PEP as an introduction to a new feature of Python. I feel that if I was not familiar with yield from and asyncio I would not be able to understand this PEP, even though potentially one could use the new constructs without knowing anything about them. Cheers, -- Arnaud Delobelle [1] https://docs.python.org/3/library/asyncio-task.html#coroutines

Hi Arnaud, On 2015-04-25 4:47 PM, Arnaud Delobelle wrote:
On Tue, 21 Apr 2015 at 18:27 Yury Selivanov <yselivanov.ml@gmail.com> wrote:
Hi python-dev,
I'm moving the discussion from python-ideas to here.
The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.
Hi Yury,
Having read this very interesting PEP I would like to make two remarks. I apologise in advance if they are points which have already been discussed.
1. About the 'async for' construct. Each iteration will create a new coroutine object (the one returned by Cursor.__anext__()) and it seems to me that it can be wasteful. In the example given of an 'aiterable' Cursor class, probably a large number of rows will fill the cursor buffer in one call of cursor._prefetch(). However each row that is iterated over will involve the creation and execution of a new coroutine object. It seems to me that what is desirable in that case is that all the buffered rows will be iterated over as in a plain for loop.
I agree that creating a new coroutine object is a little bit wasteful. However, the proposed iteration protocol was designed to: 1. Resemble the already existing __iter__/__next__/StopIteration protocol; 2. Pave the road to introduce coroutine-generators in the future. We could, in theory, design the protocol to make the __anext__ awaitable return a regular iterator (and raise StopAsyncIteration at the end) to make things more efficient, but that would complicate the protocol tremendously, and make it very hard to program and debug. My opinion is that this has to be addressed in 3.6 with coroutine-generators if there is enough interest from Python users.
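[For readers unfamiliar with the protocol under discussion, here is a minimal sketch of PEP 492-style async iteration. The class name, the _prefetch() helper and the batch size are invented for illustration, and __aiter__ is written as a plain method returning self — the form CPython eventually settled on:]

```python
import asyncio

class Cursor:
    # Illustrative async iterable: rows arrive in batches, as in the
    # PEP's database-cursor example.
    def __init__(self, rows):
        self._rows = list(rows)
        self._buffer = []

    def __aiter__(self):
        return self

    async def _prefetch(self):
        await asyncio.sleep(0)  # stand-in for real asynchronous I/O
        batch, self._rows = self._rows[:2], self._rows[2:]
        return batch

    async def __anext__(self):
        # 'async for' awaits a *fresh* __anext__ coroutine object on every
        # iteration -- the overhead Arnaud points out -- even when the
        # buffer already holds prefetched rows.
        if not self._buffer:
            self._buffer = await self._prefetch()
        if not self._buffer:
            raise StopAsyncIteration
        return self._buffer.pop(0)

async def main():
    result = []
    async for row in Cursor([1, 2, 3]):
        result.append(row)
    return result

print(asyncio.run(main()))  # [1, 2, 3]
```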
2. I think the semantics of the new coroutine objects could be defined more clearly in the PEP. Of course they are pretty obvious when you know that the coroutines are meant to replace asyncio.coroutine as described in [1]. I understand that this PEP is mostly for the benefit of asyncio, hence mainly of interest to people who know it. However I think it would be good for it to be more self-contained. I have often read a PEP as an introduction to a new feature of Python. I feel that if I was not familiar with yield from and asyncio I would not be able to understand this PEP, even though potentially one could use the new constructs without knowing anything about them.
I agree. I plan to update the PEP with some new semantics (prohibit passing coroutine-objects to iter(), tuple() and other builtins, as well as using them in 'for .. in coro()' loops). I'll add a section with a more detailed explanation of coroutine-objects. Best, Yury
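[The restriction Yury proposes here is in fact how native coroutines ended up behaving. A small sketch of the intended semantics, as observable on a modern CPython — the name `coro` is just an example:]

```python
# Native coroutine objects do not implement __iter__, so iter(), list(),
# tuple() and plain 'for' loops all reject them with a TypeError.

async def coro():
    return 42

c = coro()
try:
    iter(c)
    raised = False
except TypeError:
    raised = True
c.close()      # suppress the "coroutine was never awaited" warning

print(raised)  # True
```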

On 25 April 2015 at 22:02, Yury Selivanov <yselivanov.ml@gmail.com> wrote: [...]
On 2015-04-25 4:47 PM, Arnaud Delobelle wrote: [...]
1. About the 'async for' construct. Each iteration will create a new coroutine object (the one returned by Cursor.__anext__()) and it seems to me that it can be wasteful. In the example given of an 'aiterable' Cursor class, probably a large number of rows will fill the cursor buffer in one call of cursor._prefetch(). However each row that is iterated over will involve the creation and execution of a new coroutine object. It seems to me that what is desirable in that case is that all the buffered rows will be iterated over as in a plain for loop.
I agree that creating a new coroutine object is a little bit wasteful.
However, the proposed iteration protocol was designed to:
1. Resemble already existing __iter__/__next__/StopIteration protocol;
2. Pave the road to introduce coroutine-generators in the future.
Do you mean that __aiter__() would return a 'coroutine-generator'? I'm not sure what such an object is but if it is a suspendable generator in the same way that a coroutine is a suspendable function, then this is a strong argument to make the __aiter__() magic method a coroutine rather than a plain function. I.e. __aiter__() would return either an 'aiterator' or a 'coroutine generator object'. I think this could be mentioned in the section 'Why "__aiter__" is a coroutine' [1].
We could, in theory, design the protocol to make the __anext__ awaitable return a regular iterator (and raise StopAsyncIteration at the end) to make things more efficient, but that would complicate the protocol tremendously, and make it very hard to program and debug.
My opinion is that this has to be addressed in 3.6 with coroutine-generators if there is enough interest from Python users.
True, but to me this is bound to happen. I feel like the semantics of __anext__() is tied to the behaviour of this yet-to-be-defined coroutine generator object, and that if it turns out that the natural behaviour of a coroutine generator is not consistent with the semantics of __anext__() then it would be a shame. I must say I have no evidence that this will happen!
2. I think the semantics of the new coroutine objects could be defined more clearly in the PEP. Of course they are pretty obvious when you know that the coroutines are meant to replace asyncio.coroutine as described in [1]. I understand that this PEP is mostly for the benefit of asyncio, hence mainly of interest to people who know it. However I think it would be good for it to be more self-contained. I have often read a PEP as an introduction to a new feature of Python. I feel that if I was not familiar with yield from and asyncio I would not be able to understand this PEP, even though potentially one could use the new constructs without knowing anything about them.
I agree. I plan to update the PEP with some new semantics (prohibit passing coroutine-objects to iter(), tuple() and other builtins, as well as using them in 'for .. in coro()' loops). I'll add a section with a more detailed explanation of coroutine-objects.
Great! Thanks, -- Arnaud PS: there's a slight asymmetry in the terminology between coroutines and generators. 'Generator functions' are to 'generators' what 'coroutines' are to 'coroutine objects', which makes it difficult to know what one is talking about when referring to a 'coroutine generator'. [1] https://www.python.org/dev/peps/pep-0492/#id52

On 25 Apr 2015 at 23:02, "Yury Selivanov" <yselivanov.ml@gmail.com> wrote:
I agree. I plan to update the PEP with some new semantics (prohibit passing coroutine-objects to iter(), tuple() and other builtins, as well as using them in 'for .. in coro()' loops). I'll add a section with a more detailed explanation of coroutine-objects.
Guido rejected PEP 3152 because it disallows some use cases (create a coroutine and then wait for it later). Be careful not to introduce a similar limitation in PEP 492. There is an open issue to change how to check if an object is a coroutine: see issues 24004 and 24018. Well, the issue doesn't propose to remove the checks (they are useful to detect common bugs), but to accept generators implemented with other types, for Cython. I spent some time in asyncio validating input types because it was too easy to misuse the API. Example: .call_soon(coro_func) was allowed, whereas it's an obvious bug. See asyncio tests and type checks for more cases. Victor
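[asyncio's debug mode did later grow exactly the kind of check Victor describes. A sketch of that behaviour on a modern Python (3.7+); the names `work` and `main` are illustrative:]

```python
import asyncio

async def work():
    return 42

async def main():
    loop = asyncio.get_running_loop()
    loop.set_debug(True)  # the callback type check is performed in debug mode
    try:
        # Victor's example: a coroutine function passed where a plain
        # callback is expected.  Without a check it would just be called,
        # producing a coroutine object that is silently never awaited.
        loop.call_soon(work)
        return False
    except TypeError:
        return True  # asyncio rejects it up front

print(asyncio.run(main()))
```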
participants (19)
- Andrew Svetlov
- Antoine Pitrou
- Arnaud Delobelle
- Barry Warsaw
- Ethan Furman
- Greg
- Greg Ewing
- Guido van Rossum
- Ludovic Gasc
- MRAB
- Paul Sokolovsky
- PJ Eby
- Rajiv Kumar
- Stephen J. Turnbull
- Steven D'Aprano
- Victor Stinner
- Wolfgang Langner
- Yury Selivanov
- Łukasz Langa