[Python-ideas] The async API of the future: yield-from

Calvin Spealman ironfroggy at gmail.com
Mon Oct 15 14:31:22 CEST 2012


A thought about more ways we could control groups of tasks, and avoid
yield-from, just came to me this morning.

def asset_packer(asset_urls):
    with yield until_all as results:
        for url in asset_urls:
            yield http.get(url)
    return pack(results)

or

def handle_or_cached(url):
    with yield first as result:
        yield http.get(url)
        yield cache.get(url)
    return result

Currently, "with yield expr:" is not valid syntax, surprisingly. This gives us
room to use it for something new: a generator-sensitive context manager.
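As a rough illustration of what the scheduler side of this could look like,
here is a minimal sketch that emulates the "until_all" grouping with sentinel
markers instead of the proposed syntax. All the names here (UNTIL_ALL,
END_GROUP, run) are hypothetical, and the "async ops" are plain callables run
synchronously, just to show the shape of the protocol:

```python
# Hypothetical sketch: emulate "wait until all" grouping with sentinel
# markers, since "with yield until_all:" is not real syntax today.

UNTIL_ALL = object()   # marker: "collect the following ops as one group"
END_GROUP = object()   # marker: "the group is complete; send me the results"

def run(gen):
    """Tiny driver: ops yielded between the two markers are gathered and
    their results sent back in one batch (here they run synchronously)."""
    value = None
    try:
        while True:
            yielded = gen.send(value)
            value = None
            if yielded is UNTIL_ALL:
                ops = []
                nxt = gen.send(None)
                while nxt is not END_GROUP:
                    ops.append(nxt)
                    nxt = gen.send(None)
                value = [op() for op in ops]   # results for the whole group
    except StopIteration as stop:
        return stop.value

def asset_packer(asset_urls):
    yield UNTIL_ALL
    for url in asset_urls:
        # stand-in for http.get(url): a callable producing the "response"
        yield (lambda u=url: "body-of-" + u)
    results = yield END_GROUP
    return ",".join(results)   # stand-in for pack(results)
```

With that driver, run(asset_packer(["a", "b"])) returns "body-of-a,body-of-b":
the generator suspends once for the whole group and resumes with every result
at once, which is the behavior the "with yield until_all" block is after.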

One option is just to allow the syntax directly: the generator yields, and the
sent value is used as a context manager. This would let the generator tell the
scheduler "I'm going to give you a few different async ops, and I want to wait
for all of them before I continue," etc. However, it leaves open the question
of how the scheduler knows the context manager has ended. Could it somehow
indicate this to the correct scheduler in __exit__?
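One answer is for the scheduler itself to hand out the context manager, so
that the manager holds a reference back to it and __exit__ becomes the
"block ended" notification. A toy sketch under that assumption (Scheduler,
UntilAll, and submit are all made-up names, and the ops are plain callables):

```python
class Scheduler:
    """Toy scheduler: queues ops during a group, runs them when told
    the group's block has ended."""
    def __init__(self):
        self.pending = []

    def submit(self, op):
        self.pending.append(op)   # an "async op", here a plain callable

    def begin_group(self, group):
        self.pending = []

    def end_group(self, group):
        # __exit__ landing here is how the scheduler learns the block ended
        group.results.extend(op() for op in self.pending)
        self.pending = []

class UntilAll:
    """Context manager the scheduler could hand to the generator."""
    def __init__(self, scheduler):
        self.scheduler = scheduler
        self.results = []

    def __enter__(self):
        self.scheduler.begin_group(self)
        return self.results

    def __exit__(self, exc_type, exc, tb):
        self.scheduler.end_group(self)   # notify the *correct* scheduler
        return False
```

Because the scheduler constructed the UntilAll object, __exit__ has an
unambiguous path back to it, so no thread-local or global lookup is needed.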

Another option, if we're adding new syntax anyway, is to make "with yield
expr:" special and yield first the result of __enter__() and then, after the
block is done, yield the result of __exit__(), which lets context blocks in
the generator talk to the scheduler both before and after.

Maybe we don't need the second, nuttier idea. But I like the general idea.
It feels right.

On Mon, Oct 15, 2012 at 8:08 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On Mon, Oct 15, 2012 at 8:25 PM, Calvin Spealman <ironfroggy at gmail.com> wrote:
>> The more I follow this thread the less I understand the point of
>> introducing a new use for yield-from in this discussion.
>
> +1. To me, "yield from" is just a tool that brings generators back to
> parity with functions when it comes to breaking up a larger algorithm
> into smaller pieces. Where you would break a function out into
> subfunctions and call them normally, with a generator you can break
> out subgenerators and invoke them with yield from.
>
> Any meaningful use of "yield from" in the coroutine context *has* to
> ultimately devolve to an operation that:
> 1. Asks the scheduler to schedule another operation
> 2. Waits for that operation to complete
>
> Guido's approach to that problem is that step 1 is handled by calling
> functions that in turn call methods on a thread-local scheduler. These
> methods return Future objects, which can subsequently be yielded to
> the scheduler to say "I'm waiting for this future to be set".
>
> I *thought* Greg's way combined step 1 and step 2 into a single
> operation: the objects you yield *not only* say what you want to wait
> for, but also what you want to do. However, his example par()
> implementation killed that idea, since it turned out to need to
> schedule tasks explicitly, rather than there being an "execute this in
> parallel" option.
>
> So now I'm back to thinking that Greg and Guido are talking about
> different levels. *Any* scheduling option will be able to be collapsed
> into an async task invoked by "yield from" by writing:
>
>     def simple_async_task():
>         return yield start_task()
>
> The part that still needs to be figured out is how you turn that
> suspend/resume communications channel between the lowest level of the
> task stack and the scheduling loop into something usable, as well as
> how you handle iteration in a sensible way (I described my preferred
> approach when writing about the API I'd like to see for an async
> version of as_completed). I haven't seen anything to suggest that
> "yield from"'s role should change from what it is in 3.3: a way to
> factor out generators into multiple pieces without breaking send()
> and throw().
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
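For what it's worth, Nick's two-level picture can be made concrete in a few
lines: the lowest frame yields a Future-like object, everything above it uses
"yield from", and a trivial loop completes the future and resumes the stack.
Future, start_task, caller, and loop are stand-in names here, and this future
"completes" synchronously, so it only shows the shape of the channel:

```python
class Future:
    """Stand-in future: just wraps the work that will produce its result."""
    def __init__(self, work):
        self.work = work

def start_task():
    return Future(lambda: 42)   # pretend this was scheduled somewhere

def simple_async_task():
    # Lowest level of the task stack: yield a Future to the loop,
    # get resumed with its result.
    result = yield start_task()
    return result

def caller():
    # Everything above the lowest level is ordinary "yield from" factoring.
    value = yield from simple_async_task()
    return value + 1

def loop(task):
    """Trivial scheduling loop: complete each yielded future, send the
    result back down the suspend/resume channel."""
    value = None
    try:
        while True:
            fut = task.send(value)
            value = fut.work()   # "complete" the future immediately
    except StopIteration as stop:
        return stop.value
```

Here loop(caller()) returns 43: the Future travels up through the "yield
from" chain untouched, and the result travels back down the same channel,
which is the collapse Nick describes.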


-- 
Read my blog! I depend on your acceptance of my opinion! I am interesting!
http://techblog.ironfroggy.com/
Follow me if you're into that sort of thing: http://www.twitter.com/ironfroggy


