[Python-ideas] The async API of the future: yield-from

Calvin Spealman ironfroggy at gmail.com
Mon Oct 15 12:25:16 CEST 2012


On Mon, Oct 15, 2012 at 3:37 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>>
>> But, as Christian Tismer wrote, we need to have some kind of idea of
>> what the primitives are that we want to support.
>
>
> Well, I was just responding to your asking what the yield-from
> equivalent would be to the corresponding thing using Futures.
> I assumed from the fact that you asked that it was something
> Futures-using people like to do a lot, so it would be worth
> putting into a library.
>
> There may be other ways to approach it, though. Suppose we
> had a primitive that just waits for a single task to finish
> and returns its value. Then we could do this:
>
>    def par(*tasks):
>      for task in tasks:
>         scheduler.schedule(task)
>      return [yield from scheduler.wait_for(task) for task in tasks]
>
> That's straightforward enough that maybe it doesn't even need
> to be a library function, just a well-known pattern.

The more I follow this thread the less I understand the point of
introducing a new use for yield-from in this discussion.

All of this extra work trying to figure out how to make yield-from work,
given its existing 3.3 semantics, could be avoided if we simply allowed
yielding the tasks directly and treated them like any other async
operation.

In the original message, yield-from seemed to be taken as a given; no
justification was offered beyond "so you have to do this", but I don't
see that you do.

If you allow yielding tasks, then waiting on multiple tasks together
becomes trivial: just yield a tuple of them. In fact, I think we should say
that yielding any tuple of async operations, whatever those objects actually
end up being, should wait for all of them.

For example, we might want to wait on both an HTTP request operation,
implemented as a task (a generator), and a cache lookup:

def handle_or_cached(request):
    api_resp, cache_resp = yield request(API_ENDPOINT), cache.get(KEY)
    if cache_resp:
        return cache_resp
    return render(api_resp)

Or we could provide wrappers to control the behavior of multiple-wait:

def handle_or_cached(request):
    api_resp, cache_resp = yield first(request(API_ENDPOINT), cache.get(KEY))
    if cache_resp:
        return cache_resp
    return render(api_resp)
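
To make the idea concrete, here is a minimal sketch of what a scheduler
might do with these yields. None of this is part of any concrete proposal
in this thread; run(), run_first(), the first wrapper and the fake tasks
below are all invented for illustration, and a real scheduler would
interleave on actual I/O rather than stepping generators synchronously.

class first:
    """Marker wrapper: wait only until the first wrapped task finishes."""
    def __init__(self, *tasks):
        self.tasks = tasks

def run(task, value=None):
    """Drive one generator-based task to completion and return its value."""
    while True:
        try:
            yielded = task.send(value)
        except StopIteration as stop:
            return stop.value
        if isinstance(yielded, tuple):
            # A yielded tuple means "wait for all"; results come back
            # positionally, as in the first handle_or_cached() example.
            value = tuple(run(sub) for sub in yielded)
        elif isinstance(yielded, first):
            value = run_first(yielded.tasks)
        else:
            # A single yielded task is just run to completion.
            value = run(yielded)

def run_first(tasks):
    """Step the competing tasks round-robin; stop when one finishes.

    Returns a tuple with the winner's result in its position and None
    everywhere else, which is what the second handle_or_cached() unpacks.
    """
    results = [None] * len(tasks)
    pending = [None] * len(tasks)   # value to send into each task next
    while True:
        for i, task in enumerate(tasks):
            try:
                yielded = task.send(pending[i])
            except StopIteration as stop:
                results[i] = stop.value
                return tuple(results)
            # Sub-operations of a competing task are run eagerly here to
            # keep the sketch short, at the cost of fairness.
            if isinstance(yielded, tuple):
                pending[i] = tuple(run(sub) for sub in yielded)
            else:
                pending[i] = run(yielded)

# Fake tasks standing in for real I/O, purely so the sketch is runnable.

def fake_pause():
    return None
    yield                       # unreachable; makes this a generator

def fake_api_call():
    yield fake_pause()          # pretend to wait on the network
    return "api result"

def fake_cache_get():
    return "cached!"            # resolves immediately
    yield                       # unreachable; makes this a generator

def render(resp):
    return "rendered: %s" % resp

def handle_or_cached():
    api_resp, cache_resp = yield first(fake_api_call(), fake_cache_get())
    if cache_resp:
        return cache_resp
    return render(api_resp)

print(run(handle_or_cached()))   # -> cached!

The point is only that the generator side stays exactly as written above;
all of the policy (wait for all, wait for the first, etc.) lives in the
scheduler or in small wrapper objects like first().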


>> Maybe you meant condition variable? It looks like threading.Condition
>> with notify_all().
>
>
> Something like that -- the terminology probably varies a bit
> from one library to another. The basic concept is "set of
> tasks waiting for some condition to become true".
>
>
>> Anyway, I agree we need some primitives like these, but I'm not sure
>> how to choose the set of essentials.
>
>
> I think that most, maybe all, of the standard synchronisation
> mechanisms, like mutexes and semaphores, can be built out of the
> primitives I've already introduced -- essentially just block()
> and yield. So anything of this kind that we provide will be more
> in the way of convenience features than essential primitives.
>
> --
> Greg
>



-- 
Read my blog! I depend on your acceptance of my opinion! I am interesting!
http://techblog.ironfroggy.com/
Follow me if you're into that sort of thing: http://www.twitter.com/ironfroggy


