[Python-ideas] Async API: some more code to review

Steve Dower Steve.Dower at microsoft.com
Tue Oct 30 03:06:37 CET 2012

Possibly I should have selected a different code name, now I come to think of it, but we came up with such similar code that I don't think it'll stay separate for too long.

From: Python-ideas [python-ideas-bounces+steve.dower=microsoft.com at python.org] on behalf of Steve Dower [Steve.Dower at microsoft.com]
Sent: Monday, October 29, 2012 6:40 PM
To: python-ideas at python.org
Subject: [Python-ideas] Async API: some more code to review

To save people scrolling to get to the interesting parts, I'll lead with the links:

Detailed write-up: https://bitbucket.org/stevedower/tulip/wiki/Proposal

Source code: https://bitbucket.org/stevedower/tulip/src

(Yes, I renamed my repo after the code name was selected. That would have been far too much of a coincidence.)

Practically all of the details are in the write-up linked first, so anything that's not is either something I didn't think of or something I decided is unimportant right now (for example, the optimal way to wait for ten thousand sockets simultaneously on every different platform).

There's a reimplemented Future class in the code which is not essential, but it is drastically simplified from concurrent.futures.Future (CFF). It can't be directly replaced by CFF, but only because CFF requires state management ("set_running_or_notify_cancel") that the rest of the implementation does not perform. CFF also includes cancellation, for which I've proposed a different mechanism.
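To make the comparison concrete, here is a minimal sketch of what such a stripped-down future might look like; this is my own illustrative reconstruction, not the class from the linked repository, which may differ in detail. The point is that only completion state, a result/exception slot, and done-callbacks are kept, with none of CFF's running/cancelled state machine:

```python
class Future:
    """Hypothetical simplified future: result, exception, done flag, callbacks."""

    def __init__(self):
        self._result = None
        self._exception = None
        self._done = False
        self._callbacks = []

    def done(self):
        return self._done

    def set_result(self, value):
        self._result = value
        self._finish()

    def set_exception(self, exc):
        self._exception = exc
        self._finish()

    def _finish(self):
        self._done = True
        for cb in self._callbacks:
            cb(self)
        self._callbacks = []

    def add_done_callback(self, cb):
        # Run immediately if already completed, otherwise defer until done.
        if self._done:
            cb(self)
        else:
            self._callbacks.append(cb)

    def result(self):
        if not self._done:
            raise RuntimeError('result is not ready')
        if self._exception is not None:
            raise self._exception
        return self._result
```

Note there is no set_running_or_notify_cancel and no cancel(): a scheduler that owns all transitions simply does not need them.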

For the sake of a quick example, I've modified Guido's main.doit function (http://code.google.com/p/tulip/source/browse/main.py) to how it could be written with my proposal (apologies if I've butchered it, but I think it should behave the same):


def doit():
    TIMEOUT = 2
    cs = CancellationSource()

    task1 = urlfetch('localhost', 8080, path='/', cancel_source=cs)

    task2 = urlfetch('', 8080, path='/home', cancel_source=cs)

    task3 = urlfetch('python.org', 80, path='/', cancel_source=cs)

    task4 = urlfetch('xkcd.com', ssl=True, path='/', af=socket.AF_INET, cancel_source=cs)

    tasks = {task1, task2, task3, task4}

    ## for t in tasks: t.start()    # not needed - tasks start as soon as urlfetch is called

    yield delay(0.2)                # I believe this is equivalent to scheduling.with_timeout(0.2, ...)?

    winners = [t.result() for t in tasks if t.done()]
    print('And the winners are:', winners)

    results = []                    # This 'wait all' loop could easily be a helper function
    for t in tasks:                 # Unfortunately, [(yield t) for t in tasks] does not work :(
        results.append((yield t))
    print('And the players were:', results)
    return results
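As the comment above says, the 'wait all' loop is easy to factor out into a helper. Here is a hypothetical sketch of such a helper, driven by hand to show the generator protocol the scheduler would use (the real scheduler in the repo does this for you; `wait_all` and the fake results are my own names for illustration):

```python
def wait_all(tasks):
    """Yield each task to the scheduler in turn and collect its result."""
    results = []
    for t in tasks:
        # [(yield t) for t in tasks] does not work, so an explicit loop it is.
        results.append((yield t))
    return results

# Drive the generator manually, standing in for the scheduler: each yielded
# task is "completed" by sending a fake result back in.
g = wait_all(['task-a', 'task-b'])
pending = next(g)                          # first task handed to the scheduler
outcome = None
try:
    while True:
        pending = g.send(pending.upper())  # send back a pretend result
except StopIteration as stop:
    outcome = stop.value                   # the helper's return value
print(outcome)  # ['TASK-A', 'TASK-B']
```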

This is untested code, and has a few differences. I don't have task names, so it will print the returned value from urlfetch (a tuple of (host, port, path, status, len(data), time_taken)). The cancellation approach is quite different, but IMO far more likely to avoid the finally-related issues discussed in other threads.
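For readers who have not opened the write-up yet, the shape of the idea is roughly this (an illustrative sketch only; the names and exact semantics are defined in the linked proposal, and my `register` method here is a guess at the surface): cancellation is requested on a shared source object that tasks were given at creation, rather than by injecting an exception into a task at an arbitrary yield point.

```python
class CancellationSource:
    """Hypothetical sketch: tasks observe a shared cancellation flag."""

    def __init__(self):
        self._cancelled = False
        self._callbacks = []

    @property
    def cancelled(self):
        return self._cancelled

    def cancel(self):
        # Flip the flag and notify anyone who registered cleanup.
        self._cancelled = True
        for cb in self._callbacks:
            cb()
        self._callbacks = []

    def register(self, callback):
        # Run immediately if cancellation was already requested.
        if self._cancelled:
            callback()
        else:
            self._callbacks.append(callback)
```

Because a task only observes the flag at points of its own choosing, a `finally` block is never interrupted by an exception materialising out of nowhere, which is the failure mode discussed in the other threads.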

However, I want to emphasise that unless you are already familiar with this exact style, it is near impossible to guess exactly what is going on from this little sample. Please read the write-up before assuming what is or is not possible with this approach.

