[Python-Dev] [PEP 3148] futures - execute computations asynchronously
pje at telecommunity.com
Sun Mar 7 04:56:57 CET 2010
At 01:10 PM 3/7/2010 +1100, Brian Quinlan wrote:
>On 7 Mar 2010, at 03:04, Phillip J. Eby wrote:
>>At 05:32 AM 3/6/2010, Brian Quinlan wrote:
>>>Using twisted (or any other asynchronous I/O framework) forces you to
>>>rewrite your I/O code. Futures do not.
>>Twisted's "Deferred" API has nothing to do with I/O.
>I see, you just mean the API and not the underlying model.
>We discussed the Deferred API on the stdlib-sig and I don't think that
>anyone expressed a preference for it over the one described in the PEP.
>Do you have any concrete criticism?
Of the PEP, yes, absolutely, and I've already stated much of it. My
quibbles are with the PEP *itself*, not so much the API or implementation.
I think that said API and implementation are fine, but FAR too
narrowly scoped to claim to be "futures" or "execute computations
asynchronously", as the PEP calls it. It's really just a nice task
queuing system.
Now, if the PEP were *scoped* as such, i.e., "hey, let's just have a
nice multithread/multiprocess task queuing implementation in the
stdlib", I would be +1. It's a handy utility to have.
But I think that the scope given by the PEP appears overly ambitious
compared to what is actually being delivered; this seems less of a
"futures API" and more like a couple of utility functions for waiting
on threads and processes.
To rise to the level of an API, it seems to me that it would need to
address interop with coroutines and async frameworks, where the idea
of "futures" seems much more relevant than simple
synchronous-but-parallel scripts. (It should also have better tools
for working with futures asynchronously, because, hey, it says right
there in the title, "execute computations asynchronously".)
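For comparison, the closest the proposed API comes to asynchronous use is the `add_done_callback` hook, which lets the caller react to completion rather than blocking in `result()`. A small sketch (the `threading.Event` handshake here is just one way to observe the callback from the main thread):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

done = threading.Event()
outcome = []

def on_done(future):
    # Invoked when the future completes, so the caller never has to
    # block in future.result() itself.
    outcome.append(future.result())
    done.set()

with ThreadPoolExecutor(max_workers=1) as executor:
    executor.submit(lambda: 21 * 2).add_done_callback(on_done)
    done.wait()

print(outcome)  # [42]
```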
Anyway, I'd like to see the answers to (at *least*) the following
issues fleshed out in the PEP, if you want it to really be a "futures
API", vs. "nice task queue in the stdlib":
* Compare/contrast alternatives now available
* Address the issue of competing event loops and sharing/passing
executors among code
* Either offer a way for executed code to re-enter its own executor
(e.g. via an optional parameter), or explain why this was considered
and rejected
* Address interoperability with coroutines and async frameworks, or
clearly explain why such is out of scope
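To illustrate the re-entry point above: absent any built-in mechanism, the obvious workaround is to pass the executor to the task explicitly so it can fan out sub-tasks. A hypothetical sketch (the `crawl`/`fetch` names are invented for illustration; note that nesting like this can deadlock if the pool is too small for the waiting parent tasks):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    return "data:" + url

def crawl(executor, urls):
    # The task receives its own executor as a parameter -- the manual
    # version of the re-entry facility the bullet list asks about.
    futures = [executor.submit(fetch, u) for u in urls]
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as ex:
    result = ex.submit(crawl, ex, ["a", "b"]).result()

print(result)  # ['data:a', 'data:b']
```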
(Personally, I think it would be better to just drop the ambitious
title and scope, and go for the "nice task queue" scope. I imagine,
too, that in that case Jean-Paul wouldn't need to worry about it
being raised as a future objection to Deferreds or some such getting
into the stdlib.)