[Python-ideas] Async API
Yury Selivanov
yselivanov.ml at gmail.com
Thu Oct 25 20:39:18 CEST 2012
Guido,
Thank you for such a detailed and deep response. Lots of good thoughts
to digest.
One idea: the scope of the problem is enormous. It may take months or
years to converge on all the ideas and thoughts just by exchanging them
over the mailing list, without a concrete artifact to discuss.
How about you/we create a repository with a draft implementation of a
scheduler/IO loop/coroutine engine, and we simply start tweaking and
discussing that particular design? That way people will see where
to start the discussion and what's been done, and some will even participate.
The goal is not to write production-quality software, but rather to
have a common place to discuss, try things, benchmark, etc. I'm not sure,
but a place like Bitbucket, where you can have a wiki, issues, and
the actual code, may be better suited for this than a mailing list.
I also think we need to move concurrency-related discussions to a
separate mailing list, as everything else on python-ideas is getting
lost in them now.
On 2012-10-25, at 1:58 PM, Guido van Rossum <guido at python.org> wrote:
[...]
>> - And what's your opinion on writing a PEP about making it possible
>> to pass a custom socket-factory to stdlib objects?
>
> That sounds like it might be jumping to a specific solution. I agree
> that the stdlib often, unfortunately, couples classes too tightly,
> where a class that needs an instance of another class just
> instantiates that other class rather than having an instance passed in
> (at least as an option). We're doing better with files these days --
> most APIs (that I can think of) that work with streams let you pass
> one in. So maybe you're on to something. Perhaps, as a step towards
> the exploration of this PEP, you could come up with a concrete list of
> modules and classes (or other API elements) that you think would
> benefit from being able to pass in a socket? Please start another
> thread -- python-ideas is fine. I will read it.
OK, I will, in a week or two. I need some time for research.
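Just to make the idea concrete, something along these lines is what I
have in mind -- purely hypothetical, not how any current stdlib module
actually works:

    import socket

    class SMTPClient:
        # Hypothetical: instead of creating its own socket, the class
        # accepts a factory, so tests or an async framework can supply
        # a different kind of socket.
        def __init__(self, host, port, sock_factory=socket.create_connection):
            self.sock = sock_factory((host, port))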
[...]
> - For the task scheduler I am piling all my hopes on PEP-380, i.e.
> yield from. I have not found a single thing that is harder to do using
> this style than using the PEP-342 yield <future> style, and I really
> don't like mixing the two up (despite what Steve Dower says :-). But I
> don't want the event loop interface to know about this at all --
> however the scheduler has to know about the event loop (at least its
> interface). I am currently refactoring my ideas in this area; I think
> I'll end up with a Task object that smells a bit like a Future, but
> represents a whole stack of generator invocations linked via
> yield-from, and which allows suspension of the entire stack at once;
> user code only needs to use Tasks when it wants to schedule multiple
> activities concurrently, not when it just wants to be able to yield.
> (This may be the core insight in favor of PEP 380.)
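If I understand the Task idea correctly, a rough sketch (the names and
the resume-via-callback part are just my guess at what you mean) might
look like:

    from concurrent.futures import Future

    class Task:
        """Drives a whole stack of generators linked via 'yield from'."""

        def __init__(self, gen):
            self.gen = gen          # the outermost generator
            self.step()

        def step(self, value=None):
            try:
                # Whatever is yielded at the bottom of the stack bubbles
                # up through the yield-from chain, e.g. a Future created
                # by the I/O layer.
                fut = self.gen.send(value)
            except StopIteration:
                return
            # Suspend the whole stack; resume it once the future completes.
            fut.add_done_callback(lambda f: self.step(f.result()))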
The only problem I have with PEP-380 is that it's not entirely clear to
me when you should use 'yield' and when 'yield from' (please correct me
if I am wrong). I'll try to demonstrate it by example:
    class Socket:
        def sendall(self, payload):
            f = Future()
            IOLoop.sendall(payload, future=f)
            return f

    class SMTP:
        def send(self, s):
            ...
            # yield the returned future to the scheduler
            yield self.sock.sendall(s)
            ...

    # And later:
    s = SMTP()
    yield from s.send('spam')
Is this (roughly) how you want it all to look? I.e. using 'yield' to
send a future/task to the scheduler, and 'yield from' to delegate?
If I guessed correctly, and that's how you envision it, I have a question:
what if you decide to refactor 'Socket.sendall' to be a coroutine?
In that case you'd want users to call it with 'yield from Socket.sendall(...)',
not 'yield Socket.sendall(...)'.
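To illustrate what I mean (same made-up Future/IOLoop names as above,
and the chunking is just an arbitrary reason for the refactoring):

    class Socket:
        def sendall(self, payload):
            # Refactored into a coroutine, e.g. to send the payload in
            # fixed-size pieces; each piece still hands a future to the
            # scheduler via a plain 'yield'.
            for i in range(0, len(payload), 4096):
                f = Future()
                IOLoop.sendall(payload[i:i + 4096], future=f)
                yield f

    class SMTP:
        def send(self, s):
            ...
            # every call site now has to change from 'yield' to 'yield from'
            yield from self.sock.sendall(s)
            ...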
Thank you,
Yury