[Python-Dev] PEP 380 (yield from a subgenerator) comments
P.J. Eby
pje at telecommunity.com
Wed Mar 25 15:26:27 CET 2009
At 06:03 PM 3/25/2009 +1200, Greg Ewing wrote:
>I wanted a way of writing suspendable functions that
>can call each other easily. (You may remember I
>originally wanted to call it "call".) Then I noticed
>that it would also happen to provide the functionality
>of earlier "yield from" suggestions, so I adopted that
>name.
I still don't see what you gain from making this syntax, vs. putting
something like this in the stdlib (rough sketch):
import sys

class Task(object):

    def __init__(self, geniter):
        self.stack = [geniter]

    def __iter__(self):
        return self

    def send(self, value=None):
        if not self.stack:
            raise RuntimeError("Can't resume completed task")
        return self._step(value)

    next = send

    def _step(self, value=None, exc_info=()):
        while self.stack:
            try:
                it = self.stack[-1]
                if exc_info:
                    try:
                        rv = it.throw(*exc_info)
                    finally:
                        exc_info = ()
                elif value is not None:
                    rv = it.send(value)
                else:
                    rv = it.next()
            except:
                value = None
                exc_info = sys.exc_info()
                if exc_info[0] is StopIteration:
                    exc_info = ()   # not really an error
                    self.pop()
            else:
                value, exc_info = yield_to(rv, self)
        else:
            if exc_info:
                raise exc_info[0], exc_info[1], exc_info[2]
            else:
                return value

    def throw(self, *exc_info):
        if not self.stack:
            raise RuntimeError("Can't resume completed task")
        return self._step(None, exc_info)

    def push(self, geniter):
        self.stack.append(geniter)
        return None, ()

    def pop(self, value=None):
        if self.stack:
            it = self.stack.pop()
            if hasattr(it, 'close'):
                try:
                    it.close()
                except:
                    return None, sys.exc_info()
        return value, ()

    @classmethod
    def factory(cls, func):
        def decorated(*args, **kw):
            return cls(func(*args, **kw))
        return decorated


def yield_to(rv, task):
    # This could/should be a generic function, to allow yielding to
    # deferreds, sockets, timers, and other custom objects
    if hasattr(rv, 'next'):
        return task.push(rv)
    elif isinstance(rv, Return):
        return task.pop(rv.value)
    else:
        return rv, ()


class Return(object):
    def __init__(self, value=None):
        self.value = value


@Task.factory
def sample_task(arg1, another_arg):
    # blah blah
    something = (yield subtask(...))
    yield Return(result)

def subtask(...):
    ...
    yield Return(myvalue)
The trampoline (the _step() method) handles the co-operative aspects,
and modifying the yield_to() function lets you define how yielded
values are processed. By default, they're sent back into the generator
that yields them, but yielding a Return() terminates the generator and
passes the value up to the calling generator. Yielding another
generator, on the other hand, "calls" that generator within the
current task, and the same rules apply.
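To make that control flow concrete, here is a stripped-down, runnable
Python 3 rendition of the same trampoline idea (run_task and the toy
tasks below are my own illustrations, not part of the sketch above,
which omits the error-propagation and close() handling):

```python
class Return:
    """Sentinel a generator yields to return a value to its caller."""
    def __init__(self, value=None):
        self.value = value

def run_task(gen):
    """Minimal trampoline: run a stack of generators to completion.

    Yielding a generator pushes it ("calls" it); yielding a Return
    pops the current generator and sends the value to its caller;
    any other value is echoed back into the yielding generator.
    """
    stack = [gen]
    value = None
    while stack:
        try:
            rv = stack[-1].send(value)
        except StopIteration:
            stack.pop()          # generator finished without a Return
            value = None
            continue
        if hasattr(rv, 'send'):      # another generator: "call" it
            stack.append(rv)
            value = None
        elif isinstance(rv, Return): # return to the calling generator
            stack.pop()
            value = rv.value
        else:                        # plain value: echo it back in
            value = rv
    return value

def subtask(n):
    yield Return(n * 2)

def sample_task(n):
    doubled = yield subtask(n)   # "calls" subtask, receives its value
    yield Return(doubled + 1)

result = run_task(sample_task(20))   # 20 * 2 + 1 == 41
```

The point is that the suspend/resume plumbing lives entirely in the
trampoline loop; the task generators themselves read like ordinary
calling code.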
Is there some reason why this won't do what you want, and can't be
modified to do so? If so, that should be part of the PEP, as IMO it
otherwise lacks motivation for a language feature versus, say, a
stdlib module. If 'yield_to' is a generic function, or at least
supports registration of some kind, a feature like this would be
interoperable with a wide variety of frameworks -- you could register
deferreds, delayed calls, and IO objects from Twisted, for example.
So it's not like the feature would be creating an entire new framework
of its own. Rather, it'd be a front-end to whatever framework (or no
framework) you're using.
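This post predates functools.singledispatch, but that is exactly the
kind of generic-function registry meant here. A sketch of how the
registration idea could look with it (the Sleep type and its handler
are invented for illustration, not any framework's real API):

```python
from functools import singledispatch

@singledispatch
def yield_to(rv, task):
    # Default: echo a plain yielded value back into the generator.
    return rv, ()

class Sleep:
    """Stand-in for a framework object (a deferred, a timer, ...)."""
    def __init__(self, seconds):
        self.seconds = seconds

@yield_to.register
def _(rv: Sleep, task):
    # A real scheduler would park the task until the timer fires;
    # this toy handler just resumes it immediately with the duration.
    return rv.seconds, ()
```

Each framework registers handlers for the object types its code
yields, and the one trampoline dispatches on whatever comes back --
no new framework required.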