[Python-ideas] Pause (sort of a 'deep yield'?)

Adam Atlas adam at atlas.st
Mon Nov 12 05:52:43 CET 2007


Generator-based coroutines are great, but I've thought of some
interesting cases where it would help to be able to yield to an
outer frame (beyond the immediate caller) while still being able to
resume. I'm thinking this would make the most sense as a kind of
exception with an added resume() method, which would resume execution
at the point where the exception was raised. (These exceptions would
also have a throw() method for continuing execution but raising an
exception at that point, and a close() method, as with generators in
Python >= 2.5.)
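
In other words, the interface would look roughly like this (just a
sketch of the proposal; nothing here can actually run today, and the
default arguments are only what I've been assuming):

class PauseException(Exception):
     # .value: whatever was handed to `pause`, readable by the catcher
     def resume(self, value=None):
          # Continue at the pause point; `value` becomes the result of
          # the `pause` expression there.
          pass
     def throw(self, exc):
          # Continue at the pause point, but raise `exc` there.
          pass
     def close(self):
          # Stop the paused frame, as with generators in Python >= 2.5.
          pass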

Here's an example to demonstrate what I'm talking about:

def a():
     print 'blah'
     p = pause 7  # like using `yield` as an expression,
                  # but it raises "PauseException" (or whatever)
     print p
     return (p, 123)

def b():
     return a()

try:
     print b()
except PauseException, e:
     print e.value
     e.resume(3)

#prints:
#  blah
#  7
#  3
#  (3, 123)

Normally you'd subclass PauseException so you can catch specific known  
instances of pausing in your application. If no outer scope can handle  
a pause, then the program should exit as with any other exception.
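
For comparison, here is a minimal sketch of what the same control flow
requires today with plain PEP 342 generators (Python 2.5): every
intermediate frame -- b() below -- has to cooperate by forwarding the
yields and sends by hand, which is exactly the boilerplate a deep
pause would remove. The forwarding loop is illustrative only.

def a():
     print 'blah'
     p = yield 7             # stands in for `p = pause 7`
     print p
     yield (p, 123)          # stands in for `return (p, 123)`

def b():
     gen = a()
     value = gen.next()
     while True:
          sent = yield value          # forward the "pause" outward
          value = gen.send(sent)      # forward the "resume" inward

gen = b()
print gen.next()    # prints: blah, then 7
print gen.send(3)   # prints: 3, then (3, 123)

With `pause`, b() could stay a plain function, as in the example above.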

For more practical use cases, I'm mainly thinking about asynchronous
programming, things like Twisted; I see a lot of interesting
possibilities there. But here's a simpler example... Suppose we have
WSGI 2.0 and, as expected, it does away with start_response() and the
resulting write() callable. And suppose we want to write an adaptor
that lets WSGI 1.0 applications be used as WSGI 2.0 applications. We
could do this by creating a write() which pauses and sends its value
to an outer wrapper, which interleaves any output written via write()
with the WSGI 1.0 app's returned app_iter into a single generator. It
would go something like this:

class StartRespPause (PauseException): pass
class WritePause (PauseException): pass
class wsgi_adaptor (object):
     def __init__(self, app):
         self.app = app

     def _write(self, data):
         pause WritePause(data)
          # Interrupts this frame and returns control to the first
          # outer frame that catches WritePause.

          # If the `pause` statement/expression is given a PauseException
          # instance, it raises that; if it is given a PauseException
          # subclass, it raises that with None; if it gets another
          # value `v`, it raises PauseException(v).

     def _start_response(self, status, response_headers, exc_info=None):
         # [...irrelevant exc_info handling stuff here...]
         pause (status, response_headers)
         return self._write

     def _app_iter(self, environ):
         try:
             for v in self.app(environ, self._start_response):
                 yield v
         except WritePause, e:
             yield e.value
             e.resume()
              # This part of the syntax is perhaps a little troublesome --
              # the body of a `try` block might cause multiple pauses, so
              # an `except` block catching a PauseException subclass has
              # the possibility of running multiple times. This is the
              # correct behaviour, but it is somewhat counterintuitive
              # given the huge precedent for at most one `except` block
              # to execute, once, for a given `try` block. Perhaps there
              # could be some syntax other than `except`, but of course
              # we'd rather keep the number of reserved words down.

     def __call__(self, environ):
         # [...whatever other bridging is needed...]
         try:
              app_iter = self._app_iter(environ)
         except StartRespPause, e:
             status, response_headers = e.value
             e.resume()
         return (status, response_headers, app_iter)

Thinking about environments like Twisted, it seems to me that this
could make Deferreds/callbacks [almost?] entirely unnecessary. PEP 342
(Coroutines via Enhanced Generators) speaks of using "a simple
co-routine scheduler or 'trampoline function' [which] would let
coroutines 'call' each other without blocking -- a tremendous boon for
asynchronous applications", but I think pauses would simplify this
even further; they would allow these matters to be mostly invisible
outside the innermost potentially blocking functions. Basically, it
"would let coroutines 'call' each other without blocking", but now
without the quotes around the word 'call'. :)

The PEP gives the simple example of "data = (yield
nonblocking_read(my_socket, nbytes))", but with pauses we could forget
about yields -- we'd be able to program almost exactly as with
traditional blocking operations: "data = read(my_socket, nbytes)".
Only potentially blocking functions would have to be concerned with
pausing; read() would pause to an outer scheduler/trampoline/
Twisted-type reactor, which, when data was available, would resume the
paused read() function (giving it the data similarly to
generator.send()), which would then return the value to the calling
function exactly as a synchronous function would.
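
To make the contrast concrete, here is a toy version of that PEP 342
style (the request tuple, the fake socket name and the canned
responses are placeholders, not a real reactor). Note that the yield
has to appear at every call site inside the coroutine; with pauses,
only read() itself would need to know about the scheduler.

def handler(sock):
     # Today, every potentially blocking call site has to be marked
     # with `yield` so the trampoline can step in.
     data = yield ('read', sock, 5)
     print 'got', data

def trampoline(task, canned_responses):
     # Trivial scheduler: answer each yielded 'read' request from a
     # canned list, standing in for a reactor waiting on real sockets.
     value = None
     try:
          while True:
               request = task.send(value)       # e.g. ('read', sock, nbytes)
               value = canned_responses.pop(0)  # pretend the data just arrived
     except StopIteration:
          pass

trampoline(handler('my_socket'), ['hello'])   # prints: got hello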


