[Python-ideas] Cofunctions - Getting away from the iterator protocol

Ron Adam ron3200 at gmail.com
Thu Nov 3 07:15:30 CET 2011


On Thu, 2011-11-03 at 14:47 +1300, Greg Ewing wrote:
> On 03/11/11 05:21, Ron Adam wrote:
> 
> > Would it be possible to rewrite the 'yield' internals so they work in
> > the following way...
> >
> >       # a = yield b
> >       try:
> >            raise SuspendException(b, _self=_self)
> >       except ContinueException as exc:
> >            a = exc.args
> 
> I've just been thinking about something like that, while pondering
> whether there is a middle ground somewhere between the yield-from
> mechanism and a completely new coroutine protocol.

I was thinking that if we could substitute an alternative spelling like
that, it would give us a bit more control over how to interact with
other outer frameworks.  If the internals can be done that way, it may
open up more options in Python generators.  So instead of a completely
new coroutine protocol, we would have better tools for others to create
their own frameworks and protocols.

> The problem I had with building all of it on yield-from was that
> there was no way to distinguish a 'yield' being used for the purpose
> of suspending a coroutine from one being used to return a value
> to a next() call.

Right.  The obvious way is just to add a second of everything.
   next2()
   .send2()
   .throw2()
   yield2

I don't think that is the best way.
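To make the ambiguity Greg describes concrete, here's a minimal sketch
(the generator and both drivers are my own illustration, not part of any
proposal):

```python
def task():
    # The same 'yield' plays two roles: handing a value out to a
    # consumer, and suspending so a scheduler can send() one back in.
    data = yield "need input"      # suspension point? or produced value?
    yield "got " + data

# A scheduler driving it with send() works fine:
t = task()
assert next(t) == "need input"     # prime to the first yield
assert t.send("x") == "got x"

# But a plain consumer iterating it cannot tell that the first
# yield wanted a value sent back, and the generator breaks:
try:
    list(task())                   # the second next() sends None
except TypeError:
    pass                           # "got " + None fails inside the frame
```

Nothing in the frame itself distinguishes the two uses, which is why a
single `yield` can't serve both protocols at once.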


> However, if something other than 'yield' is used for coroutine
> suspension -- such as a 'coyield' keyword or coyield() function --
> then I think this problem becomes solvable. In a cogenerator
> (i.e. a generator running in coroutine mode), 'coyield' would
> do what 'yield' does in normal mode (simply suspend the frame),
> and 'yield(value)' would raise StopIteration(value).

Well, it sounds reasonable, but how would that actually work?  What if
the coroutine is paused at a coyield, and you need to do a next() rather
than a conext()?  And likewise in the case where it's the other way
around.

> (If the latter seems unintuitive, I sympathise. It arises because
> we're effectively making a cocall to the __next__ method of the
> generator, and in the yield-from universe, the way a cocall
> returns a value is by raising StopIteration.)

> But now we need a different way for the cogenerator to signal
> that it has finished iterating! I think what will need to happen
> is that a cogenerator raises CoStopIteration instead of
> StopIteration when it falls off the end, and the cocaller of
> the cogenerator catches that and turns it into a normal
> StopIteration.
> 
> Confused enough yet? I had better finish my cosandwich and get
> some more cocoffee before trying to think about this any more...

And that is the whole problem... trying to make this all un_coconfusing
to the average python programmer.  If it's coconfusing to us, they don't
have a chance.  ;-)
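For what it's worth, here's a rough pure-Python sketch of the
translation Greg describes.  CoStopIteration and the driver are
hypothetical names of mine, and a plain generator stands in for a
cogenerator:

```python
class CoStopIteration(Exception):
    """Hypothetical: what a cogenerator would raise on falling off the end."""

def cogen_steps():
    # Stand-in for a cogenerator: signals exhaustion with the
    # hypothetical CoStopIteration instead of returning normally.
    yield 1
    yield 2
    raise CoStopIteration

def codrive(gen):
    # The cocaller catches CoStopIteration and turns it into an
    # ordinary StopIteration, so outer iteration machinery still works.
    while True:
        try:
            yield next(gen)
        except CoStopIteration:
            return   # generator return -> a normal StopIteration

assert list(codrive(cogen_steps())) == [1, 2]
```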

Hmm... I have a craving for some hot co_co now.

I've been poking around in genobject.c, frameobject.c, and ceval.c, to
try to get a handle on just how it all fits together.

One of the odd things is that a throw() done before a generator is
started raises the exception at the first line, where the generator has
no chance to act on it.  And it also doesn't propagate back out.  So the
first thing I'm trying (to help me learn the C code better/again) is to
see if I can get it to ignore an exception thrown in at that state.  And
then see if I can make that specific to just g.throw(ContinueException).

That way I don't have to pre-start the generators if I'm using
throw(ContinueException) as my scheduler interface.
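Here's roughly what that scheduler interface looks like with today's
generators, and why the pre-start step is needed (ContinueException and
the worker are my own sketch, matching the spelling from the earlier
mail):

```python
class ContinueException(Exception):
    pass

def worker():
    total = 0
    while True:
        try:
            yield total                  # suspension point
        except ContinueException as exc:
            total += exc.args[0]

g = worker()
next(g)                                  # pre-start: run to the first yield
assert g.throw(ContinueException(5)) == 5
assert g.throw(ContinueException(3)) == 8

# Without the pre-start, the exception is raised at the function's
# first line, before any try/except is active, so the generator
# never gets a chance to handle it:
g2 = worker()
try:
    g2.throw(ContinueException(1))
except ContinueException:
    pass
```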

It's a start. ;-)

Cheers,
   Ron




