Help with coroutine-based state machines?
tjreedy at udel.edu
Tue Jun 3 20:48:22 CEST 2003
"Alan Kennedy" <alanmk at hotmail.com> wrote in message
news:3EDC69DF.A46A97D4 at hotmail.com...
> Thanks to your clear examples, I now picture coroutines in the
> following way (which I hope is correct :-).
Some further comments, based on my understandings...
> 2. The only method by which the "traditional" function call can
> invoke another "traditional" function/method call is to call it "recursively",
'Recursive' usually refers to a function calling itself, either
directly or via another function. Using 'recursive' for what are
normally considered non-recursive calls that create a call chain in
which each function is unique is a somewhat confusing expansion of its
meaning.
> thus causing the construction of another stack frame.
> There is no mechanism for it to "substitute" the stack frame of the
> called function "in place of" its own stack frame. (Although I
> believe that Stackless can do this, after much rewriting and
> reinvention of truth :-).
> 3. Because all "traditional" function/method calls, involve the
> creation and eventual destruction of a stack frame, I label the
> space in which they operate "Linear stack space".
Generators also involve the creation and eventual destruction of a
stack frame.
> 4. There is another space, which I call "Generator space".
What specifically do you consider to be part of this 'space'?
Candidates include generator functions, the generator objects they
produce (which follow the iterator protocol by having .__iter__ and
.next methods), the .next methods themselves, and the frames
associated with activated .next methods.
> When a call is made into "Generator space",
This implies that it contains the .next methods ...
> a stack frame is not constructed: at least not every time.
One is constructed on the first call to a generator .next() method.
On this call, control enters at the top like any other function. On
subsequent calls ...
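A small sketch of this (in today's Python the method is spelled
__next__ and is usually invoked via the builtin next(); the .next()
of this thread is the Python 2 spelling):

```python
def gen():
    trace.append("top")        # runs only when next() is first called
    yield 1
    trace.append("resumed")    # runs on the second call, just after the yield
    yield 2

trace = []
g = gen()                      # creates the generator; no body code runs yet
assert trace == []
assert next(g) == 1            # first call: frame constructed, enter at the top
assert trace == ["top"]
assert next(g) == 2            # later calls: resume where the yield left off
assert trace == ["top", "resumed"]
```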
> Instead, a resumable and persistent stack frame is "resumed":
> this "resumable stack frame" was created once, sometime in the past:
Very specifically, on the first call
> it is termed, in current python terminology, the generator-iterator.
Calling a generator function returns a generator object that is not
callable but which has a .next() method which is. The frame is
associated with the .next method.
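Concretely (a sketch; the gi_frame attribute used here is a CPython
implementation detail, and the method is __next__ in modern Python):

```python
def squares(n):
    for i in range(n):
        yield i * i

g = squares(3)                 # returns a generator object...
assert not callable(g)         # ...which is not itself callable
assert callable(g.__next__)    # but its __next__ (".next" in 2.x) method is
assert g.gi_frame is not None  # the persistent frame lives on the object
assert g.__next__() == 0
assert next(g) == 1            # next(g) simply invokes the same method
```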
> 5. When the code in "Generator space" (i.e. the generator-iterator)
> is called, it resumes immediately after where it last 'yield'ed, and
> continues until it 'yield's again, or returns or excepts. When it
> 'yield's, two things happen.
> A: The resumable stack frame is "frozen",
> so that it can later be resumed again, and
This is more something that does not happen - the usual destruction of
the stack frame. 'Freezing' implies that something special is done
that takes time and that would have to be reversed by 'thawing'.
Generators are efficient because yielding leaves the frame as it is,
so execution can be resumed exactly where it left off.
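For instance (a minimal sketch): the generator's locals simply stay
put in the suspended frame between calls.

```python
def counter():
    total = 0                  # ordinary local, lives in the generator's frame
    while True:
        total += 1
        yield total

c = counter()
assert next(c) == 1
assert next(c) == 2            # 'total' survived the yield: nothing was
assert next(c) == 3            # frozen or thawed, the frame just sat there
```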
> B: A python object, which may be None, is transferred back to the caller.
As with normal functions.
> 6. For any call from "Linear Stack space" into "Generator space",
> there is a crossover point, P. When the called code in "Generator
> space" finishes executing, it can only enter back into "Linear stack
> space" through point P: it cannot exit through any other point.
> (According to the current python model).
I'm not sure that this is a helpful way to look at the process. When
a gen-it.next frame is created or reactivated, it is tacked on to the
end of the linear frame stack just like any other execution frame. If
it calls another function, yet another frame is added. When it yields
or returns, it is removed from the chain, just like any other
function, and control passes back to the calling function, just like
any other function. The only differences are that 'removal' !=
'deletion' (because the gen object holds a reference to the frame, so
ref count does *not* go to 0) and 're-addition' != 'set execution
pointer to beginning of associated code-object'.
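The 'removal' != 'deletion' point can be observed directly in CPython,
where the generator object exposes its frame as gi_frame (an
implementation detail, not part of the language definition):

```python
def gen():
    yield 1
    yield 2

g = gen()
frame = g.gi_frame            # the generator object holds a reference...
next(g)
assert g.gi_frame is frame    # ...so yielding removes the frame from the
next(g)                       # call chain without deleting it
assert g.gi_frame is frame
try:
    next(g)                   # the generator returns here
except StopIteration:
    pass
assert g.gi_frame is None     # only on return is the frame finally dropped
```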
> 7. If any calls are made from "Generator space" into "Linear stack
> space", this leads to the creation of a stack frame, which must be
> destroyed. If the called function/method in "Linear stack space"
> calls back into "Generator space", and the "Linear space" function is
> not allowed to exit naturally, this is essentially unbounded recursion,
> which will lead eventually to a blown stack.
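The quoted runaway can be sketched with a pair of hypothetical
ping/pong functions (each round trip adds frames that are never
allowed to unwind):

```python
def ping():                   # "Generator space": yields a value computed
    yield pong()              # by a call into "Linear stack space"

def pong():                   # which immediately re-enters "Generator
    return next(ping())       # space" via a fresh generator, and so on

blown = False
try:
    next(ping())              # unbounded mutual re-entry...
except RecursionError:        # ...until the stack blows
    blown = True
assert blown
```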
To me, this is unnecessarily confusing. For some reason, you see and
present generators as more special than they are. They are resumable
functions, and they are resumable mostly because yield does not
destroy the execution frame. I don't believe they do anything that
cannot be done with class methods. They are just easier to write
(usually) and faster to run.
Indeed, I think it possibly useful for understanding generators to
think of a def statement with yield in the body as being something
like a syntactically condensed version of a class statement iterator
template with standard defs for __init__ and __iter__ and a
class-specific .next method that is filled in from the body of the
def. Calling a generator function returns an initialized instance
with the args stored away for later use by the .next method. Because
the whole package is a fairly common use case, it can be and now has
been optimized so the programmer need only write what is specific and
the interpreter adds less overhead than normal.
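As a sketch of that correspondence (a hypothetical countdown example;
the class spells the method __next__, which Python 2 called .next):

```python
def countdown(n):             # generator-function form
    while n > 0:
        yield n
        n -= 1

class Countdown:              # the hand-expanded "template" form
    def __init__(self, n):
        self.n = n            # the arg, stored away for later use
    def __iter__(self):
        return self
    def __next__(self):       # ".next" in the Python 2 of this thread
        if self.n <= 0:
            raise StopIteration
        value, self.n = self.n, self.n - 1
        return value

assert list(countdown(3)) == [3, 2, 1]
assert list(Countdown(3)) == [3, 2, 1]
```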
Terry J. Reedy