Help with coroutine-based state machines?

Alan Kennedy alanmk at hotmail.com
Tue Jun 3 16:32:58 EDT 2003


[Terry]

> Some further comments, based on my understandings...

I completely agree with all that you said here. 

In defense of the original statements, I can only say that I
was trying to express the concepts in a way that made sense to me,
and that would hopefully make sense to others. A good example of
this is:

[Alan]
>> 2. The only method by which the "traditional" function call can
>> invoke another "traditional" function/method call is to call it
>> "recursively",

[Terry]
> 'Recursive' usually refers to a function calling itself, either
> directly or via another function.  Using 'recursive' for what are
> normally considered non-recursive calls that create a call chain in
> which each function is unique is a somewhat confusing expansion of its
> meaning.

That is why my original statements were liberally sprinkled with scare
quotes. I simply don't know how to concisely express the concept of a
function calling another function, resulting in the construction of a
new stack frame. I don't know why we don't have a simple word like
"recursion" to describe this. Indeed, the word "recursion" itself only
implies a function calling itself; there is no concept of stack frames
at all. But I definitely used it inappropriately.
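For what it's worth, the frame chain that such "traditional" calls build
up can be made visible with the inspect module. A small illustrative
sketch in modern Python (the names outer/inner are mine):

```python
import inspect

def outer():
    return inner()

def inner():
    # Each "traditional" call pushes a new execution frame; walking
    # f_back from the current frame shows the chain of callers.
    names = []
    frame = inspect.currentframe()
    while frame is not None:
        names.append(frame.f_code.co_name)
        frame = frame.f_back
    return names

chain = outer()
print(chain)  # starts with ['inner', 'outer', ...]
```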

I suppose my description of it was also coloured by reading Christian
Tismer's description of removing the recursive calls to the C eval loop
in the interpreter, which is what happens each time a python function
is executed.

[Alan]
>> 4. There is another space, which I call "Generator space".

[Terry]
> What specifically do you consider to be part of this 'space'?

Whenever I think about this stuff, I find myself thinking more in terms
of what happens in the space, i.e. how the rules change in the space,
rather than objects that live in the space.

But on reading your other points, I see that I may be trying a little
too hard with the "Space" analogy.

> This is more something that does not happen - the usual destruction of
> the stack frame.  'Freezing' implies that something special is done
> that takes time and that would have to be reversed by 'thawing'.
> Generators are efficient because yielding leaves the frame as it is, so
> execution can be resumed exactly where it left off.

You're right, I'm describing something that doesn't happen. I was
trying to verbally describe the concept of the stack frame staying
persistent and undamaged, ready for later re-entry.
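A minimal sketch of what I mean, in modern Python spelling (next(gen)
rather than gen.next()): the local n lives on in the suspended frame
between resumptions.

```python
def counter():
    # Nothing is specially "frozen" at yield: the frame, with its
    # local n, simply isn't destroyed, so resumption picks up here.
    n = 0
    while True:
        n += 1
        yield n

c = counter()
print(next(c), next(c), next(c))  # 1 2 3: n survived each yield
```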

[Terry]
> I'm not sure that this is a helpful way to look at the process..  When
> a gen-it.next frame is created or reactivated, it is tacked on to the
> end of the linear frame stack just like any other execution frame.  If
> it calls another function, yet another frame is added.  When it yields
> or returns, it is removed from the chain, just like any other
> function, and control passes back to the calling function, just like
> any other function.  The only differences are that 'removal' !=
> 'deletion' (because the gen object holds a reference to the frame, so
> ref count does *not* go to 0) and 're-addition' != 'set execution
> pointer to beginning of associated code-object'.

These were the key statements that made me rethink the "space" analogy. The
change in behaviour I was really trying to describe (badly ;-) was the
change in the behaviour of the top of the stack before, during and
after .next() calls. It seems like the top of the stack has protruded into
some other "space" where the rules just aren't the same ....... :-)

I think what originally put the "space" analogy into my head was the fact
that, by virtue of introducing a dispatcher function, the resumable stack
frame at the top of the call stack during a .next() call can be made to 
(rapidly) move around between a set of such resumable states. The increased
rapidity (compared to non-generator function calls) was an attribute of 
"generator space". 

And what also put the "space" idea in my mind was the fact that there could
be multiple connected resumable states, just "hanging there in space",
waiting to be resumed. Every running program has a set of zero or more of
these resumable states that the interpreter keeps somewhere special .....
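A toy sketch of the kind of dispatcher I had in mind (the names worker
and dispatch are mine, and this is modern Python spelling): a deque of
suspended generators, each next() call re-entering one of those
resumable states exactly where it left off.

```python
from collections import deque

def worker(name, count):
    for i in range(count):
        yield "%s:%d" % (name, i)

def dispatch(gens):
    # Round-robin over a set of resumable frames "hanging in space":
    # each next() briefly puts one suspended frame back on top of the
    # call stack, then it drops back into "generator space" at yield.
    queue = deque(gens)
    while queue:
        gen = queue.popleft()
        try:
            yield next(gen)
        except StopIteration:
            continue  # this resumable state is exhausted
        queue.append(gen)

out = list(dispatch([worker("a", 2), worker("b", 2)]))
print(out)  # ['a:0', 'b:0', 'a:1', 'b:1']
```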

[Alan]
[ Description of an attribute of generator space elided]

[Terry]
> To me, this is unnecessarily confusing.  For some reason, you see and
> present generators as more special than they are.  They are resumable
> functions, and they are resumable mostly because yield does not
> disable.  I don't believe they do anything that cannot be done with
> class methods.  They are just easier to write (usually) and faster to
> execute.

Absolutely. 

The only thing I would add is that I was trying to build a mental
picture of coroutines as well as generators. I don't think I explained
that properly.

> Indeed, I think it possibly useful for understanding generators to
> think of a def statement with yield in the body as being something
> like a syntactically condensed version of a class statement iterator
> template with standard defs for __init__ and __iter__ and a
> class-specific .next method that is filled in from the body of the
> def.  Calling a generator function returns an initialized instance
> with the args stored away for later use by the .next method.  Because
> the whole package is a fairly common use case, it can be and now has
> been optimized so the programmer need only write what is specific and
> the interpreter adds less overhead than normal.

A much better model to use, many thanks.
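To check my understanding of that model, here is a sketch of the
equivalence in modern Python spelling (__next__ rather than .next; the
names squares/Squares are mine):

```python
def squares(n):
    for i in range(n):
        yield i * i

# The hand-expanded version: __init__ stores the args, __iter__ returns
# self, and __next__ (Python 2's .next) holds the body, with the loop
# state kept explicitly where the generator just resumes its frame.
class Squares:
    def __init__(self, n):
        self.n = n
        self.i = 0
    def __iter__(self):
        return self
    def __next__(self):
        if self.i >= self.n:
            raise StopIteration
        value = self.i * self.i
        self.i += 1
        return value

print(list(squares(4)))  # [0, 1, 4, 9]
print(list(Squares(4)))  # [0, 1, 4, 9]
```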

regards,

-- 
alan kennedy
-----------------------------------------------------
check http headers here: http://xhaus.com/headers
email alan:              http://xhaus.com/mailto/alan



