Python threading?

Thomas Heller theller at
Thu Sep 26 19:01:09 CEST 2002

Tim Peters wrote:
> [Robert Oschler]
>>Having written _many_ state machines in my life, which async programming
>>boils down to in one form or another, I feel it depends on the task.
>>Contrast this with a simple procedure call where a linear series of
>>function calls are made, each with appropriate error handling and status
>>adjustment, in a pleasant centrally located manner.
> [Thomas Heller]
>>I wonder if generators can come to a rescue here?
> Absolutely.  Read the start of PEP 255.
>>Still trying to get them into my head...
> Think of them as resumable functions.  You're familiar with functions that
> remember their own data state across calls, via abusing globals, or via
> declaring private static vrbls, in C.  Methods on objects often do the same
> by stuffing their state into instance attributes between calls.  A generator
> remembers its data state across resumptions simply by virtue of that its
> local variables don't vanish, and also remembers its control-flow state
> across resumptions.  Remembering local data state by magic is a real
> convenience, but can be simulated in many other ways with reasonable effort;
> it's the ability to remember and restore control-flow state by magic that
> can make generators more than merely convenient.
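A minimal illustration of the "resumable function" idea (my own sketch, in present-day Python syntax, not from Tim's post):

```python
def counter(start):
    # n is an ordinary local variable; it survives between
    # resumptions, as does the control-flow position inside the loop
    n = start
    while True:
        yield n
        n += 1

c = counter(10)
print(next(c))  # 10
print(next(c))  # 11  -- resumed right after the yield, n intact
```

No globals, statics, or instance attributes are needed: each call to next() picks up exactly where the previous one left off.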

I _have_ written a kind of hardware description with it - the sort of 
thing you would normally do with VHDL testbenches. It was fun!
There were several generators like this, manipulating clock, reset
and other signals in a counter device. The value yield'ed is simply
the time to wait:

     def run_clock(self):
         while 1:
             self.clock = not self.clock
             yield 50 # wait 50 ns

     def run_reset(self):
         self.reset = 1
         yield 200 # a 200 ns reset pulse
         self.reset = 0

The dispatcher works like this: it creates the generators,
pushes them onto a priority queue (together with the time
at which they have to be served), serves each one by calling
its next() method, and pushes the result back onto the queue:

def run(*gens):
    queue = MyList() # a priority queue
    # insert all generators
    for g in gens:
        queue.put((0, g()))
    # and run them
    while queue:
        now, gen = queue.pop(0)
        try:
            dt = gen.next()
        except StopIteration:
            continue # this generator is finished
        queue.put((now + dt, gen))
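For the curious, here is a self-contained sketch of the same scheme in present-day Python, with heapq standing in for the poster's MyList priority queue and CounterDevice as an illustrative stand-in for the counter device (the trace list and the `until` cutoff are my additions, so the run terminates):

```python
import heapq
import itertools

class CounterDevice(object):
    """Illustrative stand-in for the counter device in the post."""
    def __init__(self):
        self.clock = 0
        self.reset = 0
        self.trace = []  # (signal, value) log, for inspection

    def run_clock(self):
        while True:
            self.clock = not self.clock
            self.trace.append(('clock', self.clock))
            yield 50  # wait 50 ns

    def run_reset(self):
        self.reset = 1
        self.trace.append(('reset', 1))
        yield 200  # a 200 ns reset pulse
        self.reset = 0
        self.trace.append(('reset', 0))

def run(until, *gens):
    tick = itertools.count()  # tie-breaker for events at the same time
    queue = []
    for g in gens:
        heapq.heappush(queue, (0, next(tick), g()))
    while queue:
        now, _, gen = heapq.heappop(queue)
        if now > until:
            break
        try:
            dt = next(gen)
        except StopIteration:
            continue  # this generator is finished
        heapq.heappush(queue, (now + dt, next(tick), gen))

dev = CounterDevice()
run(300, dev.run_clock, dev.run_reset)
# after 300 ns: the clock has toggled 7 times (at 0, 50, ..., 300 ns)
# and the 200 ns reset pulse has come and gone
```

Each generator is just "a process that sleeps for dt ns" - the scheduler never has to know what any of them are doing internally.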


OTOH, I've been missing the possibility to pass
values back _to_ the generators - something like maybe
'work = yield event', probably triggered by calling
the resumption method with an argument.
Wouldn't this be useful?

I fear I cannot explain this clearly, but I think
David Mertz's state machine example would also profit from it.
As I understand it, he passes this 'work' around in global variables.


More information about the Python-list mailing list