global interpreter lock

Paul Rubin http
Sat Sep 10 13:59:18 CEST 2005


Michael Sparks <ms at cerenity.org> writes:
> > But I think to do it on Erlang's scale, Python needs user-level
> > microthreads and not just OS threads.  
> 
> You've just described Kamaelia* BTW, except substitute micro-thread
> with generator :-) (Also we call the queues outboxes and inboxes, and 
> the combination of a generator in a class with inboxes and outboxes
> components)
>    * http://kamaelia.sf.net/

I don't see how generators substitute for microthreads.  In your example
from another post:

   class encoder(component):
       def __init__(self, **args):
           self.encoder = unbreakable_encryption.encoder(**args)
       def main(self):
           while 1:
               if self.dataReady("inbox"):
                   data = self.recv("inbox")
                   encoded = self.encoder.encode(data)
                   self.send(encoded, "outbox")
               yield 1

You've got the "main" method creating a generator that has its own
event loop that yields after each event it processes.  Notice the kludge

             if self.dataReady("inbox"):
                data = self.recv("inbox")

instead of just saying something like:

            data = self.get_an_event("inbox")

where .get_an_event "blocks" (i.e. yields) if no event is pending.
The reason for that is that Python generators aren't really coroutines:
a yield only suspends the generator function's own frame, so a helper
function that the generator calls can't yield on its behalf.
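The restriction is easy to demonstrate: putting a yield in a nested
function turns that function into its own generator, so calling it does
nothing for the enclosing one. A minimal sketch (the function names are
made up for illustration; later Python versions added `yield from`, PEP
380, for exactly this kind of delegation):

```python
def outer():
    def wait():
        # This yield makes wait() a separate generator...
        yield "suspended inner"
    wait()   # ...so calling it just builds an unused generator object;
             # outer() is not suspended here at all.
    yield "outer ran straight through"

g = outer()
print(next(g))   # -> "outer ran straight through"
```

The call to wait() never runs its body, let alone suspends outer(),
which is why the explicit dataReady/recv dance is needed.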

In that particular example, the yield is only at the end, so the
generator isn't doing anything that an ordinary function closure
couldn't:

       def main(self):
           def run_event():
               if self.dataReady("inbox"):
                   data = self.recv("inbox")
                   encoded = self.encoder.encode(data)
                   self.send(encoded, "outbox")
           return run_event
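Driving such closures looks just like driving generators: a round-robin
scheduler calls each one in turn. A minimal self-contained sketch (the
Counter component here is hypothetical, just to make the loop runnable):

```python
class Counter:
    # Hypothetical stand-in component: main() returns a closure
    # instead of a generator; one call = one scheduler timeslot.
    def __init__(self):
        self.count = 0
    def main(self):
        def run_event():
            self.count += 1   # "process one event"
        return run_event

# Round-robin scheduler: each pass gives every task one step.
counters = [Counter() for _ in range(3)]
tasks = [c.main() for c in counters]
for _ in range(2):            # two scheduler passes
    for run_event in tasks:
        run_event()
# every counter has now run exactly twice
```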

Now, instead of calling .next on a generator every time you want to let
your microthread run, you just call the run_event function that main
has returned.  However, I suppose there are times when you'd want to
read an event, do something with it, yield, read another event, and do
something different with it, before looping.  In that case you can put
yields in different parts of that state machine.  But it's not that big
a deal; otherwise you could just use multiple functions.
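That multi-phase case is the one place a generator does pull its
weight: the point where it resumed acts as the state flag, so no
explicit mode variable is needed. A runnable sketch in modern syntax,
with a hypothetical TwoPhase stub standing in for a real component:

```python
class TwoPhase:
    # Minimal stand-in for a Kamaelia-style component (hypothetical
    # stub, just enough to make the example self-contained).
    def __init__(self):
        self.inbox, self.outbox = [], []
    def dataReady(self):
        return bool(self.inbox)
    def recv(self):
        return self.inbox.pop(0)
    def main(self):
        # Phase 1: wait for a header, yielding while nothing is queued.
        while not self.dataReady():
            yield 1
        header = self.recv()
        # Phase 2: wait for the body. This code only runs after phase 1
        # finished, so the generator's resume point *is* the state.
        while not self.dataReady():
            yield 1
        body = self.recv()
        self.outbox.append((header, body))
        yield 1

c = TwoPhase()
g = c.main()
next(g)                    # nothing queued yet: parked in phase 1
c.inbox.append("HDR")
next(g)                    # consumes the header, parks in phase 2
c.inbox.append("payload")
next(g)                    # consumes the body, emits the pair
print(c.outbox)            # -> [('HDR', 'payload')]
```

Written as a plain closure, the same logic would need an explicit
"which phase am I in" variable checked on every call.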

All in all, maybe I'm missing something but I don't see generators as
being that much help here.  With first-class continuations like
Stackless used to have, the story would be different, of course.



