generators shared among threads

Alex Martelli aleaxit at
Sat Mar 4 20:10:50 CET 2006

<jess.austin at> wrote:

> hi,
> This seems like a difficult question to answer through testing, so I'm
> hoping that someone will just know...  Suppose I have the following
> generator, g:
> def f():
>     i = 0
>     while True:
>         yield i
>         i += 1
> g=f()
> If I pass g around to various threads and I want them to always be
> yielded a unique value, will I have a race condition?  That is, is it

Yes, you will.

> before resuming the first thread?  If so, would I get different
> behavior if I just set g like:
> g=itertools.count()

I believe that in the current implementation you'd get "lucky", but
there is no guarantee that such luck would persist across even a minor
bugfix in the implementation.  Don't do it.

> If both of these idioms will give me a race condition, how might I go
> about preventing such?  I thought about using threading.Lock, but I'm
> sure that I don't want to put a lock around the yield statement.

Queue.Queue is often the best way to organize cooperation among threads.
Make a Queue.Queue with a reasonably small maximum size, a single
dedicated thread that puts successive items of itertools.count onto it
(implicitly blocking and waiting when the queue gets full), and any
other thread can call get on the queue and obtain a unique item
(implicitly waiting a little bit if the queue ever gets empty, until the
dedicated thread wakes and fills the queue again).  [[Alternatively you
could subclass Queue and override the hook-method _get, which always
gets called in a properly locked and thus serialized condition; but that
may be considered a reasonably advanced task, since such subclassing
isn't documented in the Library Reference, only in Queue's sources]].
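The producer-thread arrangement described above might be sketched like this
(a minimal sketch using Python 3's queue module, where Python 2's Queue.Queue
now lives; the maximum size of 32 and the helper name start_counter are
arbitrary choices for illustration):

```python
import itertools
import queue
import threading

def start_counter(maxsize=32):
    """Start a daemon thread feeding itertools.count into a bounded queue.

    Consumers call q.get() to obtain a unique integer; queue.Queue's own
    internal locking serializes access, so no two threads can ever be
    handed the same value.
    """
    q = queue.Queue(maxsize=maxsize)

    def producer():
        for i in itertools.count():
            q.put(i)  # blocks (waits) whenever the queue is full

    threading.Thread(target=producer, daemon=True).start()
    return q

if __name__ == "__main__":
    # Several worker threads each draw values; all values come out distinct.
    q = start_counter()
    results = []
    results_lock = threading.Lock()

    def worker():
        for _ in range(100):
            v = q.get()  # waits briefly if the queue is momentarily empty
            with results_lock:
                results.append(v)

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    assert len(results) == len(set(results))  # every value is unique
```

A single consumer sees the values in order; with several consumers the
ordering interleaves, but uniqueness is guaranteed by the queue's lock.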
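The _get-override variant mentioned in the double brackets might look like
the sketch below (again using Python 3's queue module name; as the post
warns, _init, _qsize and _get are undocumented internal hooks, so this
relies on implementation details of CPython's Queue):

```python
import itertools
import queue

class CountingQueue(queue.Queue):
    """A Queue whose get() returns successive integers; no producer thread.

    Queue.get() acquires the queue's internal mutex before calling the
    _get hook, so concurrent callers are serialized automatically.
    """

    def _init(self, maxsize):
        # Replace the default deque storage with a plain counter.
        self.counter = itertools.count()

    def _qsize(self):
        # Pretend the queue is never empty so get() never blocks.
        return 1

    def _get(self):
        # Called with the queue's mutex held: no race on the counter.
        return next(self.counter)
```

Any number of threads can then call get() on a shared CountingQueue and
each receives a distinct integer, starting from 0.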


More information about the Python-list mailing list