generators shared among threads
Paul Rubin
Tue Mar 7 15:02:54 EST 2006
aleaxit at yahoo.com (Alex Martelli) writes:
> > g=itertools.count()
>
> I believe that in the current implementation you'd get "lucky", but
> there is no guarantee that such luck would persist across even a minor
> bugfix in the implementation. Don't do it.
I remember being told that xrange(sys.maxint) was thread-safe, but of
course I wouldn't want to depend on that across Python versions either.
> Queue.Queue is often the best way to organize cooperation among threads.
> Make a Queue.Queue with a reasonably small maximum size, a single
> dedicated thread that puts successive items of itertools.count onto it
> (implicitly blocking and waiting when the queue gets full),
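A sketch of that arrangement in modern Python (the Queue module is now spelled queue, and gen.next() is next(gen); the feeder name and queue size are mine):

```python
import itertools
import queue
import threading

def feeder(q):
    # Dedicated thread: push successive integers onto a bounded queue.
    # put() blocks whenever the queue is full, so the feeder stays
    # only a few items ahead of the consumers.
    for i in itertools.count():
        q.put(i)

q = queue.Queue(maxsize=8)  # reasonably small maximum size
threading.Thread(target=feeder, args=(q,), daemon=True).start()

# Any number of threads can now draw values with q.get(); each value
# is handed out exactly once.
values = [q.get() for _ in range(5)]
```

With a single feeder thread, q.get() hands back 0, 1, 2, ... in order, and no value is ever delivered twice.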
This should be pretty simple to implement and not much can go wrong
with it, but it means a separate thread for each such generator, and
waiting for thread switches every time the queue goes empty. A more
traditional approach would be to use a lock in the generator,
def f():
    lock = threading.Lock()
    i = 0
    while True:
        lock.acquire()
        yield i
        i += 1
        lock.release()
but it's easy to make mistakes when implementing things like that
(I'm not even totally confident that the above is correct).
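One concrete reason for suspicion: the interpreter refuses to re-enter a generator that is already executing, so a lock acquired inside the generator body can never be what serializes competing next() calls. A quick way to see the restriction (modern Python; the names f and gen are mine):

```python
def f():
    # Re-enters gen while gen is still running its own body.
    yield next(gen)

gen = f()
try:
    next(gen)
    saw_error = False
except ValueError:  # "generator already executing"
    saw_error = True
```

Two threads calling next() on a shared generator at the same time hit the same ValueError, regardless of any lock held inside the body, which is why the serialization has to happen outside the generator.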
Hmm (untested, like above):
class Synchronized:
    def __init__(self, generator):
        self.gen = generator
        self.lock = threading.Lock()
    def next(self):
        self.lock.acquire()
        try:
            return self.gen.next()
        finally:
            self.lock.release()
synchronized_counter = Synchronized(itertools.count())
That isn't a general solution but can be convenient (if I didn't mess
it up). Maybe there's a more general recipe somewhere.
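For what it's worth, the same wrapper in modern Python would spell the method __next__, use a with-statement for the lock, and work for any iterator, not just generators (a sketch; the worker/results names are mine):

```python
import itertools
import threading

class Synchronized:
    """Wrap an iterator so concurrent next() calls are serialized."""
    def __init__(self, iterator):
        self.it = iterator
        self.lock = threading.Lock()
    def __iter__(self):
        return self
    def __next__(self):
        with self.lock:  # released even if the iterator raises
            return next(self.it)

counter = Synchronized(itertools.count())
results = []
results_lock = threading.Lock()

def worker():
    for _ in range(1000):
        v = next(counter)
        with results_lock:
            results.append(v)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Four threads drawing 1000 values each should end up with the integers 0 through 3999, each exactly once, since the lock in __next__ makes every advance of the underlying iterator atomic.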