multiprocessing timing issue
Tim.Arnold at sas.com
Tue Aug 9 19:07:01 CEST 2011
Hi, I'm having problems with an empty Queue using multiprocessing.
I have a bunch of chapters that I want to gather data on individually
and then update a report database with the results.
I'm using multiprocessing to do the data-gathering simultaneously.
Each chapter report gets put on a Queue from its own process.
Then each report gets picked off the queue and the report database is
updated with the results.
My problem is that sometimes the Queue comes up empty, and I guess it's
because the get_data() method takes longer than the timeout I pass to get().
I've used multiprocessing before, but never with a Queue like this.
Any notes or suggestions are very welcome.
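For what it's worth, that Empty is exactly what Queue.get raises when the timeout expires before any worker has put a result; the workers may still be running. A tiny demonstration (slow_worker here is a made-up stand-in for a slow get_data()):

```python
try:
    from queue import Empty            # Python 3
except ImportError:
    from Queue import Empty            # Python 2, as in the original post
from multiprocessing import Process, Queue
import time

def slow_worker(q):
    time.sleep(2)                      # stands in for a slow get_data()
    q.put('report')

def demo():
    q = Queue()
    p = Process(target=slow_worker, args=(q,))
    p.start()
    try:
        result = q.get(timeout=0.5)    # expires before the worker finishes
    except Empty:
        result = q.get()               # a blocking get waits the worker out
    p.join()
    return result

if __name__ == '__main__':
    print(demo())
```

So an Empty after 30 seconds doesn't mean the data is lost, only that it wasn't ready yet.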
The task starts off with:
from Queue import Empty
from multiprocessing import Process, Queue

q = Queue()
procs = dict()
for obj in objects:
    procs[obj['name']] = Process(target=fn, args=(obj, q))
    procs[obj['name']].start()
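Since run_mp itself isn't shown, here is a guess at its overall shape, built from the fragment above; run_mp and fn are the names from your snippet, and the key points are that every Process must be start()ed and the Queue returned so the caller can drain it:

```python
from multiprocessing import Process, Queue

def run_mp(objects, fn):
    """Start one worker per object; each fn(obj, q) is expected to q.put() a result."""
    q = Queue()
    procs = dict()
    for obj in objects:
        procs[obj['name']] = Process(target=fn, args=(obj, q))
        procs[obj['name']].start()     # if a Process is never started, the queue stays empty
    return q
```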
def __init__(self, chapters):
    self.chapters = chapters
    q = run_mp(self.chapters, self.get_data)
    for i in range(len(self.chapters)):
        try:
            data = q.get(timeout=30)
        except Empty:
            print 'Report queue empty at %s' % (i)
        else:
            self.update_report(data)
def get_data(self, chapter, q):
    data = expensive_calculations()
    q.put(data)
def update_report(self, data):
    # db connection, etc.
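Putting the pieces together, one way to make the collection loop robust is to keep a per-report timeout but treat Empty as "not ready yet" rather than fatal, and join the workers when done. A sketch with invented stand-ins (the sleep and the dict result replace expensive_calculations() and the database update):

```python
try:
    from queue import Empty            # Python 3
except ImportError:
    from Queue import Empty            # Python 2
from multiprocessing import Process, Queue
import time

def get_data(chapter, q):
    time.sleep(0.1)                    # stand-in for expensive_calculations()
    q.put({'chapter': chapter, 'size': len(chapter)})

def gather(chapters, timeout=30):
    q = Queue()
    procs = [Process(target=get_data, args=(c, q)) for c in chapters]
    for p in procs:
        p.start()
    results = []
    for _ in chapters:
        try:
            results.append(q.get(timeout=timeout))   # one generous timeout per report
        except Empty:
            print('Report queue empty; %d of %d reports collected'
                  % (len(results), len(chapters)))
            break
    for p in procs:
        p.join()
    return results
```

With one timeout per expected report, a single slow chapter can't starve the others, and the count of collected reports tells you exactly what went missing.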