multiprocessing timing issue

Tim Arnold Tim.Arnold at sas.com
Tue Aug 9 19:07:01 CEST 2011


Hi, I'm having problems with an empty Queue using multiprocessing.

The task:
I have a bunch of chapters that I want to gather data on individually 
and then update a report database with the results.
I'm using multiprocessing to do the data-gathering simultaneously.

Each chapter's report gets put on a Queue from its own separate process.
Then each report gets picked off the queue and the report database is
updated with the results.

My problem is that sometimes the Queue is still empty when I try to get
from it; I guess that's because the get_data() method takes a lot of time.
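Here's a minimal sketch of that symptom, separate from the report code:
Queue.get(timeout=...) raises Empty when the producer hasn't put anything
on the queue yet. The slow_producer function and the timings are made up
for illustration.

```python
import time
from multiprocessing import Process, Queue
try:
    from Queue import Empty    # Python 2
except ImportError:
    from queue import Empty    # Python 3

def slow_producer(q):
    time.sleep(1.0)            # stands in for a long get_data()
    q.put('report data')

def try_get(timeout):
    q = Queue()
    p = Process(target=slow_producer, args=(q,))
    p.start()
    try:
        result = q.get(timeout=timeout)
    except Empty:
        result = None          # gave up before the producer finished
    p.join()
    return result

if __name__ == '__main__':
    print(try_get(0.2))        # producer needs ~1s, so this prints None
    print(try_get(5))          # ample time: prints 'report data'
```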

I've used multiprocessing before, but never with a Queue like this.
Any notes or suggestions are very welcome.

The task starts off with:
Reporter(chapters).report()
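For context, run_mp() in the code below keys its procs dict by
obj['name'], so each chapter is presumably a mapping with at least a
'name' entry. The field names and values here are invented examples,
not the real data:

```python
# Hypothetical shape of the chapters argument; run_mp() requires a
# 'name' key on each item, anything else is made up for illustration.
chapters = [
    {'name': 'ch01', 'path': '/docs/ch01'},
    {'name': 'ch02', 'path': '/docs/ch02'},
]
```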

thanks,
--Tim Arnold

from Queue import Empty
from multiprocessing import Process, Queue

def run_mp(objects, fn):
    q = Queue()
    procs = dict()
    for obj in objects:
        procs[obj['name']] = Process(target=fn, args=(obj, q))
        procs[obj['name']].start()

    return q

class Reporter(object):
    def __init__(self, chapters):
        self.chapters = chapters

    def report(self):
        q = run_mp(self.chapters, self.get_data)

        for i in range(len(self.chapters)):
            try:
                data = q.get(timeout=30)
            except Empty:
                print 'Report queue empty at %s' % i
            else:
                self.update_report(data)

    def get_data(self, chapter, q):
        # placeholder for the real, long-running data gathering
        data = expensive_calculations()
        q.put(data)

    def update_report(self, data):
        # db connection, etc.
        pass
