multiprocessing timing issue

Tim Roberts timr at
Thu Aug 11 07:26:35 CEST 2011

Tim Arnold <Tim.Arnold at> wrote:
>The task:
>I have a bunch of chapters that I want to gather data on individually 
>and then update a report database with the results.
>I'm using multiprocessing to do the data-gathering simultaneously.
>Each chapter report gets put on a Queue by its separate process. 
>Then each report is picked off the queue and the report database is 
>updated with the results.
>My problem is that sometimes the Queue is empty and I guess it's
>because the get_data() method takes a lot of time.
>I've used multiprocessing before, but never with a Queue like this.
>Any notes or suggestions are very welcome.

The obvious implication is that your timeout is simply not long enough
for your common cases.  If you know how many chapters to expect, why have
a timeout at all?  Why not just call get() with no timeout and block
until each report arrives?
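A minimal sketch of that pattern: each worker process puts one report on
a shared Queue, and the parent does exactly as many blocking get() calls
as there are chapters, so a slow worker can never make the queue look
empty.  The gather() function here is a hypothetical stand-in for the
poster's get_data(); the real computation would go in its place.

```python
import multiprocessing

def gather(chapter, queue):
    # Hypothetical stand-in for the poster's slow get_data():
    # build a report for one chapter and put it on the queue.
    report = (chapter, len(chapter))
    queue.put(report)

if __name__ == "__main__":
    chapters = ["ch1", "ch2", "ch3"]
    queue = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=gather, args=(c, queue))
               for c in chapters]
    for w in workers:
        w.start()
    # We know exactly how many reports to expect, so call get() with
    # no timeout: each call blocks until a report actually arrives.
    reports = [queue.get() for _ in chapters]
    for w in workers:
        w.join()
    print(sorted(reports))
```

With a fixed number of expected results, blocking get() is simpler and
safer than polling with a timeout and guessing how long get_data()
might take.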
Tim Roberts, timr at
Providenza & Boekelheide, Inc.
