70% [* SPAM *] Re: multiprocessing.Queue blocks when sending large object
boB
Mon Dec 5 03:57:01 EST 2011
On Mon, 5 Dec 2011 09:02:08 +0100, DPalao <dpalao.python at gmail.com>
wrote:
>On Tuesday, 29 November 2011, DPalao wrote:
>> Hello,
>> I'm trying to use multiprocessing to parallelize some code. There are a number
>> of tasks (usually 12) that can be run independently. Each task produces a
>> numpy array, and at the end, those arrays must be combined.
>> I implemented this using Queues (multiprocessing.Queue): one for input and
>> another for output.
>> But the code blocks, and it must be related to the size of the item I put
>> on the Queue: if I put a small array, the code works well; if the array is
>> realistically large (in my case it can vary from 160kB to 1MB), the code
>> apparently blocks forever.
>> I have tried this:
>> http://www.bryceboe.com/2011/01/28/the-python-multiprocessing-queue-and-large-objects/
>> but it didn't work (specifically, I put a None sentinel at the end for each
>> worker).
>>
>> Before I change the implementation,
>> is there a way to bypass this problem with multiprocessing.Queue?
>> Should I post the code (or a skeleton version of it)?
>>
>> TIA,
>>
>> David
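
For reference, here is a minimal, self-contained way to reproduce the
behaviour described above (the names and sizes are invented; the only
assumption is that the queued item is larger than the OS pipe buffer,
typically around 64kB on Linux):

import multiprocessing as mp

def child(q):
    # Put one item that is much larger than a typical pipe buffer.
    q.put('x' * (1 << 20))

if __name__ == '__main__':
    q = mp.Queue()
    p = mp.Process(target=child, args=(q,))
    p.start()
    p.join(timeout=5)      # a plain join() here would block forever
    print(p.is_alive())    # True: the child cannot exit until the data
                           # it queued has been flushed to the pipe
    print(len(q.get()))    # reading the item lets the feeder thread flush
    p.join()               # now the child exits and the join returns

The child's queue feeder thread blocks once the pipe buffer fills, and the
child process will not terminate until everything it put on the queue has
been consumed. A small array fits in the buffer and a large one does not,
which matches the 160kB-1MB threshold seen above.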
>
>Just for reference: the other day I found the explanation posted by "ryles"
>in his/her mail of 27 Aug 2009, with the title "Re: Q: multiprocessing.Queue
>size limitations or bug...". It is very clarifying.
>After reading that, I rearranged the program so that the main process did
>not need to know when the others had finished: I replaced the process join
>calls with queue get calls, repeated until a None (one per process) is returned.
>
>Best,
>
>David
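
For the archive, a minimal sketch of the arrangement David describes (the
worker count, task sizes and names are invented, not his actual code): each
worker puts a None on the result queue when it finishes, and the main
process keeps calling get() until it has seen one None per worker, and only
then joins:

import multiprocessing as mp
import numpy as np

def worker(in_q, out_q):
    # Consume task descriptions until the None stop sentinel arrives.
    for size in iter(in_q.get, None):
        out_q.put(np.zeros(size))   # the (possibly large) result array
    out_q.put(None)                 # tell the parent this worker is done

if __name__ == '__main__':
    in_q, out_q = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(in_q, out_q))
             for _ in range(4)]
    for p in procs:
        p.start()
    for size in (200000, 400000, 800000):   # 1.6MB-6.4MB of float64 each
        in_q.put(size)
    for _ in procs:
        in_q.put(None)              # one stop sentinel per worker
    results, done = [], 0
    while done < len(procs):        # drain BEFORE joining, so the workers
        item = out_q.get()          # can flush what they queued
        if item is None:
            done += 1
        else:
            results.append(item)
    for p in procs:
        p.join()                    # safe now: out_q has been drained
    combined = np.concatenate(results)
    print(combined.shape)

Joining first and getting afterwards is exactly the deadlock the Bryce Boe
post linked above describes.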
Why do people add markers like [* SPAM *] to their subject
lines? Is it supposed to do something? I figured since
programmers hang out here, maybe one of you would know.
Thanks,
boB