multiprocessing eats memory
istvan.albert at gmail.com
Sat Sep 27 05:28:32 CEST 2008
On Sep 26, 4:52 am, redbaron <ivanov.ma... at gmail.com> wrote:
> How could I avoid storing them? I need something to check whether they
> are ready or not and retrieve results if ready. I couldn't see a way to
> achieve the same result without storing the set of asyncs.
It all depends on what you are trying to do. The issue that you
originally brought up is that of memory consumption.
When processing data in parallel, you will use as much memory as the
number of datasets you are working on at any given time. If you need to
reduce memory use, start fewer processes and use some mechanism to
distribute the work to them as they become free (see the earlier
recommendation that uses Queues).
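A minimal sketch of that Queue-based approach, assuming a made-up
`square` task: a fixed number of worker processes pull items from a
shared queue, so only a bounded number of datasets are in flight at
once instead of one async result per input.

```python
from multiprocessing import Process, Queue

def square(x):
    # Stand-in for the real per-dataset work.
    return x * x

def worker(tasks, results):
    # Pull work until the sentinel None arrives.
    for item in iter(tasks.get, None):
        results.put(square(item))

def run(data, n_workers=2):
    tasks, results = Queue(), Queue()
    procs = [Process(target=worker, args=(tasks, results))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    for item in data:
        tasks.put(item)
    for _ in procs:
        tasks.put(None)  # one sentinel per worker
    out = [results.get() for _ in data]
    for p in procs:
        p.join()
    return out

if __name__ == "__main__":
    print(sorted(run(range(5))))
```

Memory stays bounded because only `n_workers` processes exist and each
holds one item at a time; results may arrive out of order, so sort or
tag them if ordering matters.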