[Multiprocessing-sig] stop and restart processes

Shawn Gong SGONG at mdacorporation.com
Mon Aug 29 22:34:16 CEST 2011


To clarify: I am using python 2.6.6, not 2.7
So 'maxtasksperchild' is not available. Is there something similar?
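(For reference, on Python 2.7+ or 3.x the pool can recycle its workers by itself: passing maxtasksperchild=10 makes each worker process exit and be replaced after completing 10 tasks, which releases any memory it leaked. A minimal sketch; the work function is a placeholder:)

    import multiprocessing

    def work(x):
        return x + 1  # stand-in for the real per-task calculation

    if __name__ == '__main__':
        # Python 2.7+ only: each worker is replaced after 10 tasks
        pool = multiprocessing.Pool(processes=4, maxtasksperchild=10)
        results = pool.map(work, range(20))
        pool.close()
        pool.join()
        print(results)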

thanks,
Shawn


-----Original Message-----
From: multiprocessing-sig-bounces+sgong=mdacorporation.com at python.org [mailto:multiprocessing-sig-bounces+sgong=mdacorporation.com at python.org] On Behalf Of Shawn Gong
Sent: Monday, August 29, 2011 3:40 PM
To: 'multiprocessing-sig at python.org'
Subject: [Multiprocessing-sig] stop and restart processes

Hi list,

My Linux server has 24 CPUs. When I multiprocess a large job with more than 800 runs (para_list has >800 entries), I run out of memory. There could be a memory leak somewhere.

Is there a way to stop each worker after it has run 10 tasks and restart it, so that the leaked memory is reclaimed?

thanks,
Shawn

Code:
    import multiprocessing
    import traceback

    args = [(arg1, arg2, ...) for arg1, arg2 in para_list]

    # run My_calculation asynchronously
    pool = multiprocessing.Pool(processes=cpu_used)
    results = [pool.apply_async(My_calculation, a) for a in args]

    # get the results: wait for each to finish and log any errors
    for r in results:
        try:
            r.get()
        except Exception:
            log.debug(traceback.format_exc())
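(One way to get the same effect on 2.6 is to submit the jobs in batches and tear the pool down between batches: close() and join() make the old workers exit, which returns their leaked memory to the OS, and a fresh pool is built for the next batch. A rough sketch; my_calculation, run_in_batches, and batch_size are made-up names standing in for your real function and job list:)

    import multiprocessing
    import traceback

    def my_calculation(x):
        return x * x  # stand-in for the real My_calculation

    def run_in_batches(params, cpu_used, batch_size):
        """Recreate the pool after each batch so memory leaked by the
        workers is reclaimed when the old processes exit."""
        results = []
        for start in range(0, len(params), batch_size):
            batch = params[start:start + batch_size]
            pool = multiprocessing.Pool(processes=cpu_used)
            async_results = [pool.apply_async(my_calculation, (p,))
                             for p in batch]
            pool.close()  # no more tasks for this pool
            for r in async_results:
                try:
                    results.append(r.get())
                except Exception:
                    traceback.print_exc()
            pool.join()   # wait for the workers to exit before the next batch
        return results

    if __name__ == '__main__':
        out = run_in_batches(list(range(8)), cpu_used=2, batch_size=3)
        print(out)

(With batch_size=10*cpu_used this roughly matches maxtasksperchild=10; the trade-off is that a whole batch must finish before the next pool starts.)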


_______________________________________________
Multiprocessing-sig mailing list
Multiprocessing-sig at python.org
http://mail.python.org/mailman/listinfo/multiprocessing-sig

