subprocess.Popen and multiprocessing fail to execute external program

Dave Angel d at davea.name
Thu Jan 10 06:56:20 CET 2013


On 01/09/2013 11:08 PM, Niklas Berliner wrote:
> I have a pipeline that involves processing some data, handing the data to an
> external program (t_coffee used for sequence alignments in bioinformatics),
> and postprocessing the result. Since I have a lot of data, I need to run my
> pipeline in parallel which I implemented using the multiprocessing module
> following Doug Hellmann's blog (
> http://blog.doughellmann.com/2009/04/pymotw-multiprocessing-part-1.html).
>
> My pipeline works perfectly fine when I run it with the multiprocessing
> implementation and one consumer, i.e. on one core. If I increase the
> number of consumers, so that multiple instances of my pipeline run in
> parallel, the external program fails with a core dump.
>

Could it be that the external program is not designed to have multiple
simultaneous instances?  There are many such programs, some of which
check for an existing process before allowing another one to get far.

When using the multiprocessing module, always make sure your externals
are well-behaved before looking for problems in your own multiprocessing
code.
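One quick way to check that, with multiprocessing taken out of the picture entirely, is to launch several copies of the external program at once with subprocess and inspect how they exit. A minimal sketch (the stand-in command and the example t_coffee arguments are placeholders, not your real invocation):

```python
import subprocess
import sys

def run_concurrently(cmd, n=4):
    """Launch n identical copies of cmd at once and collect their exit
    statuses.  On POSIX, a negative status means the child was killed
    by a signal (e.g. -11 for SIGSEGV, the usual symptom behind a core
    dump)."""
    procs = [subprocess.Popen(cmd) for _ in range(n)]
    return [p.wait() for p in procs]

# Trivial stand-in child; replace with the real external command,
# e.g. ["t_coffee", ...] (your actual arguments go here).
codes = run_concurrently([sys.executable, "-c", "pass"], n=4)
print(codes)
```

If the external program crashes here too, the bug is in that program (or in running multiple instances of it, e.g. they may fight over a shared temp or lock file), not in your multiprocessing code.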

To put it more strongly, a well-written program cannot easily be crashed
by the parent that launched it.


-- 

DaveA



