multiprocessing problems

Adam Tauno Williams awilliam at opengroupware.us
Tue Jan 19 17:26:49 CET 2010


> I decided to play around with the multiprocessing module, and I'm
> having some strange side effects that I can't explain.  It makes me
> wonder if I'm just overlooking something obvious or not.  Basically, I
> have a script that parses through a lot of files doing search and
> replace on key strings inside each file.  I decided to split the work
> up into multiple processes, one per processor core (4 total).  I've
> tried various ways of doing this, from using a pool to spawning
> separate processes, but the result has been the same: the computer
> crashes from endless process spawning.

Are you hitting a ulimit error?  The number of processes you can create
is probably limited. 
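On POSIX systems you can inspect that limit from Python. A minimal sketch, assuming Linux (which exposes the per-user process cap as RLIMIT_NPROC; once it is hit, fork()/Process.start() fails with EAGAIN):

```python
import resource

# Query the per-user process limit. soft is the enforced cap,
# hard is the ceiling the soft limit may be raised to;
# RLIM_INFINITY means "no limit".
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
print("soft limit:", soft, "hard limit:", hard)
```

The same numbers show up as `ulimit -u` in a shell.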

TIP: close sys.stdin in your subprocesses.
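That tip looks like this in practice; `worker` and the file name are placeholders, not names from the original script:

```python
import sys
from multiprocessing import Process

def worker(path):
    # Detach the child from the inherited stdin so it cannot
    # block on, or fight with the parent over, the terminal.
    sys.stdin.close()
    print("processing", path)

if __name__ == "__main__":
    p = Process(target=worker, args=("example.txt",))
    p.start()
    p.join()
```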

> Here's the guts of my latest incarnation.
> def ProcessBatch(files):
>     p = []
>     for file in files:
>         p.append(Process(target=ProcessFile, args=(file,)))
>     for x in p:
>         x.start()
>     for x in p:
>         x.join()
>     p = []
>     return
> Now, the function calling ProcessBatch looks like this:
> def ReplaceIt(files):
>     processFiles = []
>     for replacefile in files:
>         if(CheckSkipFile(replacefile)):
>             processFiles.append(replacefile)
>             if(len(processFiles) == 4):
>                 ProcessBatch(processFiles)
>                 processFiles = []
>     #check for left over files once main loop is done and process them
>     if(len(processFiles) > 0):
>         ProcessBatch(processFiles)

According to this you will create processes in sets of four, but an
unknown number of sets of four.
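A simpler way to cap concurrency is a fixed-size multiprocessing.Pool, which keeps exactly four workers alive no matter how many files there are. A sketch, with `ProcessFile` as a stand-in for the original search-and-replace worker: note the `if __name__ == '__main__'` guard, which is required on platforms where each child re-imports the script; without it every child respawns the whole job, which matches the "endless process spawn" symptom.

```python
from multiprocessing import Pool

def ProcessFile(path):
    # Stand-in for the real search-and-replace worker.
    return path.upper()

if __name__ == "__main__":
    files = ["a.txt", "b.txt", "c.txt", "d.txt", "e.txt"]
    # Four long-lived workers handle any number of files;
    # no batching logic and no per-file Process objects.
    with Pool(processes=4) as pool:
        results = pool.map(ProcessFile, files)
    print(results)
```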




More information about the Python-list mailing list