multiprocessing module

Glazner yoavglazner at gmail.com
Fri Jan 1 02:26:19 EST 2010


On Dec 15 2009, 10:56 am, makobu <makobu.mwambir... at gmail.com> wrote:
> I have a function that makes two subprocess.Popen() calls on a file.
>
> I have 8 cores. I need 8 instances of that function running in
> parallel at any given time till all the files are worked on.
> Can the multiprocessing module do this? If so, what's the best method?
>
> A technical overview of how the multiprocessing module actually works
> would also be really helpful.
>
> regards,
> mak.

I guess you need the Map-Reduce pattern here; it looks like
multiprocessing.Pool will do the trick.

from multiprocessing import Pool

def doSomething(f):
    # work on a single file here
    pass

if __name__ == '__main__':
    pool = Pool(8)  # 8 worker processes, one per core
    files = ['pop.txt', 'looper.txt', 'foo.bar']
    results = pool.map(doSomething, files)  # farms the files out to the workers
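
Since your function runs two subprocess.Popen() calls per file, the worker
could look roughly like this; the command names below are just placeholders
for whatever tools you actually run:

import subprocess

def doSomething(f):
    # placeholder commands -- swap in your real ones
    subprocess.Popen(['first_tool', f]).wait()
    subprocess.Popen(['second_tool', f]).wait()
    return f

Each worker process blocks on its own subprocesses, so Pool(8) keeps roughly
8 files in flight at any time until pool.map() has gone through the whole list.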

got it?


