problem in implementing multiprocessing

gopal mishra gopalm at
Mon Jan 19 06:50:07 CET 2009

I know this is not an I/O-bound problem. I am creating heavy objects in the
worker processes, putting those objects into a queue, and getting them back
in my main program from the queue.
You can test this with the sample code below:
import time
from multiprocessing import Process, Queue

class Data(object):
    def __init__(self):
        self.y = range(1, 1000000)

def getdata(queue):
    data = Data()
    queue.put(data)  # without this put(), queue.get() below blocks forever

if __name__=='__main__':
    t1 = time.time()
    d1 = Data()
    d2 = Data()
    t2 = time.time()
    print "without multiprocessing total time:", t2-t1
    t1 = time.time()  # reset the clock so the two timings are comparable
    queue = Queue()
    Process(target=getdata, args=(queue,)).start()
    Process(target=getdata, args=(queue,)).start()
    s1 = queue.get()
    s2 = queue.get()
    t2 = time.time()
    print "multiprocessing total time:", t2-t1
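One rough way to check how much of that time is serialization rather than object creation is to time a pickle round-trip of a comparable payload on its own. This is a sketch, not part of the original post; `payload` just mimics `Data().y`, and the exact timings will vary by machine:

```python
from __future__ import print_function  # so this runs on both Python 2 and 3
import pickle
import time

# Roughly the same payload as Data().y in the sample above
payload = list(range(1, 1000000))

t1 = time.time()
blob = pickle.dumps(payload, protocol=pickle.HIGHEST_PROTOCOL)
restored = pickle.loads(blob)
t2 = time.time()

print("pickle round-trip:", t2 - t1, "seconds,", len(blob), "bytes")
```

If the round-trip time is a large fraction of the per-object creation time, that is consistent with the queue transfer (which pickles on put and unpickles on get) dominating the multiprocessing run.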

-----Original Message-----
From: James Mills [mailto:prologic at] 
Sent: Saturday, January 17, 2009 10:37 AM
To: gopal mishra
Cc: python-list at
Subject: Re: problem in implementing multiprocessing

On Fri, Jan 16, 2009 at 7:16 PM, gopal mishra <gopalm at> wrote:
> If I create two heavy objects sequentially without using multiprocessing,
> creation of the objects takes 2.5 sec. If I create these two objects in
> separate processes, the total time is 6.4 sec.
> I think this is happening due to the pickling and unpickling of the
> objects. If I am right, what could be the solution?
> my system configuration:
> dual-core processor
> winXP
> python2.6.1

System specs in this case are irrelevant.

What you are experiencing is most likely an I/O-bound
problem - using multiprocessing is unlikely to make it
any faster, because the I/O constraint, not the CPU,
limits the throughput.

