[BangPypers] Ideas for Python concurrency...

vikas ruhil vikasruhil06 at gmail.com
Tue Feb 8 10:01:33 CET 2011

Results are not in order even if our task queue was in order. This is
because the program runs in parallel.
–Queue.get() returns an item to the worker and removes it from the queue.
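The point above can be sketched with multiprocessing (a minimal illustration; the names do_work and run are mine, not from the thread): each Queue.get() hands one task to whichever worker asks first and removes it from the queue, so results come back in completion order, not submission order.

```python
from multiprocessing import Process, Queue

def do_work(work_queue, result_queue):
    while True:
        task = work_queue.get()        # removes the item from the queue
        if task is None:               # sentinel: stop this worker
            break
        result_queue.put(task * task)

def run(tasks, n_workers=4):
    work_queue, result_queue = Queue(), Queue()
    for t in tasks:
        work_queue.put(t)              # tasks go in, in order
    for _ in range(n_workers):
        work_queue.put(None)           # one stop sentinel per worker
    workers = [Process(target=do_work, args=(work_queue, result_queue))
               for _ in range(n_workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    # results arrive in completion order, not task order
    return [result_queue.get() for _ in tasks]

if __name__ == "__main__":
    print(sorted(run(range(10))))      # sort afterwards if order matters
```

If the original ordering matters, tag each task with its index and reorder the results at the end.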

Also keep the following in mind while sharing memory:

–Python provides two ways to store data in a shared memory map:
•Value:
–The return value is a synchronized wrapper for the object.
•Array:
–The return value is a synchronized wrapper for the array.
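A short sketch of both wrappers (the update function and values are illustrative): Value('i', 0) holds a single shared integer, Array('d', ...) a shared array of doubles, and each synchronized wrapper carries a lock you can take via get_lock().

```python
from multiprocessing import Process, Value, Array

def update(counter, arr):
    with counter.get_lock():            # the wrapper is synchronized
        counter.value += 1
    for i in range(len(arr)):
        arr[i] = -arr[i]                # change is visible to the parent

def demo():
    counter = Value('i', 0)             # 'i': signed int in shared memory
    arr = Array('d', [1.0, 2.0, 3.0])   # 'd': doubles in shared memory
    p = Process(target=update, args=(counter, arr))
    p.start()
    p.join()
    return counter.value, list(arr)

if __name__ == "__main__":
    print(demo())                       # (1, [-1.0, -2.0, -3.0])
```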

Distributed concurrency

–Register the name of the object that the server is sharing.
–Connect to the server's address.
–Call the shared object by its registered name.
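The three steps above map onto multiprocessing.managers.BaseManager (a sketch under my own naming: get_queue, ServerManager, ClientManager, and the authkey are illustrative). The server registers a callable under a name, the client connects to the server's address, and calling the registered name returns a proxy to the shared object. The lambda here assumes the default fork start method on Unix; under spawn the registered callable must be picklable.

```python
from multiprocessing.managers import BaseManager
import queue

shared = queue.Queue()                 # lives in the server process

class ServerManager(BaseManager):
    pass

class ClientManager(BaseManager):
    pass

# 1. Register the name of the object the server will share.
ServerManager.register('get_queue', callable=lambda: shared)
ClientManager.register('get_queue')    # the client only knows the name

def demo(authkey=b'secret'):
    server = ServerManager(address=('127.0.0.1', 0), authkey=authkey)
    server.start()                     # serve from a background process
    # 2. Connect to the server's address.
    client = ClientManager(address=server.address, authkey=authkey)
    client.connect()
    # 3. Call the shared object by its registered name.
    q = client.get_queue()             # proxy to the server's queue
    q.put('hello')
    item = q.get()
    server.shutdown()
    return item

if __name__ == "__main__":
    print(demo())                      # hello
```

In a real deployment the server would run on another machine and call serve_forever() instead of start().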

I think your problem is solved now. If you are still facing a problem, tell
us some more about it.

regards vikas ruhil

On Tue, Feb 8, 2011 at 2:27 PM, vikas ruhil <vikasruhil06 at gmail.com> wrote:

> Use this pattern for threading:
> •Thread(target=do_work, args=(work_queue,))
> –Multiprocessing:
> •Process(target=do_work, args=(work_queue,))
> On Tue, Feb 8, 2011 at 1:54 PM, Vishal <vsapre80 at gmail.com> wrote:
>> Hello,
>> This might sound crazy... and I don't know if it's even possible, but...
>> Is it possible that the Python process creates copies of the interpreter
>> for each thread that is launched, and somehow each thread is bound to its
>> own interpreter?
>> This will increase the Python process size, for sure; however, data
>> sharing will remain just like it is in threads.
>> And it "may" also allow the two threads to run in parallel, assuming
>> that today's processors can send independent instructions from the same
>> process to multiple cores?
>> Comments, suggestions, brush-offs are welcome :))
>> Thanks and best regards,
>> Vishal Sapre
>> _______________________________________________
>> BangPypers mailing list
>> BangPypers at python.org
>> http://mail.python.org/mailman/listinfo/bangpypers
