[Python-Dev] Addition of "pyprocessing" module to standard lib.
Tom Pinckney
thomaspinckney3 at gmail.com
Wed May 14 19:35:11 CEST 2008
On May 14, 2008, at 12:32 PM, Andrew McNabb wrote:
> Think of the processing module as an alternative to the threading
> module, not as an alternative to MPI. In Python, multi-threading can
> be extremely slow. The processing module gives you a way to convert
> from using multiple threads to using multiple processes.
>
> If it made people feel better, maybe it should be called threading2
> instead of multiprocessing. The word "processing" seems to make people
> think of parallel processing and clusters, which is missing the point.
>
> Anyway, I would love to see the processing module included in the
> standard library.
>
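A tiny sketch of that thread-to-process conversion, written against the multiprocessing module that pyprocessing eventually became (the spellings here are the stdlib's, not necessarily the 2008 package's): the same worker and the same driver run once with threads and once with processes, only the factory objects change.

```python
import queue
import threading
import multiprocessing

def worker(n, out):
    # Square the input and report the result on the shared queue.
    out.put(n * n)

def run(queue_factory, task_factory):
    # Identical driver code for both backends: only the factories differ.
    q = queue_factory()
    tasks = [task_factory(target=worker, args=(i, q)) for i in range(4)]
    for t in tasks:
        t.start()
    results = sorted(q.get() for _ in range(4))  # drain before join
    for t in tasks:
        t.join()
    return results

if __name__ == "__main__":
    print(run(queue.Queue, threading.Thread))                    # [0, 1, 4, 9]
    print(run(multiprocessing.Queue, multiprocessing.Process))   # [0, 1, 4, 9]
```

The point of the parallel signatures is exactly the one made above: if the APIs match, switching a compute-bound program from threads to processes is a matter of swapping which module you import.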
Is the goal of the pyprocessing module to be exactly drop-in
compatible with threading, Queue, and friends? I guess the idea would
be that if my problem is compute-bound I'd use pyprocessing, and if it
were I/O-bound I might just use the existing threading library?
Can I write a program using only the threading and Queue interfaces for
inter-thread communication and just change my import statements and
have my program work? Currently, it looks like the pyprocessing.Queue
interface is slightly different from Queue.Queue; for example, it has
no task_done(), etc.
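The gap is easy to demonstrate in the multiprocessing module as it eventually shipped: the plain Queue lacks task_done()/join(), and a separate JoinableQueue class supplies them.

```python
import multiprocessing

# The plain process-safe queue has no task_done(), unlike Queue.Queue.
q = multiprocessing.Queue()
print(hasattr(q, "task_done"))    # False

# JoinableQueue adds the task_done()/join() bookkeeping pair.
jq = multiprocessing.JoinableQueue()
jq.put("item")
item = jq.get()
jq.task_done()   # mark the fetched item as processed
jq.join()        # returns immediately once every put() has a task_done()
print(hasattr(jq, "task_done"))   # True
```

So a program that relies on Queue.Queue.task_done() cannot, as things stand, be ported by changing imports alone; it would also have to switch queue classes.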
Perhaps a stdlib version of pyprocessing could be pared down so that it
doesn't try to be a cross-machine computing environment and is instead
just a same-machine threading replacement? That would make maintenance
easier and reduce confusion among users about how they should do
cross-machine multiprocessing.
By the way, a thread-safe simple dict in the style of Queue would be
extremely helpful in writing multi-threaded programs, whether using
the threading or pyprocessing modules.
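No such class exists in the stdlib; a minimal sketch of what is being asked for might look like the following hypothetical SafeDict, which takes the Queue.Queue approach of guarding every operation with one internal lock.

```python
import threading

class SafeDict:
    """Hypothetical thread-safe dict in the style of Queue.Queue:
    every public operation runs under a single internal lock."""

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def __setitem__(self, key, value):
        with self._lock:
            self._data[key] = value

    def __getitem__(self, key):
        with self._lock:
            return self._data[key]

    def pop(self, key, default=None):
        with self._lock:
            return self._data.pop(key, default)

    def setdefault(self, key, default):
        # The atomic check-then-insert is the operation that most
        # needs the lock; two threads racing here still see one winner.
        with self._lock:
            return self._data.setdefault(key, default)
```

The same interface could, in principle, be backed by a proxy in a pyprocessing-style manager so that the thread and process versions stay interchangeable, mirroring the Queue pairing discussed above.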