[Python-Dev] Cloning threading.py using processes

Josiah Carlson jcarlson at uci.edu
Wed Oct 11 18:46:39 CEST 2006


"Richard Oudkerk" <r.m.oudkerk at googlemail.com> wrote:
> On 10/10/06, Josiah Carlson <jcarlson at uci.edu> wrote:
> > > the really interesting thing here is a ready-made threading-style API, I
> > > think.  reimplementing queues, locks, and semaphores can be a reasonable
> > > amount of work; might as well use an existing implementation.
> >
> > Really, it is a matter of asking what kind of API is desirable.  Do we
> > want 'threading plus other stuff' to be the style of API that we
> > replicate?  Do we want to have shared queue objects, or would an
> > XML-RPC-esque remote.queue_put('queue_X', value) and
> > remote.queue_get('queue_X', blocking=1) be better?
> 
> Whatever the API is, I think it is useful if you can swap between
> threads and processes just by changing the import line.  That way you
> can write applications without deciding upfront which to use.

It would be convenient, yes, but the question isn't always 'threads or
processes?'  In my experience (not to say that it is more or better than
anyone else's), when going multi-process, the cost of starting a process
on some platforms is high enough that you want to keep the process
around and reuse it (this runs counter to my earlier statement about
forking, but it's all relative).  And sometimes one *wants* multiple
threads running in a single process, handling multiple requests.
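
For concreteness, here is roughly what "swap between threads and
processes by changing the import line" could look like.  This is only a
sketch of the API shape, not a finished design; it borrows the names of
the modern stdlib's multiprocessing and queue modules purely for
illustration.  The point is that the worker code is written once against
a threading-style API and never needs to know which backend runs it.

    # Sketch: pick the backend with one import line; the worker code
    # below is identical either way.
    USE_PROCESSES = True

    if USE_PROCESSES:
        from multiprocessing import Process as Worker, Queue
    else:
        from threading import Thread as Worker
        from queue import Queue

    def producer(q):
        for i in range(5):
            q.put(i)
        q.put(None)                  # sentinel: tell the consumer to stop

    def consumer(q):
        while True:
            item = q.get()
            if item is None:
                break
            print("got", item)

    if __name__ == "__main__":
        q = Queue()
        workers = [Worker(target=producer, args=(q,)),
                   Worker(target=consumer, args=(q,))]
        for w in workers:
            w.start()
        for w in workers:
            w.join()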

There's a recipe in the Python Cookbook that adds a threading mixin to
the standard library's XML-RPC server.  For a set of cooperating
processes (perhaps on different machines) that call amongst each other,
I've not seen a significantly better approach, especially when a remote
procedure call can take a long time to complete.  It does take a few
tricks to make sure that enough connections are available from process
A to process B when A calls B from multiple threads, but it's not bad.
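
The general shape of such a threading mixin is something like the sketch
below (the actual recipe may differ; this uses Python 3 module names,
where the 2.x stdlib called them SocketServer and SimpleXMLRPCServer):

    import threading
    import time
    from socketserver import ThreadingMixIn
    from xmlrpc.server import SimpleXMLRPCServer

    class ThreadedXMLRPCServer(ThreadingMixIn, SimpleXMLRPCServer):
        """XML-RPC server that handles each request in its own thread,
        so one slow remote call doesn't block the others."""
        daemon_threads = True    # don't hang at exit waiting for workers

    def slow_call(seconds):
        # Stand-in for a remote procedure that takes a long time.
        time.sleep(seconds)
        return "done in %s" % threading.current_thread().name

    if __name__ == "__main__":
        server = ThreadedXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(slow_call)
        server.serve_forever()

A client (e.g. xmlrpc.client.ServerProxy) can then issue several calls
at once from different threads, and they are served concurrently instead
of queueing behind one long call.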


 - Josiah


