Sharing objects between processes

ET phu at 2drpg.org
Mon Mar 9 22:21:47 CET 2009


On Mon, 2009-03-09 at 13:58 -0700, Aaron Brady wrote:
> On Mar 9, 2:17 pm, ET <p... at 2drpg.org> wrote:
> > On Mon, 2009-03-09 at 11:04 -0700, Aaron Brady wrote:
> > > On Mar 9, 12:47 pm, ET <p... at 2drpg.org> wrote:
> > > > > Message: 2
> > > > > Date: Sun, 8 Mar 2009 12:00:40 -0700 (PDT)
> > > > > From: Aaron Brady <castiro... at gmail.com>
> > > > > Subject: Re: Sharing objects between processes
> > > > > To: python-l... at python.org
> > > > > Message-ID:
> > > > >    <5514c3df-d74e-47d8-93fc-34dd5119e... at c11g2000yqj.googlegroups.com>
> > > > > Content-Type: text/plain; charset=ISO-8859-1
> >
> > > > > On Mar 8, 1:36 pm, ET <p... at 2drpg.org> wrote:
> > > > > > I have been using the 'threading' library and decided to try swapping it
> > > > > > out for 'processing'... while it's awesome that processing so closely
> > > > > > mirrors the threading interface, I've been having trouble getting my
> > > > > > processes to share an object in a similar way.
> >
> > > > > > Using the 'with' keyword didn't work, and using normal locks doesn't
> > > > > > result in the expected behavior (I can get an object to be accessible in
> > > > > > more than one process, and Python indicates that the instances are
> > > > > > living at the same address in memory, but changes in one process are not
> > > > > > reflected in the other[s]).  I'm sure this is because my expectations
> > > > > > are incorrect. :)
> >
> > > > > > The problem, as briefly as possible:
> > > > > > I have three processes which need to safely read and update two objects.
> >
> > > > > > I've been working with processing, multiprocessing, and parallel python,
> > > > > > trying to get this working... I suspect it can be accomplished with
> > > > > > managers and/or queues, but if there's an elegant way to handle it, I
> > > > > > have thus far failed to understand it.
> >
> > > > > > I don't particularly care which library I use; if someone has done this
> > > > > > or can recommend a good method they're aware of, it'd be incredibly
> > > > > > helpful.
> >
> > > > > > Thank you!
> >
> > > > > There is POSH: Python Object Sharing, which I learned about a while
> > > > > ago, but never used much.
> >
> > > > > http://poshmodule.sourceforge.net/
> >
> > > > > It's UNIX only.
> >
> > > > Thanks, I'll definitely keep that link handy... unfortunately, this
> > > > particular project needs to run on Windows as well as Linux-based
> > > > systems.
> >
> > > I don't recall whether there was anything in the source that's a deal-
> > > breaker on Windows.  The source is open, as you could see.
> >
> > > Other possibilities are 'shelve' and any database... fixed-length
> > > pickles, a directory of pickles, etc.  Maybe 'multiprocessing' would
> > > work for your synchronization, while you use a more custom technique
> > > for data exchange.
> >
> > > The only other thing I can do is bring to your attention an idea of
> > > mine for sharing primitives.  It's in the drawing board stage if you
> > > want to help.
> >
> > > Of course, it's only after you turned down 'multiprocessing' and
> > > 'POSH'.  It does things they don't and vice versa.
> > > --
> > > http://mail.python.org/mailman/listinfo/python-list
> >
> > I assumed it wouldn't work in Windows as you mentioned it was UNIX-only;
> > the readme also states that it's for POSIX systems only.
> >
> > I'd be more than happy to use multiprocessing; I've attempted to do so.
> > My question is largely how to implement it, as I have not managed to get
> > it working despite several attempts from different angles.
> >
> > Unfortunately, I do need to handle more than primitives, otherwise I'd
> > have attempted to use the shared ctypes present in at least one of
> > processing/multiprocessing/parallel python.
> 
> Here's what we have to work with from you:
> 
> > > > > > The problem, as briefly as possible:
> > > > > > I have three processes which need to safely read and update two objects.
> 
> Can they be subprocesses of each other?  That is, can one master spawn
> the others as you desire?  Can you have one process running, and
> connect to it with sockets, pipes, (mailslots,) etc., and just give
> and get the information to it?  Then, synch. is a lot easier.
> 
> Do you need MROW (multiple-reader one-writer) synchronization, or can
> they all go one at a time?  Is deadlock a concern?  Can you use OBL(OE),
> one big lock over everything, or do you need individual locks on
> elements of the data structure?
> 
> Can you use fire-and-forget access, or do you need return values from
> your calls?  Do you need to wait for completion of anything?
> 
> 'xmlrpc': remote procedure calls might pertain.

My intention is to have one object which is updated by two processes and
read by a third: one updater takes user input, and the other talks to a
remote server.  Which are children of which really doesn't matter to me;
if there's a way to arrange that to make this work more efficiently,
excellent.

Yes, I can use a single lock over the entire object; finer-grained
control isn't necessary.  I also don't need non-blocking reads; in fact,
reading during a write is exactly what I want to avoid.

I was hoping to end up with something simple and readable (hence trying
'with' and then regular locks first), but if pipes are the best way to
do this, I'll have to start figuring those out.  If I have to resort to
XML-RPC I'll probably ditch the process idea and stick with threads,
though I'd rather avoid that.

Is there, perhaps, a sensible way to apply queues and/or managers to
this?  Namespaces also seemed promising, but having no experience with
any of these, the docs did not get me far.



