<br>Share as little as possible between your various processes - shared, mutable state is a parallelism tragedy.<br><br>If you can avoid sharing an entire dictionary, do so. It'd probably be better to dedicate one process to updating your dictionary, and then use a multiprocessing.Queue to pass delta records from your workers to your dictionary-management process.<br>
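<br>The delta-record idea can be sketched roughly like this (a minimal illustration in Python 3 syntax rather than the 2.6 of the original post; all key/value names are made up): workers put (key, value) deltas on a Queue, and a single manager process owns the dict and applies every update.<br>

```python
# Sketch of the delta-record pattern: workers never touch the shared dict.
# They put (key, value) delta records on a Queue; one manager process owns
# the dict and is the only process that ever mutates it.
import multiprocessing

def worker(q, n):
    # Produce a few delta records instead of mutating shared state.
    for i in range(3):
        q.put(((n, i), n * 10 + i))
    q.put(None)  # sentinel: this worker is finished

def dict_manager(q, result_q, num_workers):
    # The only process that ever touches the dictionary.
    d = {}
    finished = 0
    while finished < num_workers:
        item = q.get()
        if item is None:
            finished += 1
        else:
            key, value = item
            d[key] = value
    result_q.put(d)

def run(num_workers=2):
    ctx = multiprocessing.get_context("fork")  # Unix-only, for simplicity
    q, result_q = ctx.Queue(), ctx.Queue()
    mgr = ctx.Process(target=dict_manager, args=(q, result_q, num_workers))
    mgr.start()
    workers = [ctx.Process(target=worker, args=(q, n)) for n in range(num_workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    d = result_q.get()  # drain the result before joining the manager
    mgr.join()
    return d

if __name__ == "__main__":
    print(run())
```

Because only one process ever mutates the dict, no locking is needed at all.<br>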
<br>Also, I'm inclined to doubt it's going to work well to have multiple processes doing I/O on the same socket - you'd probably be best off with a single process that does all the I/O on the socket, and then, again, one or more multiprocessing.Queues that pass I/O requests/results around.<br>
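<br>That single-I/O-owner pattern might look like the following (again a Python 3 sketch with illustrative names only; a socketpair stands in for a real network connection): exactly one process reads and writes the socket, and everything else talks to it through request/response Queues.<br>

```python
# Sketch of the single-I/O-owner pattern: one process performs all socket
# I/O; other processes submit requests and collect results via Queues.
import multiprocessing
import socket

def echo_peer(sock):
    # Stands in for the remote end of the connection: upper-cases echoes.
    while True:
        data = sock.recv(1024)
        if not data or data == b"QUIT":
            break
        sock.sendall(data.upper())
    sock.close()

def io_owner(sock, request_q, response_q):
    # The only process that performs I/O on this socket.
    while True:
        item = request_q.get()
        if item is None:
            sock.sendall(b"QUIT")  # tell the peer we're done
            break
        worker_id, payload = item
        sock.sendall(payload)
        response_q.put((worker_id, sock.recv(1024)))
    sock.close()

def run():
    ctx = multiprocessing.get_context("fork")  # Unix-only, for simplicity
    a, b = socket.socketpair()
    request_q, response_q = ctx.Queue(), ctx.Queue()
    peer = ctx.Process(target=echo_peer, args=(b,))
    owner = ctx.Process(target=io_owner, args=(a, request_q, response_q))
    peer.start()
    owner.start()
    # "Workers" are simulated here by the parent submitting requests.
    request_q.put((1, b"ping"))
    request_q.put((2, b"pong"))
    replies = dict(response_q.get() for _ in range(2))
    request_q.put(None)  # shut the I/O owner down
    owner.join()
    peer.join()
    return replies

if __name__ == "__main__":
    print(run())
```

The requests carry a worker id, so replies can be routed back to whichever process asked for them.<br>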
<br><div class="gmail_quote">On Thu, May 19, 2011 at 6:10 AM, Pietro Abate <span dir="ltr"><<a href="mailto:Pietro.Abate@pps.jussieu.fr">Pietro.Abate@pps.jussieu.fr</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Hi all,<br>
<br>
I'm struggling a bit to understand a KeyError raised by the multiprocessing library.<br>
<br>
My idea is pretty simple. I want to create a server that will spawn a number of<br>
workers that will share the same socket and handle requests independently. The<br>
goal is to build a 3-tier structure where all requests are handled via an http<br>
server and then dispatched to nodes sitting in a cluster and from nodes to<br>
workers via the multiprocessing managers...<br>
<br>
There is one public server, one node per machine and x number of workers on<br>
each machine depending on the number of cores... I know I can use a more<br>
sophisticated library, but for such a simple task (I'm just prototyping here) I<br>
would just use the multiprocessing library... Is this possible, or should I<br>
explore other solutions directly? I feel I'm very close to having something<br>
working here...<br>
<br>
The problem with the code below is that if I run the server as<br>
`python server.py 1` , that is, using only one process, it works as expected.<br>
<br>
However if I spawn two processes (`python server.py 2`) listening for<br>
connections, I get a nasty error :<br>
<br>
$python client.py ping<br>
Traceback (most recent call last):<br>
  File "client.py", line 24, in <module><br>
    sys.exit(main(sys.argv))<br>
  File "client.py", line 21, in main<br>
    print m.solver(args[1])._getvalue()<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 637, in temp<br>
    authkey=self._authkey, exposed=exp<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 894, in AutoProxy<br>
    incref=incref)<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 700, in __init__<br>
    self._incref()<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 750, in _incref<br>
    dispatch(conn, None, 'incref', (self._id,))<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 79, in dispatch<br>
    raise convert_to_error(kind, result)<br>
multiprocessing.managers.RemoteError:<br>
---------------------------------------------------------------------------<br>
Traceback (most recent call last):<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 181, in handle_request<br>
    result = func(c, *args, **kwds)<br>
  File "/usr/lib/python2.6/multiprocessing/managers.py", line 402, in incref<br>
    self.id_to_refcount[ident] += 1<br>
KeyError: '7fb51084c518'<br>
---------------------------------------------------------------------------<br>
<br>
My understanding is that all processes share the same socket (from the<br>
Manager). When a client wants to connect, a new connection is created and<br>
served independently by that process. If you look at the server trace (using<br>
logging), it actually receives the connection, handles it, but fails to<br>
communicate back to the client.<br>
<br>
Can anybody shed some light on this for me and maybe propose a solution?<br>
<br>
thanks<br>
pietro<br>
<br>
----------------------------------------<br>
<br>
Server :<br>
<br>
import sys<br>
from multiprocessing.managers import BaseManager, BaseProxy, Process<br>
<br>
def baz(aa):<br>
    l = []<br>
    for i in range(3):<br>
        l.append(aa)<br>
    return l<br>
<br>
class SolverManager(BaseManager): pass<br>
<br>
class MyProxy(BaseProxy): pass<br>
<br>
manager = SolverManager(address=('127.0.0.1', 50000), authkey='mpm')<br>
manager.register('solver', callable=baz, proxytype=MyProxy)<br>
<br>
def serve_forever(server):<br>
    try:<br>
        server.serve_forever()<br>
    except KeyboardInterrupt:<br>
        pass<br>
<br>
def runpool(n):<br>
    server = manager.get_server()<br>
    workers = []<br>
<br>
    for i in range(int(n)):<br>
        Process(target=serve_forever, args=(server,)).start()<br>
<br>
if __name__ == '__main__':<br>
    runpool(sys.argv[1])<br>
<br>
<br>
Client :<br>
<br>
import sys<br>
from multiprocessing.managers import BaseManager, BaseProxy<br>
<br>
import multiprocessing, logging<br>
<br>
class SolverManager(BaseManager): pass<br>
<br>
class MyProxy(BaseProxy): pass<br>
<br>
def main(args):<br>
    SolverManager.register('solver')<br>
    m = SolverManager(address=('127.0.0.1', 50000), authkey='mpm')<br>
    m.connect()<br>
<br>
    print m.solver(args[1])._getvalue()<br>
<br>
if __name__ == '__main__':<br>
    sys.exit(main(sys.argv))<br>
<br>
<br>
I also tried on Stack Overflow and the Python list, but I haven't managed to come up<br>
with a working solution yet...<br>
<br>
--<br>
----<br>
<a href="http://en.wikipedia.org/wiki/Posting_style" target="_blank">http://en.wikipedia.org/wiki/Posting_style</a><br>
<font color="#888888">--<br>
<a href="http://mail.python.org/mailman/listinfo/python-list" target="_blank">http://mail.python.org/mailman/listinfo/python-list</a><br>
</font></blockquote></div><br>