reloading code and multiprocessing
andrea.crotti.0 at gmail.com
Fri Jul 27 13:01:03 CEST 2012
2012/7/25 andrea crotti <andrea.crotti.0 at gmail.com>:
> I would also like to avoid this in general, but we have many
> subprocesses to launch and some of them might take weeks, so we need
> a process which is always running; there is never a point in time
> where we can just stop everything and start again.
> If there are better solutions I'm still glad to hear them, but I
> would also like to keep it simple.
> Another thing we now need to figure out is how to communicate with
> the live process. For example we might want to submit something
> manually, which should go through the main process.
> The first idea is to have a separate process that opens a socket and
> listens for commands on a local port, with a defined protocol.
> The main process can then parse these commands and run them.
> Are there easier ways?
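One easier route than a hand-rolled socket protocol might be the stdlib's multiprocessing.connection, which handles the listening socket, authentication, and pickling of Python objects. Below is a minimal sketch of the quoted idea; the authkey, the (name, args) message shape, and the 'submit' command are illustrative assumptions, not anything from the original post.

```python
import threading
from multiprocessing.connection import Client, Listener

AUTHKEY = b'secret'  # shared secret between the two sides (assumption)

def serve_one_command(listener):
    # Main-process side: accept one connection, parse the command, reply.
    with listener.accept() as conn:
        cmd, args = conn.recv()     # the "defined protocol": (name, args) tuples
        if cmd == 'submit':         # hypothetical manual-submission command
            conn.send(('ok', args))
        else:
            conn.send(('unknown', cmd))
    listener.close()

def submit(address, job):
    # Client side: connect to the live process, send a command, await the reply.
    with Client(address, authkey=AUTHKEY) as conn:
        conn.send(('submit', job))
        return conn.recv()

def demo():
    # Port 0 lets the OS pick a free local port; the real address is
    # available afterwards as listener.address.
    listener = Listener(('localhost', 0), authkey=AUTHKEY)
    t = threading.Thread(target=serve_one_command, args=(listener,))
    t.start()
    reply = submit(listener.address, 'my-job')
    t.join()
    return reply

if __name__ == '__main__':
    print(demo())   # -> ('ok', 'my-job')
```

In a real setup the server loop would of course accept connections repeatedly instead of serving a single command; the thread here just stands in for the always-running main process.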
So I tried doing this, removing the module from sys.modules and
starting a new process (after modifying the file), but it doesn't work
as I expected.
The last assertion fails, but how?
No .pyc file is generated, the module is genuinely not in
sys.modules, and yet the function in the subprocess doesn't fail:
it still returns the old value.
from os import path
from multiprocessing import Process, Queue

old_a = "def ret(): return 0"
new_a = "def ret(): return 1"

"""In this case the import is done before the processes are started,
so we need to clean sys.modules to make sure we reload everything."""
queue = Queue()
with open(path.join(CUR_DIR, 'old_a.py'), 'w') as f:
    f.write(old_a)
p1 = Process(target=func_no_import, args=(queue, ))
p1.start(); p1.join()
with open(path.join(CUR_DIR, 'old_a.py'), 'w') as f:
    f.write(new_a)
p2 = Process(target=func_no_import, args=(queue, ))
p2.start(); p2.join()
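One variant that does pick up the new source, in my experience, is to defer the import until inside the child process, after the file has been rewritten, and to disable bytecode writing so a stale .pyc (whose timestamp can match the rewritten source to the second) cannot be reused. This is a sketch under those assumptions, not a diagnosis of the failing test above; `child` and `demo` are names made up here:

```python
import importlib
import os
import sys
import tempfile
from multiprocessing import Process, Queue

OLD_A = "def ret(): return 0"
NEW_A = "def ret(): return 1"

def child(queue, mod_dir):
    # Everything happens in the child: by importing only after the source
    # was rewritten, we read the current file instead of reusing a module
    # object created before the rewrite.
    sys.dont_write_bytecode = True      # avoid a stale .pyc being reused
    sys.path.insert(0, mod_dir)
    sys.modules.pop('old_a', None)      # make sure the import is fresh
    mod = importlib.import_module('old_a')
    queue.put(mod.ret())

def demo():
    mod_dir = tempfile.mkdtemp()
    source = os.path.join(mod_dir, 'old_a.py')
    queue = Queue()
    results = []
    for code in (OLD_A, NEW_A):
        with open(source, 'w') as f:
            f.write(code)
        p = Process(target=child, args=(queue, mod_dir))
        p.start()
        p.join()
        results.append(queue.get())
    return results

if __name__ == '__main__':
    print(demo())   # -> [0, 1]
```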