reloading code and multiprocessing
andrea.crotti.0 at gmail.com
Mon Jul 23 15:51:05 CEST 2012
2012/7/20 Chris Angelico <rosuav at gmail.com>:
> On Thu, Jul 19, 2012 at 8:15 PM, andrea crotti
> <andrea.crotti.0 at gmail.com> wrote:
>> We need to be able to reload code on a live system. This live system
>> has a daemon process always running but it runs many subprocesses with
>> multiprocessing, and the subprocesses might have a short life...
>> As long as I import the code in the function and make sure to remove the
>> "pyc" files, everything seems to work...
>> Are there any possible problems which I'm not seeing in this approach,
>> or is it safe?
> Python never promises reloading reliability, but from my understanding
> of what you've done here, it's probably safe. However, you may find
> that you're using the wrong language for the job; it depends on how
> expensive it is to spin off all those processes and ship their work to
> them. But if that's not an issue, I'd say you have something safe
> there. (Caveat: I've given this only a fairly cursory examination, and
> I'm not an expert. Others may have more to say. I just didn't want the
> resident Markov chainer to be the only one to respond!!)
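The import-inside-the-function approach from the quoted question could look roughly like the sketch below. The `plugin` module and its `handle` function are hypothetical stand-ins for the reloadable code; the module is written to a temporary directory only to keep the sketch self-contained:

```python
import os
import sys
import tempfile
import multiprocessing


def worker(task):
    # Importing here, inside the function, means a subprocess that does
    # not already have the module in its inherited sys.modules will
    # load it fresh from disk when it first runs a task.
    import plugin
    return plugin.handle(task)


if __name__ == "__main__":
    # Hypothetical reloadable module, created only so the example runs.
    tmp = tempfile.mkdtemp()
    with open(os.path.join(tmp, "plugin.py"), "w") as f:
        f.write("def handle(task):\n    return task * 2\n")
    sys.path.insert(0, tmp)

    with multiprocessing.Pool(2) as pool:
        print(pool.map(worker, [1, 2, 3]))  # [2, 4, 6]
```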
Thanks Chris, it's always nice to get a "human" answer ;)
Anyway, the only other problem I found is that if I start the
subprocesses after many other things have been initialised, the
reloading might not work correctly. Is that right?
Because sys.modules will be inherited by the subprocesses, and as far
as I understand it will not reimport what has already been imported.
So either I make sure I import everything only where it is needed, or
(maybe better and more explicit) I manually remove from sys.modules
all the modules that I want to reload. What do you think?
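The second option could be sketched like this, assuming the reloadable code lives in a hypothetical `plugin` module (written to a temporary directory only to make the example self-contained): pop it from sys.modules before starting the subprocess, so the import inside the child re-executes the new code from disk instead of reusing the inherited, cached copy.

```python
import importlib
import multiprocessing
import os
import sys
import tempfile


def write_plugin(path, version):
    # Hypothetical reloadable module, written to disk so the sketch
    # is self-contained and we can simulate a code change.
    with open(path, "w") as f:
        f.write("def version():\n    return %r\n" % version)


def child(q):
    import plugin  # re-imported from disk: the parent forgot it
    q.put(plugin.version())


if __name__ == "__main__":
    sys.dont_write_bytecode = True  # avoid any stale .pyc files
    tmp = tempfile.mkdtemp()
    path = os.path.join(tmp, "plugin.py")
    write_plugin(path, "v1")
    sys.path.insert(0, tmp)

    import plugin
    assert plugin.version() == "v1"

    # New code lands on disk while the daemon keeps running...
    write_plugin(path, "v2")
    importlib.invalidate_caches()

    # ...so explicitly forget the module before starting the
    # subprocess; otherwise the child inherits the cached "v1" copy.
    sys.modules.pop("plugin", None)

    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=child, args=(q,))
    p.start()
    p.join()
    print(q.get())  # v2
```

The `sys.dont_write_bytecode` and `importlib.invalidate_caches()` calls are there because a stale .pyc or a cached directory listing can otherwise hand back the old code even after the module has been dropped from sys.modules.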