[Python-ideas] solving multi-core Python

Chris Angelico rosuav at gmail.com
Sun Jun 21 16:12:24 CEST 2015

On Sun, Jun 21, 2015 at 9:41 PM, Sturla Molden <sturla.molden at gmail.com> wrote:
> However, from the perspective of multi-core parallel computing, I am not
> sure what this offers over using multiple processes.
> Yes, you avoid the process startup time, but on POSIX systems a fork is very
> fast. And certainly, forking is much more efficient than serializing Python
> objects. It then boils down to a workaround for the fact that Windows cannot
> fork, which makes it particularly bad for running CPython. You also have to
> start up a subinterpreter and a thread, which is not instantaneous. So I am
> not sure there is a lot to gain here over calling os.fork.
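(For reference, the fork-per-job pattern under discussion looks roughly like this. This is a minimal POSIX-only sketch; `run_in_fork` is a made-up helper name, not anything from the stdlib or this thread.)

```python
import os
import pickle

def run_in_fork(fn, *args):
    # Fork a child to run fn(*args); the child inherits the parent's
    # objects for free (copy-on-write), pickles only the *result*, and
    # writes it back through a pipe before exiting.
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:  # child
        os.close(r)
        os.write(w, pickle.dumps(fn(*args)))
        os.close(w)
        os._exit(0)
    # parent
    os.close(w)
    chunks = []
    while True:
        data = os.read(r, 65536)
        if not data:  # EOF once the child closes its end
            break
        chunks.append(data)
    os.close(r)
    os.waitpid(pid, 0)
    return pickle.loads(b"".join(chunks))

print(run_in_fork(sum, range(10)))  # → 45
```

Note that even here the result coming *back* has to be serialized; that return path is exactly the IPC cost the rest of the thread is weighing.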

That's all very well for sending stuff *to* a subprocess. If you fork
for a single job, do the job, and have the subprocess send the result
directly back to the origin (e.g. over its socket) and then terminate,
then sure, you don't need a lot of IPC. But for models where there's
ongoing work, maybe interacting with other subinterpreters
periodically, there could be a lot of benefit. It's very easy to slip
into a CGI style of mentality where requests are entirely fungible and
independent, and all you're doing is parallelization, but not
everything fits into that model :) I run a MUD server, for instance,
where currently every connection gets its own thread; if I wanted to
make use of multiple CPU cores, I would not want the connections
handled by separate processes, because they are constantly interacting
with each other, and IPC would get expensive.
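(A toy sketch of that shared-memory, thread-per-connection model: because every connection lives in the same process, one connection broadcasting to the others is a plain in-memory write under a lock, with no serialization and no IPC. All names here are invented for illustration, not from any real MUD codebase.)

```python
import threading

class Connection:
    def __init__(self, name):
        self.name = name
        self.inbox = []  # messages delivered to this connection

connections = []
lock = threading.Lock()

def broadcast(sender, message):
    # Deliver directly into the other connections' in-memory inboxes;
    # with separate processes this would instead be a pickle + pipe
    # round-trip per recipient.
    with lock:
        for conn in connections:
            if conn is not sender:
                conn.inbox.append((sender.name, message))

alice, bob = Connection("alice"), Connection("bob")
connections.extend([alice, bob])

# Each connection would normally get its own long-lived thread; here a
# single thread stands in for alice's connection handler.
t = threading.Thread(target=broadcast, args=(alice, "hello"))
t.start()
t.join()
print(bob.inbox)  # → [('alice', 'hello')]
```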