On 11/06/2020 2:50 pm, Riccardo Ghetta wrote:
> On 11/06/2020 12:59, Mark Shannon wrote:
>> If the additional resource consumption is irrelevant, what's the
>> objection to spinning up a new process?
> The additional resource consumption of a new Python interpreter is
> irrelevant, but the process as a whole needs a lot of extra data,
> making a new process rather costly. Plus there are issues of
> licensing, synchronization and load balancing that are much easier
> to resolve (for our system, at least) with threads than with
> processes.

On 12/06/2020 10:45, Mark Shannon wrote:
> Starting a new process is cheap. On my machine, starting a new
> Python process takes under 1 ms and uses a few megabytes.

Sorry, I wasn't clear here. I was talking about starting one of our
server processes, /with Python embedded/. Since the Python routines
are called by our C++ code and need to call other C++ routines, the
interpreter cannot work alone and is surrounded by a lot of data
needed for the C++ part. A Python interpreter by itself would be like
a CPU chip for someone who needs a server: a critical component,
sure, but only a small part of the whole.

> Would this prevent CPython starting new processes, or is this just
> for processes managed by your application?

It is only for application processes, but because Python is always
embedded there is little practical difference.

I hope not to come across as arrogant or dismissive, but can we take
it for granted that multiprocessing is not a viable solution for our
application, or at least that it would be impractical and too
expensive to rebuild it from scratch to change paradigm? At the same
time, I realize that ours is a somewhat niche case, and it may not be
deemed interesting for Python's evolution. I just wanted to present a
real-world example of someone using Python today who would benefit
immensely if Python permitted multiple, separate interpreters in a
single process, or any other solution removing the bottlenecks that
currently limit multithreaded Python performance.

Ciao,
Riccardo
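Mark's start-up figure is easy to sanity-check. A rough sketch (the numbers are machine-dependent, and spawning `python -c pass` measures full interpreter start-up plus process creation, so it is an upper bound on the bare process cost Mark describes):

```python
import subprocess
import sys
import time

def time_python_startup(runs: int = 5) -> float:
    """Average wall-clock time to spawn a fresh Python process that
    immediately exits. This includes interpreter initialization, so it
    overstates the cost of bare process creation."""
    start = time.perf_counter()
    for _ in range(runs):
        subprocess.run([sys.executable, "-c", "pass"], check=True)
    return (time.perf_counter() - start) / runs

print(f"average start-up: {time_python_startup() * 1000:.1f} ms")
```

Of course, as Riccardo points out, this measures only the interpreter itself, not a full server process with an embedded interpreter and the surrounding C++ state.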
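The multithreading bottleneck Riccardo refers to can be seen in a few lines: on a stock CPython build, two CPU-bound threads in one process do not run in parallel, because the GIL lets only one thread execute bytecode at a time (timings below are illustrative and machine-dependent):

```python
import threading
import time

def burn(n: int) -> None:
    # Pure-Python CPU-bound loop; the running thread holds the GIL.
    while n:
        n -= 1

N = 2_000_000

t0 = time.perf_counter()
burn(N)
burn(N)
serial = time.perf_counter() - t0

t0 = time.perf_counter()
threads = [threading.Thread(target=burn, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
threaded = time.perf_counter() - t0

# The threaded run is typically no faster than the serial one,
# despite using two threads on a multi-core machine.
print(f"serial: {serial:.2f}s  two threads: {threaded:.2f}s")
```

Multiple fully separate interpreters in one process, each with its own lock, is one way out of this that still lets an embedding application share its C++ state across workers.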