Long running process - how to speed up?
Dennis Lee Bieber
wlfraed at ix.netcom.com
Sun Feb 20 22:34:13 EST 2022
On Sun, 20 Feb 2022 18:05:33 +0000, Shaozhong SHI <shishaozhong at gmail.com>
declaimed the following:
>I am trying this approach,
>
>import multiprocessing as mp
>
>def my_func(x):
>    print(x**x)
>
Not much of a processing load there, especially for your small set of
integers. I suspect most of the time is going into having the OS create and
tear down each process.
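A rough way to see that overhead (just a sketch, assuming a Unix-like
system where fork is cheap; the figure on Windows will be larger still):

```python
import time
import multiprocessing as mp

def my_func(x):
    print(x ** x)

def time_process_cycle():
    # time one complete create/run/tear-down cycle for a single task
    start = time.perf_counter()
    p = mp.Process(target=my_func, args=(4,))
    p.start()
    p.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    # the exponentiation itself takes microseconds; the process
    # life cycle typically dominates by orders of magnitude
    print(f"one process cycle: {time_process_cycle():.4f} s")
```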
In truth, for that example, I suspect plain threads will run faster
because you don't have the overhead of setting up a process with new
stdin/stdout/stderr. The exponential operation probably runs within one
Python threading quantum, and the print() will trigger a thread switch to
allow the next one to compute.
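As a sketch of that thread-based alternative (using concurrent.futures,
which the original post did not; my_func here returns its value instead of
printing, so the results can be collected in order):

```python
from concurrent.futures import ThreadPoolExecutor

def my_func(x):
    # trivial CPU-bound work -- finishes well within one quantum
    return x ** x

# threads share one process: no new stdin/stdout/stderr to set up
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(my_func, [4, 2, 3]))

print(results)  # [256, 4, 27]
```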
>def main():
>    pool = mp.Pool(mp.cpu_count())
>    result = pool.map(my_func, [4,2,3])
-=-=-
>>> import multiprocessing as mp
>>> mp.cpu_count()
8
>>>
-=-=-
Really under-loaded on my Win10 system (hyperthreaded processors count
as 2 CPUs, so a quad-core HT reports as 8 CPUs). Even an older
Raspberry-Pi 3B (quad core) reports
-=-=-
md_admin at microdiversity:~$ python3
Python 3.7.3 (default, Jan 22 2021, 20:04:44)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import multiprocessing as mp
>>> mp.cpu_count()
4
>>> exit()
md_admin at microdiversity:~$
-=-=-
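For a process pool to pay off, each task needs enough work to amortize
that start-up cost, and the pool must be created under the __main__ guard
(on Windows, worker processes re-import the module). A sketch along those
lines; the heavier workload function is made up for illustration:

```python
import multiprocessing as mp

def my_func(n):
    # enough work per task to outweigh process creation cost
    return sum(i * i for i in range(n))

def main():
    # mp.cpu_count() counts hyperthreads, so a quad-core HT box
    # reports 8; a pool that size may oversubscribe physical cores
    with mp.Pool(mp.cpu_count()) as pool:
        return pool.map(my_func, [100_000, 200_000, 300_000])

if __name__ == "__main__":  # guard keeps workers from re-running this
    print(main())
```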
--
Wulfraed Dennis Lee Bieber AF6VN
wlfraed at ix.netcom.com http://wlfraed.microdiversity.freeddns.org/