perf 0.2 released, perf fork of CPython benchmark suite
Hi,
I completed the API of my small perf module and released version 0.2: https://perf.readthedocs.io/
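The post doesn't show the perf 0.2 API itself, but the basic workflow it describes (collect timing samples, compute the average, display the result) can be sketched with the standard library alone. Everything below is a hypothetical illustration, not the perf module's actual API:

```python
import statistics
import timeit

# Hypothetical sketch (NOT the perf 0.2 API): collect several timing
# samples for a statement and summarize them, the kind of basic
# tooling the perf module provides.
def collect_samples(stmt, repeat=5, number=10_000):
    timer = timeit.Timer(stmt)
    # timer.repeat() returns the total time of each run; divide by
    # `number` to get one per-loop sample per run.
    return [t / number for t in timer.repeat(repeat=repeat, number=number)]

samples = collect_samples("sum(range(100))")
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)
print("average: %.1f ns +- %.1f ns" % (mean * 1e9, stdev * 1e9))
```

Running several samples instead of a single measurement is what makes an average and a standard deviation meaningful in the first place.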
It is supposed to provide the basic tools to collect samples, compute the average, display the result, etc. I started to work on JSON serialization to "easily" run multiple processes. The idea is also to split the code that produces numbers from the code that displays results. I expect that we can do better at displaying results: see for example speed.python.org and speed.pypy.org, which are nicer than perf.py's text output ;-)
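The split between producing numbers and displaying them, with JSON as the interchange format between worker processes and a display tool, could look roughly like this. The JSON layout here is invented for illustration; the post doesn't specify perf's actual format:

```python
import json
import statistics

# Hypothetical sketch of the producer/display split (the real perf
# JSON format is not described in the post).

def dump_result(name, samples):
    """Producer side: serialize one process's samples to JSON."""
    return json.dumps({"name": name, "samples": samples})

def display_result(name, samples):
    """Display side: format merged samples for humans."""
    return "%s: average %.3f s" % (name, statistics.mean(samples))

# One JSON document per worker process; the parent merges them.
worker_outputs = [
    dump_result("bench_x", [0.101, 0.103, 0.102]),
    dump_result("bench_x", [0.099, 0.100, 0.101]),
]
merged = []
for payload in worker_outputs:
    merged.extend(json.loads(payload)["samples"])
print(display_result("bench_x", merged))
```

Because the results are plain data, a fancier frontend (in the spirit of speed.python.org) could consume the same JSON without touching the benchmarking code.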
I also started to hack CPython benchmark suite (benchmarks repository) to use my perf module: https://hg.python.org/sandbox/benchmarks_perf
I should now stop the NIH and see how to merge my work with the PyPy fork of benchmarks ;-)
FYI I started to write the perf module because I was writing an article about the impact of CPU speed on Python microbenchmarks, and I wanted a smart timeit that runs multiple processes. Since it was fun to work on such a project, I started to hack benchmarks, but maybe I went too far and should look at PyPy's benchmarks instead ;-)
Victor