[pypy-issue] Issue #2502: Recipes for a loop with a lot of small functions (splitting long traces?) (pypy/pypy)
issues-reply at bitbucket.org
Fri Mar 17 22:32:28 EDT 2017
New issue 2502: Recipes for a loop with a lot of small functions (splitting long traces?)
I'm dealing with a big loop that is expected to execute for 10M-1B iterations. Each iteration calls about 500 functions. Inside each function there is not much work going on, basically 5-10 lines of calculations or small "for i in xrange(2)" loops:

for i in xrange(1000000000):
    ...

I wonder if there is a way to write the code differently to avoid generating traces that are too long.
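One possible restructuring, sketched below with two hypothetical stand-in functions for the ~500 real ones: if the functions do not have to run interleaved within a single iteration, give each function its own loop over a chunk of iterations (loop fission), so that each small loop body can be traced on its own instead of as part of one very long trace. Whether this avoids the aborts depends on how much work each per-function loop body ends up containing.

```python
# Sketch (hypothetical names): instead of one loop whose body calls
# every function, each function loops over a chunk of iterations.
# Each small loop then forms its own compact trace for the JIT.

def f1(state, lo, hi):
    for i in range(lo, hi):
        state[0] += i          # a few lines of calculation, as in the report

def f2(state, lo, hi):
    for i in range(lo, hi):
        state[1] += 2 * i

FUNCS = [f1, f2]               # the real workload has ~500 of these

def run(n, chunk=10000):
    state = [0, 0]
    for lo in range(0, n, chunk):
        hi = min(lo + chunk, n)
        for f in FUNCS:
            f(state, lo, hi)   # each call runs its own short hot loop
    return state
```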
The speedup of PyPy over CPython is negligible. I looked at the JIT log and found many (~100) "trace too long" aborts. Is there any way to still get high performance (or let performance scale with the number of functions) in this case?
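For reference, PyPy also exposes the trace-length cutoff as a tunable JIT parameter, trace_limit, which can be set on the command line (pypy --jit trace_limit=N your_script.py) or at runtime through the pypyjit module. The value below is illustrative, not a recommendation; a guarded sketch that also runs on CPython:

```python
# Raise PyPy's trace limit so longer loop bodies can still compile.
# pypyjit exists only on PyPy; guard the import so the script also
# runs on CPython. The limit value here is illustrative.
try:
    import pypyjit
    pypyjit.set_param("trace_limit=200000")
except ImportError:
    pass

def work(i):
    # Stand-in for the ~500 small per-iteration functions.
    return 2 * i + 1

total = 0
for i in range(1000):  # the real loop runs for 10M-1B iterations
    total += work(i)
print(total)
```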