[pypy-issue] [issue826] Decimal seems slower on pypy than cpython

Justin Peel tracker at bugs.pypy.org
Sat Aug 13 01:06:48 CEST 2011


Justin Peel <peelpy at gmail.com> added the comment:

Maybe this is beside the point, but the code isn't especially efficient. The
largest problem IMO is that in C(), the binomial-coefficient function, there is
no need to convert to the Decimal class before doing the division. The reason
guibou was running into floating-point overflow is that he imported division
from __future__, which turns / into true division: for two huge integers it
tries to produce a float, and that overflows. If // is used instead of /, and
Decimal is left out of C() entirely (or applied only after the bigint math),
the problem goes away. Binomial coefficients are always integers, by the way.
Making this change speeds the code up quite a lot on both pypy and cpython,
and even puts pypy a little bit ahead of cpython on my machine. There are
other speed-ups that could be made to the code, but I won't bother.
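For illustration (guibou's original attachment isn't quoted here, so the exact
shape of his C() may differ), an all-integer version of the function would
look something like this:

    from math import factorial

    def C(n, k):
        # Pure bigint arithmetic: factorial(n) is exactly divisible by
        # factorial(k) * factorial(n - k), so // loses nothing, and no
        # float or Decimal conversion ever happens.
        return factorial(n) // (factorial(k) * factorial(n - k))

Only wrap the result in Decimal() at the point where Decimal arithmetic is
actually needed, after the bigint math is done.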

This shouldn't take away from the fact that we might need to speed up Decimal.
It appears from my experiments that the bigint-to-string conversion needed
when creating a Decimal from a long or int is what is really slowing things
down, rather than the division of the Decimals themselves.
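Roughly the sort of comparison I mean (the operand size and loop counts here
are just picked for the sketch; the pure-Python decimal.py stores a Decimal's
digits as a string, so constructing one from a bigint goes through str()):

    import time
    from decimal import Decimal

    n = 3 ** 20000          # a bigint with a few thousand digits

    start = time.time()
    for _ in range(100):
        Decimal(n)          # construction: decimal.py does str(abs(n)) here
    print('construction: %.3f s' % (time.time() - start))

    a, b = Decimal(n), Decimal(n + 1)
    start = time.time()
    for _ in range(100):
        a / b               # division: result rounded to context precision
    print('division:     %.3f s' % (time.time() - start))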

----------
nosy: +justinpeel

________________________________________
PyPy bug tracker <tracker at bugs.pypy.org>
<https://bugs.pypy.org/issue826>
________________________________________

