[Speed] What PyPy benchmarks are (un)important?

Brett Cannon brett at python.org
Thu Sep 13 00:35:57 CEST 2012


I went through the list of benchmarks that PyPy has to see which ones could
be ported to Python 3 now (others can follow in the future, but they depend
on a project that has not yet released an official version with Python 3
support):

ai
chaos
fannkuch
float
meteor-contest
nbody_modified
richards
spectral-norm
telco

bm_chameleon*
bm_mako
go
hexiom2
json_bench
pidigits
pyflate-fast
raytrace-simple
sphinx*

The first grouping comes from the 20 shown on the speed.pypy.org homepage;
the rest are in the complete list. Anything with an asterisk has an external
dependency that is not already in the unladen benchmarks.

Are the twenty shown on the homepage of speed.pypy.org special in some way,
or were they simply the first benchmarks that you were good/bad at? Are
there any benchmarks here that are particularly important or unimportant?
I'm trying to prioritize which benchmarks I port so that, if I hit a time
crunch, I get the critical ones moved first.
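As a side note, one quick way to triage which benchmark files still need
porting work (a minimal sketch, not part of the original discussion; the
helper name is my own) is to check whether each source file even parses
under Python 3's grammar:

```python
import ast


def parses_under_py3(source: str) -> bool:
    """Return True if the source text is valid under Python 3's grammar.

    This only catches syntax-level incompatibilities (e.g. the Python 2
    print statement); runtime issues like renamed stdlib modules still
    need an actual test run.
    """
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False


# A Python 2 print statement fails to parse under Python 3:
print(parses_under_py3("print 'hello'"))   # False
print(parses_under_py3("print('hello')"))  # True
```

Running this over each benchmark's files would give a rough first cut of
which ones are closest to working, before worrying about their dependencies.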