Hi python-dev,

I've seen that CPython has had on-and-off attempts to track benchmark results over time to catch performance regressions (e.g.: https://speed.python.org), but nothing concrete has stuck so far. So, I ended up creating a hosted service for that (https://www.speedtin.com), and I'd like to help set up a structure to run the benchmarks from https://hg.python.org/benchmarks/ and properly upload the results to SpeedTin (if the CPython devs are OK with that) -- note that I don't really have a server to run the benchmarks on, only to host the data (but https://speed.python.org seems to indicate that such a server is available...).

There's a sample report at: https://www.speedtin.com/reports/1_CPython27x_Performance_Over_Time/ (it has real data from runs of the PyPy benchmark suite, as I only found out about the benchmarks from https://hg.python.org/benchmarks/ later on -- also, that suite doesn't seem to support Python 3 right now, so it's probably not that useful for current Python development, but it does give some nice insight into CPython 2.7.x performance over time).

Later on, the idea is to be able to compare across different Python implementations which use the same benchmark set... (although that requires the other implementations to also post their data to SpeedTin).

Note that uploading the data to SpeedTin should be pretty straightforward (by using https://github.com/fabioz/pyspeedtin), so the main issue would be setting up a machine to run the benchmarks. A rough sketch of what the upload side could look like is below.
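Just to give an idea, something along these lines (a simplified sketch -- the exact calls, parameter names and values here are illustrative rather than the final pyspeedtin API, so please check the project README for the real usage):

    # Illustrative sketch of pushing one benchmark result to SpeedTin.
    # The class/method names and parameters below are placeholders/assumptions,
    # not necessarily the exact pyspeedtin API.
    from pyspeedtin import PySpeedTinApi

    api = PySpeedTinApi(
        authorization_key='<your api key>',   # placeholder
        project_id=1,                         # placeholder project id
    )

    # Register the benchmark (a no-op if it already exists) and add one
    # measurement for a given commit of CPython.
    api.add_benchmark('nbody')
    api.add_measurement(
        'nbody',
        value=0.225,                 # seconds (example value)
        version='2.7.11',            # CPython version being measured
        branch='2.7',
        commit_id='<changeset id>',  # placeholder
    )

    # Send everything to the server in one request.
    api.commit()

So the benchmark-running machine would only need the API key and a small script like that at the end of each run.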

Best Regards,

Fabio