Testing for performance regressions
drsalists at gmail.com
Tue Apr 5 05:31:55 CEST 2011
On Mon, Apr 4, 2011 at 7:45 PM, Steven D'Aprano <
steve+comp.lang.python at pearwood.info> wrote:
> I'm writing some tests to check for performance regressions (i.e. you
> change a function, and it becomes much slower) and I was hoping for some
> guidelines or hints.
> This is what I have come up with so far:
> * The disclaimers about timing code snippets that can be found in the
> timeit module apply. If possible, use timeit rather than roll-your-own.
> * Put performance tests in a separate test suite, because they're
> logically independent of regression tests and functional tests, and
> therefore you might not want to run them all the time.
> * Never compare the speed of a function to some fixed amount of time,
> since that will depend on the hardware you are running on, but compare it
> relative to some other function's running time. E.g.:
> # Don't do this:
> time_taken = Timer(my_func).timeit() # or similar
> assert time_taken <= 10
> # This is bad, since the test is hardware dependent, and a change
> # in environment may cause this to fail even if the function
> # hasn't changed.
> # Instead do this:
> time_taken = Timer(my_func).timeit()
> baseline = Timer(simple_func).timeit()
> assert time_taken <= 2*baseline
> # my_func shouldn't be more than twice as expensive as simple_func
> # no matter how fast or slow they are in absolute terms.
> Any other lessons or hints I should know?
> If it helps, my code will be targeting Python 3.1, and I'm using a
> combination of doctest and unittest for the tests.
I suppose you could compare to a pystone result times some constant.
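On Python 3.1 pystone lives in test.pystone; a pystone-style calibration
(time a fixed pure-Python workload once and treat it as your machine-speed
constant) can be sketched without depending on that module at all:

```python
# Pystone-style calibration sketch (not the real pystone benchmark):
# time a fixed pure-Python workload, then budget the function under
# test as a multiple of that machine-speed constant.
from timeit import Timer

def calibration_workload():
    # Fixed busywork standing in for pystone's benchmark loop.
    total = 0
    for i in range(1000):
        total += i * i
    return total

def my_func():
    # Placeholder for the function under test.
    return sum(i * i for i in range(1000))

machine_constant = Timer(calibration_workload).timeit(number=1000)
time_taken = Timer(my_func).timeit(number=1000)

# Fail only if my_func costs more than some multiple of the constant;
# the factor 10 is an arbitrary illustrative budget.
assert time_taken <= 10 * machine_constant
```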
FWIW, doctest is a cool idea, but it kind of limits your options, as it
enshrines little details that'll cause your tests to fail if you move to
PyPy or Jython or IronPython or whatever.
I tend to lump my performance-related tests in with my other tests, though
perhaps that's a personal preference. Consequently, I try to keep my
performance tests brief - sometimes with a non-default option for a more
thorough run - because the time to find out that things have suddenly
slowed way down is during development, not right before a new release.
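The brief-by-default, thorough-on-demand idea can be sketched with unittest
(which you're already using); the THOROUGH environment variable here is a
hypothetical switch, not a standard one:

```python
# Brief performance test by default; a non-default thorough variant is
# enabled only when the (hypothetical) THOROUGH env var is set.
import os
import unittest
from timeit import Timer

THOROUGH = bool(os.environ.get("THOROUGH"))

def my_func():
    # Placeholder for the function under test.
    return sorted(range(200), reverse=True)

class PerformanceTest(unittest.TestCase):
    def test_quick(self):
        # Always runs: few iterations, so the suite stays fast.
        elapsed = Timer(my_func).timeit(number=100)
        self.assertGreater(elapsed, 0)

    @unittest.skipUnless(THOROUGH, "set THOROUGH=1 for the full run")
    def test_thorough(self):
        # Only runs on demand: many more iterations.
        elapsed = Timer(my_func).timeit(number=100000)
        self.assertGreater(elapsed, 0)
```

Run it with `python -m unittest` normally, or `THOROUGH=1 python -m
unittest` during development when you want the longer measurement.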