Interesting timing issue I noticed

Jonathan Shao jzshao1 at
Tue Apr 15 02:06:48 CEST 2008

The project I'm working on is motion detection, involving a bit of image
processing. No worries: no image processing background needed.

Suffice it to say that I initially wrote a script that goes through every pixel
of a 320x240 picture (turned into an array using PIL) and performs some
calculations. It simply visits every pixel in the array and performs a
simple subtraction against a known value. The idea is to find the
differences between two images.
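The post doesn't include code, but the full-frame pass it describes might look
something like the sketch below. The nested-list layout, the background frame,
and the THRESHOLD value are all my assumptions, not details from the post:

```python
# Hypothetical sketch of the full-frame pass: the frame is assumed to be a
# 240x320 nested list of 0-255 grayscale values (e.g. built from PIL data).
THRESHOLD = 30  # assumed difference threshold, not from the original post

def full_scan(frame, background):
    """Count pixels whose difference from the background exceeds THRESHOLD."""
    changed = 0
    for y in range(len(frame)):
        row, bg_row = frame[y], background[y]
        for x in range(len(row)):
            # The per-pixel subtraction the post describes.
            if abs(row[x] - bg_row[x]) > THRESHOLD:
                changed += 1
    return changed
```

Note that in pure Python this is 320x240 = 76,800 loop iterations per frame.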

After a while, to try to speed up the calculations, I realized that I didn't
need to do all 320x240 calculations. So I implemented a slightly more
sophisticated algorithm and localized my calculations: I still do the pixel
subtractions, but on a smaller scale.
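A localized version, under the same assumptions as above, might restrict the
same subtraction to a window; the window parameters and threshold here are
hypothetical, since the post doesn't show the actual algorithm:

```python
# Hypothetical sketch of the localized pass: the same per-pixel subtraction,
# but restricted to a w x h window at (x0, y0) instead of the full frame.
THRESHOLD = 30  # assumed threshold, as in the full-frame sketch

def local_scan(frame, background, x0, y0, w, h):
    """Count changed pixels inside one window of the frame."""
    changed = 0
    for y in range(y0, y0 + h):
        row, bg_row = frame[y], background[y]
        for x in range(x0, x0 + w):
            if abs(row[x] - bg_row[x]) > THRESHOLD:
                changed += 1
    return changed
```

Whatever extra bookkeeping chooses the windows runs in addition to this loop,
so the localized version does less subtraction work but more Python-level work
per pixel it actually visits.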

Surprisingly, when I used time.time() to time the procedures, I found that
doing all 320x240 calculations is often faster! On my machine, the former
consistently averages around 0.125s, whereas the latter averages around
0.160s.
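For reference, this is roughly how the time.time() measurement described above
is usually written, contrasted with the stdlib timeit module; the work()
function is a stand-in for the pixel pass, not code from the post. timeit
repeats the call and lets you take the best of several runs, which tends to be
more reliable for sub-second measurements than a single time.time() delta:

```python
import time
import timeit

def work():
    # Stand-in for one pixel-subtraction pass over the frame.
    total = 0
    for i in range(100000):
        total += i
    return total

# One-shot wall-clock timing, as described in the post:
start = time.time()
work()
elapsed = time.time() - start

# timeit: run the call 10 times per trial, 3 trials, keep the best trial.
best = min(timeit.repeat(work, repeat=3, number=10))
```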

Why does this happen?

"Perhaps we all give the best of our hearts uncritically, to those who
hardly think about us in return."
~ T.H.White
