Interesting timing issue I noticed

Gabriel Genellina gagsl-py2 at yahoo.com.ar
Wed Apr 16 14:54:21 EDT 2008


On Wed, 16 Apr 2008 10:36:14 -0300, Jonathan Shao <jzshao1 at gmail.com>  
wrote:
> *Gabriel Genellina* gagsl-py2 at yahoo.com.ar
> *Wed Apr 16 08:44:10 CEST 2008*
>> Another thing would be to rearrange the loops so the outer one executes
>> fewer times; if you know that borderX<<sizeX and borderY<<sizeY it may be
>> better to swap the inner and outer loops above.
> Thank you for the tip on xrange.
> Even if I swap the inner and outer loops, I would still be doing the same
> number of computations, am I right (since I still need to go through the
> same number of elements)? I'm not seeing how a loop swap would lead to
> fewer computations, since I still need to calculate the outer rim of
> elements in the array (defined by borderX and borderY).

Swapping them minimizes the number of times the inner "for" statement is  
set up, and the number of xrange objects created. Both take some time in  
Python.

<code>
import timeit

# Inner loop is the long one: the inner "for"/xrange setup happens only
# 10 times (once per outer iteration).
f1 = """
for i in xrange(10):
    for j in xrange(1000):
        i, j
"""

# Inner loop is the short one: the inner "for"/xrange setup happens
# 1000 times.
f2 = """
for j in xrange(1000):
    for i in xrange(10):
        i, j
"""

print timeit.Timer(f1).repeat(number=1000)
print timeit.Timer(f2).repeat(number=1000)
</code>

Output:
[2.0405478908632233, 2.041863979919242, 2.0397852240997167]
[2.8623411634718821, 2.8330055914927783, 2.8361752680857535]

The second one (with the larger outer loop) is almost 40% slower than the  
first. "Simplified" Big-Oh complexity analysis isn't enough in cases like  
this.
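
If you want to see how much of that overhead comes from repeatedly creating  
the small xrange object, a minimal variation (just a sketch along the same  
lines, not measured above) reuses a single xrange object in the inner loop;  
whatever it still loses to f1 would then be the cost of re-entering the  
inner "for" statement itself:

<code>
import timeit

# Sketch: same iteration pattern as f2, but the small xrange object is
# created once and reused, so only the inner "for" setup remains as
# per-outer-iteration overhead.
f3 = """
inner = xrange(10)
for j in xrange(1000):
    for i in inner:
        i, j
"""

print timeit.Timer(f3).repeat(number=1000)
</code>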

-- 
Gabriel Genellina



