nearest neighbor in 2D
Alex Martelli
aleax at aleax.it
Tue Nov 4 10:20:39 EST 2003
Andrew Dalke wrote:
> Ron Adam
>> for point in p:
>> distance = math.sqrt((new_point[0]-point[0])**2 \
>> +(new_point[1]-point[1])**2)
>
>> I really don't know how you can make this faster. There might be a

Hmmm, that's what math.hypot is for, isn't it...?

[alex at lancelot Lib]$ timeit.py -c -s'import math; p=1.6,2.5; np=2.4,1.3'
'math.sqrt((np[0]-p[0])**2 + (np[1]-p[1])**2)'
100000 loops, best of 3: 3 usec per loop
[alex at lancelot Lib]$ timeit.py -c -s'import math; p=1.6,2.5; np=2.4,1.3'
'math.hypot(np[0]-p[0], np[1]-p[1])'
100000 loops, best of 3: 1.9 usec per loop
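
(For readers following along: the quoted loop rewritten around math.hypot might
look like this -- just a sketch; the sample data is mine, not from Ron's post.)

```python
import math

# Hypothetical data in the shape of the quoted snippet:
# p is the list of (x, y) points, new_point is the query point.
p = [(1.6, 2.5), (0.3, 4.1), (2.4, 1.3)]
new_point = (2.0, 2.0)

distances = []
for point in p:
    # math.hypot(dx, dy) computes sqrt(dx*dx + dy*dy) in one C-level call
    distances.append(math.hypot(new_point[0] - point[0],
                                new_point[1] - point[1]))
```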
>> library that has a distance between two points function that could
>> speed it up.
>
> An easy way is to move the math.sqrt call outside the loop, since
> sqrt(d1) < sqrt(d2) iff d1 < d2 (when d1,d2>=0)
Yes, omitting the math.sqrt gives the same speed as calling math.hypot,
and it's the classic solution to speed up minimum-distance problems.
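
(A sketch of that classic trick, with made-up sample data: compare squared
distances inside the loop, and take the single sqrt only at the end.)

```python
import math

# The trick Andrew describes: since sqrt is monotonic, comparing
# squared distances picks the same nearest point as comparing
# real distances, and saves one sqrt per loop iteration.
points = [(1.6, 2.5), (0.3, 4.1), (2.4, 1.3)]
new_point = (2.0, 2.0)

best_sq = None
best = None
for point in points:
    dsq = (new_point[0] - point[0]) ** 2 + (new_point[1] - point[1]) ** 2
    # sqrt(d1) < sqrt(d2) iff d1 < d2 (for d1, d2 >= 0)
    if best_sq is None or dsq < best_sq:
        best_sq = dsq
        best = point

nearest_distance = math.sqrt(best_sq)  # one sqrt, outside the loop
```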
I vaguely suspect you could shave some further fraction of a microsecond
by saving those differences as dx and dy and then computing dx*dx+dy*dy --
since another classic tip is that a**2 is slower than a*a. Let's see...:
[alex at lancelot Lib]$ timeit.py -c -s'import math; p=1.6,2.5; np=2.4,1.3'
'dx=np[0]-p[0]; dy=np[1]-p[1]; disq=dx*dx+dy*dy'
1000000 loops, best of 3: 1.39 usec per loop
...yep, another small enhancement. Ain't measuring _FUN_?-)
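
(Putting the two tips together -- squared distances in the loop, dx*dx instead
of dx**2, one sqrt at the very end -- a nearest-point helper could be sketched
like so; the function name and sample data are mine, not from the thread.)

```python
import math

def nearest(points, query):
    """Return (nearest point, distance to it) -- a sketch combining
    both speedups discussed above."""
    qx, qy = query
    best = None
    best_sq = float('inf')
    for px, py in points:
        dx = qx - px
        dy = qy - py
        dsq = dx * dx + dy * dy   # dx*dx is a bit cheaper than dx**2
        if dsq < best_sq:
            best_sq = dsq
            best = (px, py)
    return best, math.sqrt(best_sq)  # the only sqrt call
```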

Alex