[Numpy-discussion] Correlation function about a factor of 100 slower than Matlab/Mathcad ... but with FFT even worse?

qubax at gmx.at
Wed Nov 25 13:23:20 EST 2009


The autocorrelation of a large data vector v (about 250k points) can be
computed via

  correlate(v,v,mode='full')

and ought to give the same result as the Matlab function

  xcorr(v)

FFT might speed up the evaluation ...
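
For reference, here is a minimal sketch of the kind of FFT-based
autocorrelation I have in mind (my own untested outline, not code from any
of the machines mentioned below; the helper name autocorr_fft and the
padding to a power of two are choices made here). It should agree with
correlate(v, v, mode='full') up to floating-point error for real-valued v:

import numpy as np

def autocorr_fft(v):
    """FFT-based equivalent of np.correlate(v, v, mode='full') for real v."""
    n = len(v)
    # the full correlation has 2*n - 1 lags; pad to a power of two for speed
    nfft = 1 << int(np.ceil(np.log2(2 * n - 1)))
    V = np.fft.rfft(v, nfft)
    # correlation theorem: the autocorrelation is the inverse FFT of |V|^2
    acf = np.fft.irfft(V * np.conj(V), nfft)
    # put the negative lags first so the layout matches mode='full'
    return np.concatenate((acf[-(n - 1):], acf[:n]))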

In my specific case:

  xcorr takes about 0.2 seconds.
  correlate takes about 70 seconds.
  fftconvolve takes about 400 seconds.

On the IRC channel, a second person happened to run into the same
problem using Mathcad and data consisting of about 300k points:

  correl takes about 1 second
  correlate takes 127 seconds
  fftconvolve was aborted because it took too long

These tests were checked and confirmed by two other people on the IRC
channel. The computers involved included both 32-bit and 64-bit
machines. All four people are confident that the LAPACK/ATLAS libraries
are properly installed.

Could someone please investigate why correlate and especially
fftconvolve are orders of magnitude slower?
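
In case it helps whoever looks into this, the effect should be reproducible
without my data with something along these lines (random numbers standing in
for the real measurement; reversing one argument is what turns fftconvolve's
convolution into a correlation):

import time
import numpy as np
from scipy.signal import fftconvolve

v = np.random.randn(250000)            # synthetic stand-in for the real data

tic = time.time()
np.correlate(v, v, mode='full')
print('correlate:   %.1f s' % (time.time() - tic))

tic = time.time()
fftconvolve(v, v[::-1], mode='full')   # reversed input -> correlation
print('fftconvolve: %.1f s' % (time.time() - tic))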

Should more details / sample data be required, please let me know.

Thanks,
q


----------------------

executed code:

import time
import numpy as np                      # assuming numpy's correlate was used
from scipy.signal import fftconvolve

v = c1data[:, 1]                        # c1data: the measurement array, loaded elsewhere

tic = time.time()
cor_c1 = np.correlate(v, v, mode='full')
print('correlate:   %.1f s' % (time.time() - tic))

tic = time.time()
cor_c1 = fftconvolve(v, v, mode='full')   # for a true autocorrelation one input would be reversed
print('fftconvolve: %.1f s' % (time.time() - tic))

# Matlab, for comparison:
#   xcorr(data)


-- 
The king who needs to remind his people of his rank, is no king.

A beggar's mistake harms no one but the beggar. A king's mistake,
however, harms everyone but the king. Too often, the measure of
power lies not in the number who obey your will, but in the number
who suffer your stupidity.
