Keith Goodman wrote:
> On Nov 13, 2007 8:42 PM, David Cournapeau <david@ar.media.kyoto-u.ac.jp> wrote:
>> Here we are:
>> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy-1.0.4.win32-p...
> Thank you. He said it worked. He didn't even notice a slowdown without ATLAS. On some calculations the results were different (between 1.0.2 and 1.0.4) in the last three decimal places. But that's to be expected, right?

I don't think there is any chance of getting exactly the same results (compiler, OS, CPU, and BLAS all enter the equation). ATLAS will not change numpy's general performance much: it only enters the equation for a few functions (numpy.dot, and of course linear algebra), and only for relatively large arrays. For example, in my use case (linear algebra with at most a few tens of dimensions), ATLAS does not gain much outside numpy.dot. And in any case, if you want good performance from ATLAS, you should compile it yourself (ATLAS performance seems to depend heavily on the L1 cache size, for example).
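The last-decimal differences reported above are the kind of thing best handled with a tolerance-based comparison rather than exact equality. A minimal sketch (the arrays and the 1e-14 perturbation are hypothetical stand-ins for two builds' outputs, not data from the thread):

```python
import numpy as np

# Hypothetical stand-in for results from two numpy builds
# (e.g. 1.0.2 vs 1.0.4, or netlib BLAS vs ATLAS), which may
# legitimately differ in the last few decimal places.
a = np.array([0.1, 0.2, 0.3])
b = a + 1e-14  # simulate a tiny build-to-build discrepancy

exact = bool((a == b).all())     # exact equality fails
close = bool(np.allclose(a, b))  # tolerance-based comparison passes
print(exact, close)              # prints: False True
```

`np.allclose` uses a relative tolerance (default `rtol=1e-5`) plus an absolute one (`atol=1e-8`), so differences far below those thresholds are treated as equal.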
So all in all, I think it is worth considering just using the netlib reference BLAS/LAPACK instead of ATLAS for the binaries, at least on Windows (I don't know who is responsible for the Windows binaries). Note that we still do not know why the official binaries hang, which is worrying.

cheers,

David
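Whether the BLAS backend matters for a given workload can be checked directly by timing numpy.dot at the sizes one actually uses. A rough sketch (the sizes and repeat count are illustrative, not from the thread):

```python
import time
import numpy as np

def time_dot(n, repeats=3):
    """Return the best-of-`repeats` wall time for an n x n matrix product.

    For small n, Python/numpy overhead dominates and the BLAS backend
    (netlib vs ATLAS) barely matters; the gap shows up at larger sizes.
    """
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a.dot(b)
        best = min(best, time.perf_counter() - t0)
    return best

# Compare a "few tens of dimensions" case against a larger one.
for n in (30, 300):
    print(n, time_dot(n))
```

Running this under both a netlib-BLAS and an ATLAS build would show how much of the difference survives at small problem sizes.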