
On 25. apr. 2011, at 19.57, Pauli Virtanen wrote:
On Mon, 25 Apr 2011 10:16:13 -0700, Rob Beezer wrote: [clip]
Many more details and complete transcripts are at: http://trac.sagemath.org/sage_trac/ticket/11248
Any thoughts or advice to help us understand this would be greatly appreciated.
The NumPy routine is a very thin wrapper around LAPACK's ZGESDD and probably cannot have any bugs of this kind, so the problem is most likely in the LAPACK and BLAS libraries you use. You will probably also be able to reproduce the problem with an equivalent Fortran/C snippet calling LAPACK directly.
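
To make this concrete, here is a minimal sketch of the kind of direct check being suggested, done from Python through SciPy's low-level LAPACK wrappers rather than a standalone Fortran/C program. The test matrix is only a placeholder (the failing matrix itself is attached to the trac ticket), and it assumes a SciPy that exposes the gesdd wrapper and is linked against the same BLAS/LAPACK as the NumPy build under test:

    import numpy as np
    from scipy.linalg import get_lapack_funcs

    # Placeholder matrix; substitute the actual failing matrix from the trac ticket.
    n = 4
    rng = np.random.RandomState(0)
    A = rng.randn(n, n) + 1j * rng.randn(n, n)

    # Select the complex double-precision routine (ZGESDD) for this dtype and
    # call it through SciPy's thin wrapper, bypassing numpy.linalg.svd.
    gesdd, = get_lapack_funcs(('gesdd',), (A,))
    u, s, vt, info = gesdd(A)
    assert info == 0

    # The factors should satisfy the usual SVD identities up to rounding error;
    # errors of order 1 instead of ~1e-15 would reproduce the reported problem.
    print("max |U^H U - I|   : %g" % abs(np.dot(u.conj().T, u) - np.eye(n)).max())
    print("max |Vt Vt^H - I| : %g" % abs(np.dot(vt, vt.conj().T) - np.eye(n)).max())
    print("max |A - U S V^H| : %g" % abs(A - np.dot(u, np.dot(np.diag(s), vt))).max())

If these residuals are large here too, NumPy is ruled out and the problem sits in the underlying LAPACK/BLAS build, which is the scenario described above.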
Problems like this in BLAS/LAPACK are somewhat difficult to track. You could try switching to another BLAS library (or, if you use ATLAS, compile it differently) and checking if the problem disappears.
-- Pauli Virtanen
I cannot claim anything concrete, but I did notice something that seemed genuinely odd about the BLAS+LAPACK that ships with Mac OS X. In our light-scattering simulation code we use energy conservation as a consistency check. In small test runs on my MacBook Pro, energy was conserved noticeably less well than when the exact same simulation was run on a Linux box. No harm done, since we don't care about the exact results on the laptop, but it did seem odd. We do not rely on SVD, though; we only use LU factorization plus back-substitution (cgesv). I have a vague feeling that there is something odd about LAPACK+BLAS on Mac OS X, but I have no hard evidence.

Paul
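
For what it's worth, the kind of platform difference described here can also be probed without the full simulation by looking at the residual of a single-precision complex solve. The following is only an illustrative sketch with a random system (the actual scattering code and matrices are not shown here), again going through SciPy's LAPACK wrappers so that CGESV is the routine being exercised:

    import numpy as np
    from scipy.linalg import get_lapack_funcs

    # Random single-precision complex system as a stand-in for the scattering
    # matrices; the size and data here are made up for illustration.
    n = 200
    rng = np.random.RandomState(1)
    A = (rng.randn(n, n) + 1j * rng.randn(n, n)).astype(np.complex64)
    b = (rng.randn(n, 1) + 1j * rng.randn(n, 1)).astype(np.complex64)

    # Select CGESV (single-precision complex LU factorization + solve) for this dtype.
    gesv, = get_lapack_funcs(('gesv',), (A, b))
    lu, piv, x, info = gesv(A, b)
    assert info == 0

    # Backward-error style residual: on any sane BLAS/LAPACK this should be a
    # modest multiple of single-precision eps (~1e-7). A markedly larger value
    # on one platform than another points at the library build rather than the
    # calling code.
    res = abs(np.dot(A, x) - b).max() / (abs(A).max() * abs(x).max() * n)
    print("scaled residual: %g" % res)

Running the same snippet on the Mac and on a Linux box would show whether the difference in energy conservation reflects the linear algebra libraries themselves or something else in the simulation.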