
I am working to make many of NumPy's matrix decomposition routines available in Sage. While testing a new routine, we have found some odd behavior with the singular value decomposition. On certain Macs, the NumPy built into Sage returns the second of the unitary matrices with a row of all zeros, so of course it cannot be unitary. For the configurations in question, this erroneous output happens only for matrices of certain sizes, and for those sizes it always occurs. The smallest such sizes are 3 x 4, 4 x 5, 5 x 6, 5 x 7, 6 x 7, 6 x 8, 6 x 9, 7 x 8, 7 x 9, and 8 x 9.

The fault is not in Sage code per se, as it can be reproduced by running Sage's Python and using NumPy directly. It is possible that Sage is not building NumPy correctly; we have not tested a standalone version of NumPy, since this problem seems to be limited to very few configurations. The initial report, and a confirmation, are both on Macs where Sage is built using gcc 4.0.1 and gcc 4.2.1. The test that uncovered this situation was introduced two alpha releases back and has not failed for testers on Linux or newer Macs. The svd routine itself has been in Sage for about three years without exhibiting any problems, but perhaps the sizes above were never tested.

I do not own a Mac, so testing out scenarios involves sending suggestions to the two folks who have reported failures. Many more details and complete transcripts are at:

http://trac.sagemath.org/sage_trac/ticket/11248

Any thoughts or advice to help us understand this would be greatly appreciated. Thanks in advance.

Rob

-- 
Robert A. Beezer
Professor
Department of Mathematics and Computer Science
University of Puget Sound
1500 N Warner
Tacoma, WA 98416-1043
beezer@ups.edu
http://buzzard.ups.edu
Voice: 253.879.3564
Fax: 253.879.3522
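P.S. For anyone who wants to try reproducing this, here is a minimal sketch of the kind of check involved, run from Sage's Python. It uses one of the smallest problematic sizes (3 x 4); the matrix and seed are my own choices for illustration, not the exact data from the ticket. On a correct build, the second factor returned by numpy.linalg.svd is unitary and has no zero row; on the affected Macs, one row of it comes back as all zeros.

```python
import numpy as np

# Illustrative 3 x 4 input; any matrix of a problematic size should do.
np.random.seed(0)
A = np.random.rand(3, 4)

# Full SVD: U is 3x3, Vh is 4x4 (the "second unitary matrix" at issue).
U, s, Vh = np.linalg.svd(A)

# On a correct build, Vh * Vh^H is the 4x4 identity...
print("Vh unitary:", np.allclose(Vh @ Vh.conj().T, np.eye(4)))

# ...and no row of Vh is all zeros (the reported failure mode).
print("zero row present:", any(np.allclose(row, 0) for row in Vh))
```

On an unaffected machine this prints True and then False; on the affected configurations the zero-row check reportedly fails.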