
Oct. 25, 2010
2:42 a.m.
Hi everyone,

I am trying to compute the eigenvectors corresponding to the d+1 smallest eigenvalues of A = W.T * W. I started with W as a dense matrix and then did:

    from scipy import sparse
    from scipy.sparse.linalg import eigen_symmetric

    W = sparse.csr_matrix(W)
    A = W.T.dot(W)                                # A = W.T * W
    w, V = eigen_symmetric(A, d + 1, which='SM')

The biggest problem is that the algorithm fails to converge, and on a test dataset I get all zeros as the eigenvectors. Using a dense SVD I get the expected results. The second problem is that this sparse version is much slower than the dense one,

    u, s, vh = svd(W)

The test data is only 1000x1000, but I expect the real data to have millions by millions of entries, with each row holding only a dozen, or at most a few dozen, non-zero entries.

Thanks,
Hao
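
For reference, here is a small self-contained sketch of the setup described above. It assumes the newer SciPy spelling scipy.sparse.linalg.eigsh (eigen_symmetric under the old name), a randomly generated W standing in for the real data, and a shift-invert sigma that is an addition for illustration rather than something from the original post.

    from scipy import sparse
    from scipy.sparse.linalg import eigsh

    n, d = 1000, 2                       # assumed test sizes

    # Sparse W with roughly a dozen non-zeros per row, as in the description.
    W = sparse.random(n, n, density=12.0 / n, format='csr', random_state=0)

    A = W.T.dot(W)                       # A = W.T * W, symmetric positive semi-definite

    # which='SM' tends to converge poorly; shift-invert targets the eigenvalues
    # nearest sigma instead.  A small negative sigma keeps (A - sigma*I)
    # non-singular, since A is positive semi-definite.
    vals, vecs = eigsh(A, k=d + 1, sigma=-1e-3, which='LM')

    print(vals)                          # the d+1 smallest eigenvalues of A
    print(vecs.shape)                    # (n, d+1) corresponding eigenvectors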