[Numpy-discussion] very large matrices.

Charles R Harris charlesr.harris at gmail.com
Sat May 12 20:45:54 EDT 2007


On 5/12/07, Dave P. Novakovic <davidnovakovic at gmail.com> wrote:
>
> Hi,
>
> I have test data of about 75000 x 75000 dimensions. I need to do an SVD,
> or at least an eigendecomposition, on this data. My searching suggests
> that the linalg functions in scipy and numpy don't work on sparse
> matrices.
>
> I can't even get empty((10000,10000), dtype=float) to work (memory
> errors, or too many dims); I'm starting to feel like I'm in a bit of
> trouble here :)


Umm, big.
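
To put a number on it: a dense 75000 x 75000 float64 array needs
75000^2 * 8 bytes, roughly 42 GiB, and even the empty((10000,10000))
call above asks the allocator for one contiguous 800 MB block, which
readily fails on a 32-bit machine. A quick sanity check in Python:

    # Back-of-the-envelope memory for dense float64 arrays.
    print(75000**2 * 8 / 2.0**30)   # ~41.9 GiB for the full matrix
    print(10000**2 * 8 / 2.0**20)   # ~762.9 MiB for the empty() test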

> What do people use to do large SVDs? I'm not averse to using another
> lib or wrapping something.


What sort of machine do you have? There are column-iterative methods for
the SVD, resembling Gram-Schmidt orthogonalization, that could probably be
adapted to work over the array one column at a time. Are your arrays
actually sparse? Do you only need a few eigenvalues? Are you doing least
squares? A more precise description of the problem might lead to
alternative, less demanding approaches.
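
If the arrays are genuinely sparse and only a few eigenvalues or
singular triplets are needed, ARPACK-style iterative solvers sidestep
the dense array entirely. A minimal sketch, assuming a sparse CSR
matrix and the scipy.sparse.linalg interface (the svds wrapper and its
import path are assumptions about your scipy version; the matrix and
its density here are made up for illustration):

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import svds  # ARPACK-backed truncated SVD

    n = 75000
    # Hypothetical sparse test matrix: density 1e-6 gives ~5.6 million
    # stored entries (well under 100 MB in CSR) instead of ~42 GiB dense.
    A = sp.random(n, n, density=1e-6, format='csr', dtype=np.float64)

    # Ten largest singular triplets, computed via matrix-vector
    # products only -- the dense n x n array is never formed.
    u, s, vt = svds(A, k=10)
    print(s)

The key point is that svds only ever touches A through products A*v and
A.T*u, so anything that supports those (a sparse matrix, a
LinearOperator, even data read from disk one block at a time) can stand
in for the dense array.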

Chuck