[Numpy-discussion] very large matrices.

Dave P. Novakovic davidnovakovic at gmail.com
Sat May 12 20:58:02 EDT 2007


Hey, thanks for the response.

Core 2 Duo with 4 GB of RAM.

I've heard about iterative SVD functions, but I actually need a
complete SVD with all the eigenvalues, not a truncated decomposition as
in LSI. I'm mostly interested in the individual eigenvectors.

As an example, a single row could have roughly 3000 non-zero elements.
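
(For scale: a dense 75000 x 75000 float64 array is about 42 GiB, while
at ~3000 non-zeros per row a CSR representation is closer to 2.5 GiB. A
quick back-of-the-envelope check using the numbers above:

dense = 75000 * 75000 * 8                # float64 bytes: ~45e9, ~42 GiB
nnz = 75000 * 3000                       # ~2.25e8 stored values
csr = nnz * (8 + 4) + (75000 + 1) * 4    # data + column indices + row pointers
print(dense / 2**30, csr / 2**30)        # -> ~41.9 vs ~2.5 GiB

So the data fits in 4 GB only in sparse form; any dense intermediate
blows the budget.)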

I think I'll try writing out a sparse matrix file and running svdlibc
on it. If that works I'll wrap svdlibc with ctypes and post the results
back here.

I just wanted to make sure there was absolutely no way of doing it
with scipy/numpy before looking at anything else.

Cheers

Dave


On 5/13/07, Charles R Harris <charlesr.harris at gmail.com> wrote:
>
>
> On 5/12/07, Dave P. Novakovic <davidnovakovic at gmail.com> wrote:
> > Hi,
> >
> > I have test data of about 75000 x 75000 dimensions. I need to do an
> > SVD, or at least an eigendecomposition, on this data. My searching
> > suggests that the linalg functions in SciPy and NumPy don't work on
> > sparse matrices.
> >
> > I can't even get empty((10000,10000), dtype=float) to work (memory
> > errors, or too many dims), so I'm starting to feel like I'm in a bit
> > of trouble here :)
>
> Umm, big.
>
> > What do people use to do large SVDs? I'm not averse to using another
> > lib or wrapping something.
>
> What sort of machine do you have? There are column-iterative methods
> for SVD, resembling Gram-Schmidt orthogonalization, that could
> probably be adapted to work over the array one column at a time. Are
> your arrays actually sparse? Do you only need a few eigenvalues? Are
> you doing least squares? A more precise description of the problem
> might lead to alternative, less demanding approaches.
>
> Chuck
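
One reading of the column-at-a-time idea Chuck describes is power
iteration on A^T A with Gram-Schmidt-style deflation, which touches the
matrix only through matrix-vector products, so A can stay sparse. A
minimal sketch (the function and its details are mine, not an
established API):

import numpy as np

def deflated_power_svd(A, k=3, iters=200, seed=0):
    # Compute the k leading singular triplets one at a time. A is only
    # used via A @ v and A.T @ u, so a scipy.sparse matrix works too.
    n = A.shape[1]
    rng = np.random.RandomState(seed)
    triplets, found = [], []
    for _ in range(k):
        v = rng.standard_normal(n)
        for _ in range(iters):
            for w in found:
                v -= (w @ v) * w   # deflate: stay orthogonal to found vectors
            v = A.T @ (A @ v)      # one power step on A^T A
            v /= np.linalg.norm(v)
        Av = A @ v
        s = np.linalg.norm(Av)             # singular value
        triplets.append((s, Av / s, v))    # (sigma, left vec, right vec)
        found.append(v)
    return triplets

In principle k can go all the way to n for a complete decomposition,
but convergence and roundoff make this practical only for the first
handful of triplets; for the full spectrum something like svdlibc's
Lanczos code is the better bet.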


