[SciPy-User] solving large sparse linear system with Laplacian matrix

nicky van foreest vanforeest at gmail.com
Fri Oct 30 17:26:14 EDT 2009


Hi,

Would pysparse be an option perhaps? I did some tests with
scipy.sparse and pysparse, and the latter tends to build the matrices
(much) faster (for my cases, that is). It has been some time since I
installed pysparse, so I might be mistaken, but pysparse also ships
with UMFPACK and SuperLU, which are compiled during installation if
they are not found on your system.
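
From memory, filling a matrix incrementally in pysparse looks roughly
like the snippet below; the ll_mat calls are written from memory and
may be off, so please check the pysparse documentation:

    from pysparse import spmatrix

    # linked-list format: cheap to build entry by entry
    n = 1000
    A = spmatrix.ll_mat(n, n)
    for i in range(n - 1):
        A[i, i] = 2.0
        A[i, i + 1] = -1.0
        A[i + 1, i] = -1.0
    A[n - 1, n - 1] = 2.0

    # convert to a compressed format before solving or multiplying
    A_csr = A.to_csr()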

bye

Nicky

2009/10/30 Emmanuelle Gouillart <emmanuelle.gouillart at normalesup.org>:
> Dear list,
>
> I'm looking for a memory-efficient algorithm in scipy.sparse.linalg that
> I could use to solve a very large sparse linear system. The system can
> be written as AX = B, where A is a symmetric band matrix with non-zero
> elements on only 4 or 6 upper (and lower) diagonals. The shape of A is
> NxN, where N is very large (up to 10e6, if possible...), and the shape
> of B and X is NxM, where M is much smaller (up to 50, say).
> Unfortunately, A is symmetric but not positive-definite, since the sum
> of each row and column is zero (A is a Laplacian matrix). B is also
> sparse (it is actually a block of the original Laplacian matrix).
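>
> To make this concrete, here is a tiny 1-D toy version of the kind of
> matrix I mean (the unit edge weights are made up; the real A comes from
> a 3-D grid with up to 6 neighbours per voxel):
>
>     import numpy as np
>     from scipy import sparse
>
>     # chain graph with unit edge weights: W is the adjacency matrix
>     n = 5
>     w = np.ones(n)
>     W = sparse.spdiags([w, w], [-1, 1], n, n)
>
>     # Laplacian L = D - W: symmetric, every row and column sums to
>     # zero, hence singular (not positive-definite)
>     d = np.asarray(W.sum(axis=1)).ravel()
>     L = sparse.spdiags(d, 0, n, n) - W
>     print(L.todense())
>     print(L.sum(axis=1))   # all zeros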
>
> What kind of solver implemented in scipy would you recommend, so that I
> can solve such a system with N as large as possible (on a low-end
> computer with only 2 GB of RAM)? Up to now I have used
> scipy.sparse.linalg.spsolve, with the CSR scipy.sparse format for A and
> B: this works fine for N as large as 300, but the memory requirements
> are too high for larger values of N on my computer. Should I use an
> iterative solver (I'm a complete newbie in linear algebra)? Also,
> spsolve requires the right-hand side to be a flat vector, so I have to
> solve as many systems as there are columns in B, which must be highly
> inefficient... Is there a way to solve the "whole" linear system at
> once? Any hints will be much appreciated!
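>
> For what it's worth, the column-by-column loop I have in mind is
> sketched below: an untested toy with made-up sizes, where I use a small
> positive-definite band matrix as a stand-in for A (if I read Grady's
> paper correctly, the system that is actually solved, restricted to the
> unseeded voxels, is positive definite, so conjugate gradients should
> apply). I don't know whether this is a sensible approach:
>
>     import numpy as np
>     from scipy import sparse
>     from scipy.sparse.linalg import cg
>
>     # toy positive-definite band matrix standing in for the (reduced)
>     # Laplacian, plus a sparse multi-column right-hand side B
>     n, m = 100, 3
>     main, off = 2.0 * np.ones(n), -1.0 * np.ones(n)
>     A = sparse.spdiags([off, main, off], [-1, 0, 1], n, n).tocsr()
>     B = sparse.rand(n, m, density=0.05, format='csr')
>
>     # spsolve wants a single right-hand side, but an iterative solver
>     # can simply be called once per column of B
>     X = np.zeros((n, m))
>     for j in range(m):
>         b = np.asarray(B[:, j].todense()).ravel()
>         x, info = cg(A, b)   # info == 0 means CG converged
>         X[:, j] = x
>     print(abs(A.dot(X) - B.toarray()).max())   # residual check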
>
> For those curious about the background, I'm trying to implement Grady's
> random walker algorithm [1] to segment large 3-D X-ray tomography images.
> N=l**3 is the number of voxels in the cubic image, and M is the number of
> regions which I would like to segment. I don't need very high
> precision in the elements of the solution X.
>
> Thanks in advance,
>
> Emmanuelle
>
> [1] L. Grady, "Random walks for image segmentation", IEEE Transactions
> on Pattern Analysis and Machine Intelligence, vol. 28, no. 11,
> pp. 1768-1783, 2006.
>


