Weird error message in scipy.sparse.eigen function: Segmentation fault

Dear all,

I am using scipy '0.8.0.dev6120', and the scipy.sparse.eigen function always produces an error.

Description:
linalg.eigen(A, k=6, M=None, sigma=None, which='LM', v0=None, ncv=None, maxiter=None, tol=0, return_eigenvectors=True)

Error messages:
When I use this function as linalg.eigen(A, k=2, return_eigenvectors=False), it produces the error:
*** glibc detected *** python: double free or corruption (!prev)
When I use linalg.eigen(A, k=4, return_eigenvectors=False) or linalg.eigen(A, k=8, return_eigenvectors=False), it produces the error:
Segmentation fault

My goal:
"A" is an unsymmetrical CSR sparse matrix. What I am trying to do is:
1. Find a node 's' to delete. For each edge (u, v) among the out-edges and in-edges of node 's', set A[u, v] = 0.0.
2. Calculate the largest eigenvalue using linalg.eigen(A, return_eigenvectors=False).
3. Repeat steps 1-2 many times.

I had used the eigen_symmetric function on symmetric CSR sparse matrices and it works very well, but the 'eigen' function is not working. Could you please help me with it? Otherwise I will have to rewrite my code in MATLAB, which is what I am trying to avoid.

Thanks so much.

Yours sincerely,
Jankins
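The delete-and-recompute loop described above can be sketched as follows. This is a hedged illustration, not the poster's code: it uses a small random stand-in graph (not the 18,000-node matrix), and it uses scipy.sparse.linalg.eigs, the name that replaced eigen in later scipy releases.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in adjacency matrix for a small directed graph; LIL format
# makes zeroing individual entries cheap. The +3*I shift is only there
# so ARPACK converges quickly on this toy example.
n = 30
A = (sp.random(n, n, density=0.1, format="csr", random_state=0)
     + 3.0 * sp.eye(n)).tolil()

def delete_node(A, s):
    """Zero all out-edges and in-edges of node s, i.e. row s and column s."""
    A[s, :] = 0.0
    A[:, s] = 0.0
    return A

for s in (3, 7):                      # nodes chosen arbitrarily for the demo
    A = delete_node(A, s)
    # Largest-magnitude eigenvalue only; eigs is the successor of eigen.
    vals = spla.eigs(A.tocsr(), k=1, which="LM", return_eigenvectors=False)
    print(abs(vals[0]))
```

The point of the sketch is the workflow (edit the sparse matrix, then ask ARPACK for only the dominant eigenvalue), which is exactly the pattern that triggered the crash in the thread.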

Jankins wrote:
Dear all,
I am using scipy '0.8.0.dev6120'. And the scipy.sparse.eigen function always produces error message.
Description: linalg.eigen(A, k=6, M=None, sigma=None, which='LM', v0=None, ncv=None, maxiter=None, tol=0, return_eigenvectors=True)
Could you provide your platform details (i.e. OS, compiler, 32 vs 64 bits, the output of scipy.show_config())? This is needed to isolate the problem.

cheers,
David

I tried on Ubuntu 9.10 (32-bit), gcc version 4.4.1. Here is the output of show_config():

In [2]: scipy.show_config()
umfpack_info:
  NOT AVAILABLE
atlas_threads_info:
  NOT AVAILABLE
blas_opt_info:
    libraries = ['f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')]
    language = c
    include_dirs = ['/usr/include']
atlas_blas_threads_info:
  NOT AVAILABLE
lapack_opt_info:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib/atlas', '/usr/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')]
    language = f77
    include_dirs = ['/usr/include']
atlas_info:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib/atlas', '/usr/lib']
    language = f77
    include_dirs = ['/usr/include']
lapack_mkl_info:
  NOT AVAILABLE
blas_mkl_info:
  NOT AVAILABLE
atlas_blas_info:
    libraries = ['f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = c
    include_dirs = ['/usr/include']
mkl_info:
  NOT AVAILABLE

Thanks so much.

On 1/27/2010 9:06 PM, David Cournapeau wrote:
Jankins wrote:
Dear all,
I am using scipy '0.8.0.dev6120'. And the scipy.sparse.eigen function always produces error message.
Description: linalg.eigen(A, k=6, M=None, sigma=None, which='LM', v0=None, ncv=None, maxiter=None, tol=0, return_eigenvectors=True)
Could you provide your platform details (i.e. OS, compiler, 32 vs 64 bits, the output of scipy.show_config()). This is needed to isolate the problem,
cheers.
David

Jankins wrote:
I tried on Ubuntu 9.10-32bit, gcc version 4.4.1, . Here is the information of show_config():
Sorry, I forgot an additional piece of information: the exact ATLAS you are using. For example, assuming scipy is installed in /usr/local, I would need the output of /usr/local/lib/python2.6/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so (you are using linalg.eigen from scipy.sparse, right?). Ideally, if the matrix is not too big, having the matrix which crashes scipy would be most helpful.

thanks,
David

Yes, I am using scipy.sparse.linalg.eigen.arpack. The exact path is:

/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so

In fact, the matrix is from a directed graph with about 18,000 nodes and 41,000 edges. Actually, this matrix is the smallest one I have used. For now I have switched to numpy.linalg.eigvals, but it is slower than the scipy.sparse.linalg.eigen.arpack module.

Thanks.

Jankins

On 1/27/2010 9:36 PM, David Cournapeau wrote:
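Since David asks below whether the matrix is available somewhere, one portable way to share a crashing CSR matrix is the Matrix Market format via scipy.io. This sketch uses a random stand-in matrix and a hypothetical filename ("crash_case.mtx"), not the actual 18,000-node matrix:

```python
import numpy as np
import scipy.sparse as sp
import scipy.io as sio

# Random stand-in for the matrix that triggers the crash.
A = sp.random(10, 10, density=0.2, format="csr", random_state=1)

# Matrix Market is plain text and readable by scipy, MATLAB, and others.
sio.mmwrite("crash_case.mtx", A)
B = sio.mmread("crash_case.mtx").tocsr()   # mmread returns COO; convert back
print(B.shape, B.nnz)
```

A file like this attached to a bug report would give the developers exactly the "matrix which consistently reproduces the bug" requested later in the thread.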

Jankins wrote:
Yes. I am using scipy.sparse.linalg.eigen.arpack.
The exact output is:
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so
I need the output of ldd on this file, actually, i.e. the output of "ldd /usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so". It should show the libraries actually loaded by the OS.
In fact, the matrix is from a directed graph with about 18,000 nodes and 41,000 edges. Actually, this matrix is the smallest one I used.
Is it available somewhere? 41,000 edges should make the matrix very sparse. I first thought that your problem might be a buggy ATLAS, but the current ARPACK interface (the one used by sparse.linalg.eigen) is also quite buggy in my experience, though I could not reproduce the crash. Having a matrix which consistently reproduces the bug would be very useful.

In the short term, you may want to do without ARPACK support in scipy. In the longer term, I intend to improve support for linear algebra on sparse matrices, as it is needed for my new job.
Now I switch to use numpy.linalg.eigvals, but it is slower than scipy.sparse.linalg.eigen.arpack module.
If you have a reasonable ATLAS install, scipy.linalg.eigvals should actually be quite fast. Sparse eigenvalue solvers are much slower than full ones in general as long as:
- your matrices are tiny (with tiny defined here as the plain matrix requiring one order of magnitude less memory than the total available memory, so something like matrices with ~1e7/1e8 entries on current desktop computers)
- you need more than a few eigenvalues, or not just the biggest/smallest ones

cheers,
David
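David's dense-vs-sparse point can be made concrete with a small illustration. This sketch uses eigs (the later name for eigen) and a purely illustrative diagonal matrix whose spectrum is known to be 1..n, so the dominant eigenvalue is exactly n:

```python
import numpy as np
import scipy.linalg as la
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
# Illustrative matrix with known eigenvalues 1..n (dominant one = n).
A = sp.diags(np.arange(1.0, n + 1.0)).tocsr()

# Dense route: LAPACK computes the *full* spectrum, then we pick the top.
dense_vals = la.eigvals(A.toarray())
top_dense = dense_vals[np.argmax(np.abs(dense_vals))]

# Sparse route: ARPACK iterates to find only the largest-magnitude value.
top_sparse = spla.eigs(A, k=1, which="LM", return_eigenvectors=False)[0]

print(top_dense.real, top_sparse.real)
```

Both routes agree here; the trade-off David describes is that the dense route pays for the whole spectrum, which only wins while the full matrix still fits comfortably in memory.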

On Thu, Jan 28, 2010 at 12:11 AM, David Cournapeau <david@silveregg.co.jp> wrote:
Jankins wrote:
Yes. I am using scipy.sparse.linalg.eigen.arpack.
The exact output is:
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so
I need the output of ldd on this file, actually, i.e the output of "ldd /usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so". It should output the libraries actually loaded by the OS.
In fact, the matrix is from a directed graph with about 18,000 nodes and 41,000 edges. Actually, this matrix is the smallest one I used.
Is it available somewhere? 41,000 edges should make the matrix very sparse. I first thought that your problem might be a buggy ATLAS, but the current ARPACK interface (the one used by sparse.linalg.eigen) is also quite buggy in my experience, though I could not reproduce the crash. Having a matrix which consistently reproduces the bug would be very useful.
In the short term, you may want to do without arpack support in scipy. In the longer term, I intend to improve support for sparse matrices linear algebra, as it is needed for my new job.
Now I switch to use numpy.linalg.eigvals, but it is slower than scipy.sparse.linalg.eigen.arpack module.
If you have a reasonable ATLAS install, scipy.linalg.eigvals should actually be quite fast. Sparse eigenvalue solvers are much slower than full ones in general as long as:
- your matrices are tiny (with tiny defined here as the plain matrix requiring one order of magnitude less memory than the total available memory, so something like matrices with ~1e7/1e8 entries on current desktop computers)
- you need more than a few eigenvalues, or not just the biggest/smallest ones
cheers,
David
You are using ATLAS version 3.6; perhaps you should upgrade to a more recent version (3.8.x)?

What version of numpy are you using? Where did ATLAS etc. come from? Did you install both numpy and scipy from scratch (preferably built at the same time against the same library versions)? Sometimes removing everything and then rebuilding or reinstalling everything from scratch can help.

Perhaps less of a concern, but since your OS is 32-bit, is everything 32-bit, and do you have sufficient memory for the system to run your code?

After that, the array and code in question are needed.

Bruce
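Bruce's first questions (which numpy, where the libraries came from) can be answered from a Python prompt. A minimal check along these lines:

```python
import numpy
import scipy

# Report the interpreter's view of the installed stack, including the
# install paths, which reveal mixed site-packages/dist-packages installs.
print("numpy :", numpy.__version__, "from", numpy.__file__)
print("scipy :", scipy.__version__, "from", scipy.__file__)

# Build/link details (BLAS/LAPACK/ATLAS) as numpy was configured.
numpy.show_config()
```

Pasting this output into a bug report answers the version and build-origin questions in one shot.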

On Thu, Jan 28, 2010 at 3:11 PM, David Cournapeau <david@silveregg.co.jp> wrote:
Jankins wrote:
Yes. I am using scipy.sparse.linalg.eigen.arpack.
The exact output is:
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so
I need the output of ldd on this file, actually, i.e the output of "ldd /usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so". It should output the libraries actually loaded by the OS.
In fact, the matrix is from a directed graph with about 18,000 nodes and 41,000 edges. Actually, this matrix is the smallest one I used.
Is it available somewhere ? 41000 edges should make the matrix very sparse. I first thought that your problem may be some buggy ATLAS, but the current arpack interface (the one used by sparse.linalg.eigen) is also quite buggy in my experience, though I could not reproduce it. Having a matrix which consistently reproduce the bug would be very useful.
Ok, I took a look at it, and unfortunately it is indeed most likely an ATLAS problem. I get crashes when scipy is linked against ATLAS (v3.8.3), but if I link against plain BLAS/LAPACK, I don't get any crash anymore (and valgrind does not complain). I will try with a recent development version of ATLAS.

cheers,
David

This problem has been bothering me for days. If you need more samples to test it, I have one more; I tested it this morning, and the "segmentation fault" happened at a specific place. I guess, in the end, I will have to fall back on the original eigenvalue algorithm or MATLAB.

Thanks.

On Fri, Feb 5, 2010 at 1:22 AM, David Cournapeau <cournape@gmail.com> wrote:
On Thu, Jan 28, 2010 at 3:11 PM, David Cournapeau <david@silveregg.co.jp> wrote:
Jankins wrote:
Yes. I am using scipy.sparse.linalg.eigen.arpack.
The exact output is:
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so
I need the output of ldd on this file, actually, i.e the output of "ldd
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so".
It should output the libraries actually loaded by the OS.
In fact, the matrix is from a directed graph with about 18,000 nodes and 41,000 edges. Actually, this matrix is the smallest one I used.
Is it available somewhere ? 41000 edges should make the matrix very sparse. I first thought that your problem may be some buggy ATLAS, but the current arpack interface (the one used by sparse.linalg.eigen) is also quite buggy in my experience, though I could not reproduce it. Having a matrix which consistently reproduce the bug would be very useful.
Ok, I took a look at it, and unfortunately, it is indeed most likely an ATLAS problem. I get crashes when scipy is linked against Atlas (v3.8.3), but if I link against plain BLAS/LAPACK, I don't get any crash anymore (and valgrind does not complain).
I will try with a recent development from atlas,
cheers,
David

On Sat, Feb 6, 2010 at 9:25 AM, Jankins <andyjian430074@gmail.com> wrote:
This problem has been bothering me for days. If you need more samples to test it, I have one more; I tested it this morning, and the "segmentation fault" happened at a specific place. I guess, in the end, I will have to fall back on the original eigenvalue algorithm or MATLAB.
Hm, I found something which makes valgrind happy, but I am not sure whether the fix is right. There is definitely something wrong in the vector sizes (the crash happens within dnaitr, and some arrays are accessed out of bounds), but the input argument constraints are all valid if I read the ARPACK sources correctly. It just seems that your data uncovers a corner case not well handled by ARPACK. Making the buffers big enough seems to cause the algorithm not to converge for your data, though (to see for yourself, try making the ncv argument to eigen bigger than its default 2k+1 value).

Since MATLAB also uses ARPACK, I would be interested to know how MATLAB behaves for your data.

cheers,
David
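David's "try making ncv bigger" suggestion can be sketched as follows. This uses eigs (eigen's later name; the 2k+1 default David mentions belongs to the old interface) and a stand-in diagonal matrix with known eigenvalues 1..100, so the top four are 97..100:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in matrix with known eigenvalues 1..100.
A = sp.diags(np.arange(1.0, 101.0)).tocsr()

k = 4
for ncv in (2 * k + 1, 40):   # the old default size vs an enlarged subspace
    # ncv controls how many Arnoldi/Lanczos vectors ARPACK keeps; a larger
    # subspace usually converges in fewer restarts but costs more memory.
    vals = spla.eigs(A, k=k, which="LM", ncv=ncv, return_eigenvectors=False)
    print("ncv =", ncv, "->", sorted(np.abs(vals).round(6)))
```

On a well-behaved matrix both settings return the same eigenvalues; the thread's observation is that on the crashing matrix, enlarging the buffers instead made ARPACK fail to converge.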
participants (4)
- Bruce Southey
- David Cournapeau
- David Cournapeau
- Jankins