It seems that the vectors produced by the Gram-Schmidt process may not be orthogonal (probably due to roundoff errors, or when the matrix is ill-conditioned). I would suggest adding an option to test for loss of orthogonality and to reorthogonalize if needed. See http://www4.ncsu.edu/~ctk/newton/SOLVERS/nsoli.m for an example. Regards, Christine
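For illustration, the kind of selective reorthogonalization test used in nsoli.m can be sketched like this. This is a hypothetical Python sketch (the helper name `mgs_step` and the drop tolerance are mine, not SciPy's or Kelley's actual code): orthogonalize by modified Gram-Schmidt, and if the vector's norm collapses during orthogonalization (a sign of cancellation), do one extra pass.

```python
import numpy as np

def mgs_step(V, w, reorth_tol=1e-3):
    """Orthogonalize w against the orthonormal columns of V by modified
    Gram-Schmidt, with one selective reorthogonalization pass.

    V : (n, k) array with orthonormal columns; w : (n,) new vector.
    Returns the projection coefficients h and the orthogonalized
    (unnormalized) vector.
    """
    h = np.zeros(V.shape[1])
    norm_before = np.linalg.norm(w)
    for j in range(V.shape[1]):
        h[j] = V[:, j] @ w
        w = w - h[j] * V[:, j]
    # Selective reorthogonalization: if w lost most of its length, the
    # computed projections were dominated by cancellation -- run one more
    # modified Gram-Schmidt sweep and accumulate the corrections into h.
    if np.linalg.norm(w) < reorth_tol * norm_before:
        for j in range(V.shape[1]):
            c = V[:, j] @ w
            h[j] += c
            w = w - c * V[:, j]
    return h, w
```

The test costs one norm computation per step; the second sweep only runs when cancellation is actually detected.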
Tue, 23 Feb 2010 20:26:02 -0500, Darcoux Christine wrote:
> It seems that the vectors produced by the Gram-Schmidt process may not be orthogonal (probably due to roundoff errors, or when the matrix is ill-conditioned). I would suggest adding an option to test for loss of orthogonality and to reorthogonalize if needed. See http://www4.ncsu.edu/~ctk/newton/SOLVERS/nsoli.m for an example.
From the literature, it appears that orthogonality of the Krylov vectors is not very important for GMRES convergence, see e.g.
A. Greenbaum, M. Rozložník, Z. Strakoš, BIT Numerical Mathematics 37, 706 (1997). http://www.springerlink.com/content/277614q38kj6q376/

The GMRES implementations in SciPy use the modified Gram-Schmidt procedure, so they should be OK in this respect.

Do you have some evidence suggesting that reorthogonalization is a significant improvement for some problems? I'll accept patches providing such a flag in any case, though.

Best regards, Pauli
Hi,

I am not at all a specialist on this, but I recall that Numerical Recipes states that the Gram-Schmidt process gives terrible results and should be avoided nearly always; instead, singular value decomposition should be used. To avoid the problem below, would it be an idea to replace the GS algorithm by SVD?

Nicky

On 24 February 2010 13:07, Pauli Virtanen <pav+sp@iki.fi> wrote:
> Tue, 23 Feb 2010 20:26:02 -0500, Darcoux Christine wrote:
>> It seems that the vectors produced by the Gram-Schmidt process may not be orthogonal (probably due to roundoff errors, or when the matrix is ill-conditioned). I would suggest adding an option to test for loss of orthogonality and to reorthogonalize if needed. See http://www4.ncsu.edu/~ctk/newton/SOLVERS/nsoli.m for an example.
>
> From the literature, it appears that orthogonality of the Krylov vectors is not very important for GMRES convergence, see e.g.
>
> A. Greenbaum, M. Rozložník, Z. Strakoš, BIT Numerical Mathematics 37, 706 (1997). http://www.springerlink.com/content/277614q38kj6q376/
>
> The GMRES implementations in SciPy use the modified Gram-Schmidt procedure, so they should be OK in this respect.
>
> Do you have some evidence suggesting that reorthogonalization is a significant improvement for some problems? I'll accept patches providing such a flag in any case, though.
>
> Best regards, Pauli
_______________________________________________ SciPy-Dev mailing list SciPy-Dev@scipy.org http://mail.scipy.org/mailman/listinfo/scipy-dev
Wed, 2010-02-24 at 23:43 +0100, nicky van foreest wrote:
> I am not at all a specialist on this, but I recall that Numerical Recipes states that the Gram-Schmidt process gives terrible results, and should be avoided nearly always.
There's a balance here: more accurate methods tend to require more work, and the orthogonalization is probably the most expensive part of GMRES. So one should use a method that is good enough.
> Instead, singular value decomposition should be used. To avoid the problem below, would it be an idea to replace the GS algorithm by SVD?
One could also QR factorize using Householder transforms, which should be more accurate than GS, and is probably less expensive than SVD; or reorthogonalize.

But as far as I understand, this would not really improve the performance of GMRES: as shown in the article I linked to (and also in others), the quality of the orthogonalization is not expected to matter for GMRES -- modified Gram-Schmidt is good enough. The point is that breakdown of orthogonality happens simultaneously with the solution reaching its maximal accuracy, so it will not essentially affect convergence or accuracy. I'd like to have a reference or a test case at hand stating otherwise before changing things here.

As far as I understand, however, the situation with orthogonality is different if you want to solve eigenvalue problems with Arnoldi stuff. But that's in the domain of ARPACK, and I would trust without looking that the ARPACK people know what they are doing.

Cheers, Pauli
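The accuracy gap between the orthogonalization schemes being discussed is easy to demonstrate on an ill-conditioned matrix. The following is an illustrative sketch (not code from any of the solvers mentioned) comparing classical Gram-Schmidt, modified Gram-Schmidt, and Householder QR (via `numpy.linalg.qr`) on a Hilbert matrix, measuring the loss of orthogonality ||I - QᵀQ||:

```python
import numpy as np

def cgs(A):
    # Classical Gram-Schmidt: each column is orthogonalized against the
    # original columns' projections -- rounding errors accumulate badly.
    Q = np.zeros_like(A, dtype=float)
    for k in range(A.shape[1]):
        v = A[:, k] - Q[:, :k] @ (Q[:, :k].T @ A[:, k])
        Q[:, k] = v / np.linalg.norm(v)
    return Q

def mgs(A):
    # Modified Gram-Schmidt: each projection is removed from the
    # already-updated vector, which is much more stable in practice.
    Q = A.astype(float).copy()
    for k in range(A.shape[1]):
        Q[:, k] /= np.linalg.norm(Q[:, k])
        for j in range(k + 1, A.shape[1]):
            Q[:, j] -= (Q[:, k] @ Q[:, j]) * Q[:, k]
    return Q

# 10x10 Hilbert matrix: condition number around 1e13.
n = 10
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

for name, Q in [("CGS", cgs(A)), ("MGS", mgs(A)),
                ("Householder QR", np.linalg.qr(A)[0])]:
    print(name, np.linalg.norm(np.eye(n) - Q.T @ Q))
```

On such a matrix one typically sees CGS lose orthogonality almost completely, MGS lose it in proportion to the condition number times machine epsilon, and Householder QR stay orthogonal to machine precision, which matches the trade-off described above.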
Dear Pauli,

Thank you for your quick answer with a pointer to the literature. The result in the paper you cite assumes that the matrix is well-conditioned, and I'm not sure I can count on that in a general-purpose code which many people will use.

The books "Solving nonlinear equations with Newton's method" and "Iterative methods for linear and nonlinear equations" (some parts are available from Google Books) by C. T. Kelley give a good presentation of the importance of orthogonality. They also show a simple 3x3 example where paying attention to orthogonality is very important.

Checking for orthogonality and reorthogonalizing the way the author of these books does it costs almost nothing (see the link to the code in my previous post). Matlab's "official" GMRES code uses Householder reflections to build the basis and gets orthogonality automatically, but at double the cost of Gram-Schmidt. The Sandia Trilinos code uses classical Gram-Schmidt for good parallel performance and reorthogonalizes at every iteration, so it also costs double. The way proposed by Kelley costs almost nothing because the reorthogonalization happens rarely, so I do not see any reason not to do it. Of course, this could be added as an option so users can decide if they want to use it.

Best regards, Christine

2010/2/24, Pauli Virtanen <pav@iki.fi>:
> Wed, 2010-02-24 at 23:43 +0100, nicky van foreest wrote:
>> I am not at all a specialist on this, but I recall that Numerical Recipes states that the Gram-Schmidt process gives terrible results, and should be avoided nearly always.
>
> There's a balance here: more accurate methods tend to require more work, and the orthogonalization is probably the most expensive part of GMRES. So one should use a method that is good enough.
>
>> Instead, singular value decomposition should be used. To avoid the problem below, would it be an idea to replace the GS algorithm by SVD?
>
> One could also QR factorize using Householder transforms, which should be more accurate than GS, and is probably less expensive than SVD; or reorthogonalize.
>
> But as far as I understand, this would not really improve the performance of GMRES: as shown in the article I linked to (and also in others), the quality of the orthogonalization is not expected to matter for GMRES -- modified Gram-Schmidt is good enough. The point is that breakdown of orthogonality happens simultaneously with the solution reaching its maximal accuracy, so it will not essentially affect convergence or accuracy. I'd like to have a reference or a test case at hand stating otherwise before changing things here.
>
> As far as I understand, however, the situation with orthogonality is different if you want to solve eigenvalue problems with Arnoldi stuff. But that's in the domain of ARPACK, and I would trust without looking that the ARPACK people know what they are doing.
>
> Cheers, Pauli
Thu, 2010-02-25 at 14:16 -0500, Darcoux Christine wrote:
> Thank you for your quick answer with a pointer to the literature.
> The result in the paper you cite assumes that the matrix is well-conditioned, and I'm not sure I can count on that in a general-purpose code which many people will use.
Yes, it's a good point that the situation might not be the same for nearly singular matrices. Perhaps someone has looked into that...
> The books "Solving nonlinear equations with Newton's method" and "Iterative methods for linear and nonlinear equations" (some parts are available from Google Books) by C. T. Kelley give a good presentation of the importance of orthogonality. They also show a simple 3x3 example where paying attention to orthogonality is very important.
I note that the book is from ca. 1995, whereas the papers are newer. For the particular example in the book,

    [ 0.001  0       0
      0      0.0011  0
      0      0       1e4 ]

MGS (which we currently have) required an additional step at the final stage of the convergence, as compared to GS + selective reorthogonalization. Up to the last step, the convergence histories were identical.
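For reference, this 3x3 example can be run against SciPy's GMRES directly. An illustrative script (the exact iteration count and final residual will depend on the SciPy version and its convergence criterion):

```python
import numpy as np
from scipy.sparse.linalg import gmres

# Kelley's 3x3 diagonal test matrix: two tiny eigenvalues and one huge one,
# so the condition number is about 1e7.
A = np.diag([1e-3, 1.1e-3, 1e4])
b = np.ones(3)

# In exact arithmetic GMRES solves a 3x3 system in at most 3 iterations;
# in floating point the orthogonality of the Krylov basis degrades near
# convergence, which is what the reorthogonalization discussion is about.
x, info = gmres(A, b)

residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print("info:", info, "relative residual:", residual)
```

Since the matrix is diagonal, the exact solution is simply `b / diag(A)`, which makes it easy to compare convergence histories between orthogonalization variants.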
> Checking for orthogonality and reorthogonalizing the way the author of these books does it costs almost nothing (see the link to the code in my previous post). Matlab's "official" GMRES code uses Householder reflections to build the basis and gets orthogonality automatically, but at double the cost of Gram-Schmidt. The Sandia Trilinos code uses classical Gram-Schmidt for good parallel performance and reorthogonalizes at every iteration, so it also costs double. The way proposed by Kelley costs almost nothing because the reorthogonalization happens rarely, so I do not see any reason not to do it. Of course, this could be added as an option so users can decide if they want to use it.
Well, as I said, if someone implements it and writes tests, it will get in. If you are not going to implement it, please submit an enhancement request at http://projects.scipy.org/scipy to keep track of the idea. I don't have time to work on this right now, but later is possible.

Out of interest: do you have an actual matrix at hand where you know that MGS + selective reorthogonalization will give significantly better results than plain MGS, or a reference that shows an example like that?

-- Pauli Virtanen
participants (4):
- Darcoux Christine
- nicky van foreest
- Pauli Virtanen
- Pauli Virtanen