LinearAlgebraError: SVD did not converge

Robin Becker robin at jessikat.fsnet.co.uk
Tue Aug 27 14:17:16 EDT 2002


...
>> In your case, you should try to identify where the problem is by computing the 
>> SVD decomposition of your matrix first on its own, and looking at the 
>> singular value spectrum. What's the largest to smallest ratio? That's the 
>> condition number of your matrix, and a large one will be an indication that 
>> your matrix is nearly singular (numerically), hence hell to invert. If that's 
>> the case, you'll need to think a bit about your problem, since chances are a 
>> black box inverter will always fail with a near-singular matrix.
>
>Thanks for the response, but the error actually comes from the
>singular_value_decomposition() routine.  It doesn't matter whether
>I call it (my usual mode of operation) or let generalized_inverse()
>call it.
>
>You say "the SVD decomposition always exists".  I agree.  But
>apparently LinearAlgebra.singular_value_decomposition() doesn't
>know that.
>
>Damian Menscher [is frustrated]
SVD is often computed by an iterative method (e.g. one using eigenvalue shifting), so even though the decomposition always exists mathematically, the iteration itself can fail to converge.
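To act on the diagnostic advice quoted above, one can inspect the singular value spectrum before attempting any inverse. A minimal sketch in modern NumPy (the thread's Numeric-era `LinearAlgebra.singular_value_decomposition` exposes the same idea under an older name; the test matrix here is purely illustrative):

```python
import numpy as np

# An almost rank-deficient test matrix: the two rows are nearly parallel.
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-12]])

# Compute only the singular value spectrum, not U and V.
s = np.linalg.svd(A, compute_uv=False)

# Condition number = largest / smallest singular value.  A huge ratio
# means the matrix is numerically near-singular, so a black-box inverse
# (or generalized_inverse) will be meaningless even if it "succeeds".
cond = s[0] / s[-1]
print(cond)
```

If the iteration itself fails, `np.linalg.svd` raises `numpy.linalg.LinAlgError` ("SVD did not converge", the modern counterpart of the `LinearAlgebraError` in the subject line), which can be caught and handled rather than allowed to propagate.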

 I have this lying in one of my C routines

        dpsvd is an efficient method (see [1]) for computing the singular
        subspace of a matrix corresponding to its smallest singular values.
        It differs from the classical SVD algorithm [3] at three points,
        which results in high efficiency.
        First, the Householder transformations of the bidiagonalization
        need only be applied to the base vectors of the desired singular
        subspaces.
        Second, the bidiagonal need only be partially diagonalized.
        Third, the convergence rate of the iterative diagonalization can be
        improved by an appropriate choice between QL and QR iterations.
        Depending on the gap, the desired numerical accuracy and the
        dimension of the desired singular subspace, PSVD can be three times
        faster than the classical SVD algorithm.
        
        The PSVD algorithm [1-2] for an M by N matrix A proceeds as follows:
        
        Step 1: Bidiagonalization phase
                -----------------------
        1.a):   If M >= 5*N/3, transform A into upper triangular form R.
        1.b):   Transform A (or R) into bidiagonal form:
        
                        !q(1) e(2)  0   ...  0  !
                 (0)    ! 0   q(2) e(3)      .  !
                J   =   ! .                  .  !
                        ! .                 e(N)!
                        ! 0            ...  q(N)!
        
                if M >= N, or
        
                        !q(1) e(2)  0   ...  0     0   !
                 (0)    ! 0   q(2) e(3)      .     .   !
                J   =   ! .                  .     .   !
                        ! .                 e(M)   .   !
                        ! 0            ...  q(M) e(M+1)!
        
                if M < N, using Householder transformations.
        
        1.c):   If U is requested, initialize U with the identity matrix.
                If V is requested, initialize V with the identity matrix.
        
        1.d):   If M < N, then cancel e(M+1), and reduce the bidiagonal to
                M x M. Accumulate the Givens rotations in V (if V is desired).
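As an illustration only (this is not part of the original C routine), step 1.b)'s Householder bidiagonalization for the M >= N case can be sketched in NumPy. The helper names `householder` and `bidiagonalize` are made up for this sketch, and the reflectors are accumulated densely rather than stored in factored form:

```python
import numpy as np

def householder(x):
    """Return (v, beta) so that (I - beta*v*v^T) @ x = (+/-||x||, 0, ..., 0)."""
    v = x.astype(float).copy()
    sigma = np.linalg.norm(x)
    if sigma == 0.0:
        return v, 0.0                # nothing to annihilate
    v[0] += sigma if v[0] >= 0 else -sigma
    return v, 2.0 / np.dot(v, v)

def bidiagonalize(A):
    """Golub-Kahan bidiagonalization A = U @ B @ Vt, B upper bidiagonal."""
    M, N = A.shape
    B = A.astype(float).copy()
    U = np.eye(M)
    Vt = np.eye(N)
    for k in range(N):
        # Left reflector: zero column k below the diagonal.
        v, beta = householder(B[k:, k])
        B[k:, :] -= beta * np.outer(v, v @ B[k:, :])
        U[:, k:] -= beta * np.outer(U[:, k:] @ v, v)
        if k < N - 2:
            # Right reflector: zero row k beyond the superdiagonal.
            v, beta = householder(B[k, k + 1:])
            B[:, k + 1:] -= beta * np.outer(B[:, k + 1:] @ v, v)
            Vt[k + 1:, :] -= beta * np.outer(v, v @ Vt[k + 1:, :])
    return U, B, Vt

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
U, B, Vt = bidiagonalize(A)
err = np.max(np.abs(U @ B @ Vt - A))   # should be ~machine precision
```

Because each reflector is orthogonal and involutory, the invariant A = U @ B @ Vt holds after every step, which is easy to verify numerically.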
        
        Step 2: Partial diagonalization phase
                -----------------------------
                If the upper bound THETA is not given, then compute THETA such that
                precisely p - RANK singular values (p=min(M,N)) of the bidiagonal
                are <= THETA, using a bisection method [4].
                Diagonalize the given bidiagonal J partially, using either QL
                iterations (if the upper left diagonal element of the considered
                subbidiagonal > the lower right diagonal element) or QR
                iterations, such that J is split into unreduced subbidiagonals
                whose singular values are either all larger than THETA or all
                less than or equal to THETA.
                Accumulate the Givens rotations in U and/or V (if desired).
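The bisection idea in this step can be illustrated in NumPy. Note that this sketch uses a full SVD of the bidiagonal as its counting oracle, whereas the real routine counts singular values <= THETA directly on the bidiagonal in linear time [4]; the function name `find_theta` and the test bidiagonal are made up:

```python
import numpy as np

def find_theta(q, e, rank, tol=1e-12):
    """Bisect for THETA so that exactly p - rank singular values of the
    p x p upper bidiagonal J (diagonal q, superdiagonal e) are <= THETA."""
    p = len(q)
    J = np.diag(q) + np.diag(e, 1)
    # Counting oracle: here simply the full spectrum, computed once.
    s = np.linalg.svd(J, compute_uv=False)
    lo, hi = 0.0, float(s[0])
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.sum(s <= mid) >= p - rank:
            hi = mid      # enough singular values at or below mid; tighten
        else:
            lo = mid
    return hi

# Bidiagonal with two O(1) singular values and one tiny one; asking for
# rank 2 should place THETA just above the tiny singular value (~1e-8).
theta = find_theta([3.0, 2.0, 1e-8], [0.5, 1e-9], rank=2)
```

The invariant is that `hi` always admits at least p - RANK singular values at or below it, so the returned THETA separates the small spectrum from the large one to within `tol`.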
        
        Step 3: Back transformation phase
                -------------------------
        3.a):   Apply the Householder transformations of step 1.b) onto the
                columns of U and/or V associated with the subbidiagonals with
                all singular values <= THETA (if U and/or V is desired).
        
        3.b):   If M >= 5*N/3 and U is desired, then apply the Householder
                transformations of step 1.a) onto each computed column of U in
                step 3.a).
        
                NOTE. If M > N (resp., M < N), then the base vectors of the
                orthogonal complement of the column (resp., row) space of the
                M by N matrix A can also be computed, if desired (see MODE), by
                applying step 3 onto the last M - N (resp., N - M) columns of U
                (resp., V).
        
        REFERENCES:
                [1]     S. Van Huffel, J. Vandewalle and A. Haegemans, An efficient and
                        reliable algorithm for computing the singular subspace of a
                        matrix associated with its smallest singular values.
                        J. Comput. and Applied Math., 19 (1987), 313 - 330.
                [2]     S. Van Huffel, Analysis of the total least squares problem and
                        its use in parameter estimation. Doctoral dissertation, Dept. of
                        Electr. Eng., K.U.Leuven, June 1987.
                [3]     T.F. Chan, An improved algorithm for computing the
                        singular value decomposition. ACM Trans. Math. Software,
                        8 (1982), 72-83.
                [4]     S. Van Huffel and J. Vandewalle, The Partial Total Least
                        Squares Algorithm. J. Comput. and Applied Math., 21
                        (1988), to appear.
                [5]     C.L. Lawson, R.J. Hanson, F.T. Krogh and O.R. Kincaid,
                        Basic Linear Algebra Subprograms for FORTRAN Usage. ACM
                        Trans. Math. Software, 5 (1979), 308-323.
                [6]     J.J. Dongarra, J.R. Bunch, C.B. Moler and G.W. Stewart, LINPACK
                        User's Guide. SIAM, Philadelphia (1979).

        NUMERICAL ASPECTS:
        
        Using PSVD, a large reduction in computation time can be gained in
        total least squares applications (cf. [2-4]), in the computation of
        the null space of a matrix, and in solving (non)homogeneous linear
        equations.
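The null-space application mentioned above is easy to demonstrate with modern NumPy: the rows of V^T whose singular values are (numerically) zero form an orthonormal basis for the null space. The matrix and tolerance below are illustrative:

```python
import numpy as np

# Rank-1 matrix, so its null space is two-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vt = np.linalg.svd(A)

# Rank = number of singular values above a scale-aware tolerance.
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

# The trailing rows of Vt span the null space of A (orthonormal basis).
null_basis = Vt[rank:]
residual = np.max(np.abs(A @ null_basis.T))   # should be ~machine precision
```

A partial method like PSVD gains its speed precisely because only these trailing singular vectors, not the full decomposition, are needed for such problems.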
        
        Sabine VAN HUFFEL
        ESAT Laboratory, KU Leuven.
        Kardinaal Mercierlaan 94, 3030 Heverlee, Belgium.
        revisions: 1988, February 15

        This C version RGB DAS July 1993

-- 
Robin Becker


