numpy (matrix solver) - python vs. matlab

someone newsboost at gmail.com
Wed May 2 16:25:11 EDT 2012


On 05/02/2012 08:36 AM, Russ P. wrote:
> On May 1, 11:03 pm, someone<newsbo... at gmail.com>  wrote:
>> On 05/02/2012 01:38 AM, Russ P. wrote:
..
>>> On May 1, 4:05 pm, Paul Rubin<no.em... at nospam.invalid>    wrote:
>> I would really appreciate it if anyone could post a simple SVD
>> example and explain what the vectors from the SVD represent geometrically
>> / visually, because I don't understand it well enough and I'm sure it's
>> very important when it comes to solving matrix systems...
>
> SVD is perhaps the ultimate matrix decomposition and the ultimate tool
> for linear analysis. Google it and take a look at the excellent
> Wikipedia page on it. I would be wasting my time if I tried to compete
> with that.

Ok.

> To really appreciate the SVD, you need some background in linear
> algebra. In particular, you need to understand orthogonal
> transformations. Think about a standard 3D Cartesian coordinate frame.
> A rotation of the coordinate frame is an orthogonal transformation of
> coordinates. The original frame and the new frame are both orthogonal.

Yep.

> A vector in one frame is converted to the other frame by multiplying
> by an orthogonal matrix. The main feature of an orthogonal matrix is
> that its transpose is its inverse (hence the inverse is trivial to
> compute).

As far as I know, you have to replace "orthogonal" with "orthonormal" there.
That much I at least think I know without even going to Wikipedia first...
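Just to convince myself, here's a small numpy sketch I tried (my own toy
example, nothing more), checking the transpose-is-inverse property with a
plain rotation about the z-axis:

import numpy as np

# A 3D rotation about the z-axis by 30 degrees -- an orthonormal matrix.
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Its transpose is its inverse: R.T @ R should be the identity.
print(np.allclose(R.T @ R, np.eye(3)))       # True
print(np.allclose(R.T, np.linalg.inv(R)))    # True

# Converting a vector between frames is just a matrix product.
v = np.array([1.0, 2.0, 3.0])
v_new = R @ v          # vector expressed in the rotated frame
v_back = R.T @ v_new   # the transpose takes it back
print(np.allclose(v, v_back))                # True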

> The SVD can be thought of as factoring any linear transformation into
> a rotation, then a scaling, followed by another rotation. The scaling
> is represented by the middle matrix of the transformation, which is a
> diagonal matrix of the same dimensions as the original matrix. The
> singular values can be read off of the diagonal. If any of them are
> zero, then the original matrix is singular. If the ratio of the
> largest to smallest singular value is large, then the original matrix
> is said to be poorly conditioned.

Aah, thank you very much. I can easily recognize some of this...
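Based on that, this is roughly the kind of minimal SVD example I was asking
for -- my own sketch with an arbitrary made-up 3x2 matrix, so the numbers
themselves don't mean anything:

import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])           # any 3x2 matrix

# Factor A into rotation * scaling * rotation: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

S = np.diag(s)
print(np.allclose(A, U @ S @ Vt))    # True: the factorization reconstructs A

# U and Vt are the "rotations": their columns/rows are orthonormal.
print(np.allclose(U.T @ U, np.eye(2)))
print(np.allclose(Vt @ Vt.T, np.eye(2)))

# The singular values sit on the diagonal of S.  A zero singular value
# would mean A is singular; the ratio largest/smallest is the condition
# number.
print(s)
print(s.max() / s.min())             # same as np.linalg.cond(A)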

> Standard Cartesian coordinate frames are orthogonal. Imagine an x-y
> coordinate frame in which the axes are not orthogonal. Such a
> coordinate frame is possible, but they are rarely used. If the axes
> are parallel, the coordinate frame will be singular and will basically
> reduce to one-dimensional. If the x and y axes are nearly parallel,
> the coordinate frame could still be used in theory, but it will be
> poorly conditioned. You will need large numbers to represent points
> fairly close to the origin, and small deviations will translate into
> large changes in coordinate values. That can lead to problems due to
> numerical roundoff errors and other kinds of errors.

Thank you very much for your time. It always helps to get the same 
explanation from different people with slightly different ways of 
explaining it. Thanks!
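To see the "nearly parallel axes" point in numpy terms, I also tried a small
made-up 2x2 example: the two basis vectors are almost parallel, the condition
number blows up, and a tiny change on the right-hand side shifts the solution
by a lot (at least, that's how I read it):

import numpy as np

# Two basis vectors that are nearly parallel -> a poorly conditioned frame.
A = np.array([[1.0, 1.0],
              [0.0, 1e-6]])
print(np.linalg.cond(A))             # huge: ratio of the singular values

b = np.array([1.0, 1.0])
x = np.linalg.solve(A, b)
print(x)                             # large coordinates for a modest point

# A tiny perturbation of b produces a big change in the solution.
b2 = b + np.array([0.0, 1e-6])
x2 = np.linalg.solve(A, b2)
print(x2 - x)                        # not tiny at all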
