# [PYTHON MATRIX-SIG] linear algebra

**Carlos Fonseca**

fonseca@gaivota.demon.co.uk

*Mon, 19 Aug 1996 22:10:37 +0100 (BST)*

On Mon, 19 Aug 1996, Jim Hugunin wrote:
> All of the current interface routines (from LinAlg.py) do the following:
>
>     return transpose(lapackfunction(transpose(input_matrix)))
>
> Should these two transpositions be there? This is obviously related to
> the different index order between NumPy and FORTRAN. Still, I've always
> personally considered it a superficial difference that is best ignored.
>
> If I remember my Linear Algebra correctly,
> transpose(inverse(transpose(m))) == inverse(m).
>
> However, I very rarely use any linear algebra functions in my code, so
> I'm not the right person to be making that decision. Opinions?
>
> -Jim
As long as the result is mathematically equivalent, there should be no
need for the transpose() calls. Equivalences of this kind should be
exploited as much as possible, even if they apply to only a small
number of operations.
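The identity Jim cites can be checked numerically. A minimal sketch, written with today's NumPy names (np.linalg postdates this thread, but the mathematics is the same): since (M^T)^(-1) = (M^(-1))^T, transposing before and after the inverse is a no-op.

```python
import numpy as np

# A small nonsingular test matrix.
m = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# transpose(inverse(transpose(m))) should equal inverse(m),
# because (M^T)^{-1} = (M^{-1})^T.
lhs = np.linalg.inv(m.T).T
rhs = np.linalg.inv(m)

assert np.allclose(lhs, rhs)
```

Note that the identity holds only for operations that commute with transposition in this way; it is a property of the inverse, not of LAPACK wrappers in general.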
Thus, if LAPACK implements C = A^(-1).B efficiently (in Fortran style),
then Python should compute c = b.a^(-1) equally efficiently, simply by
passing the C arrays unchanged to the Fortran code (with the dimensions
appropriately swapped, and without any copying or transposing of the
result). In general, Python should be as good with row vectors (1-d
vectors *are* row vectors in Python) as LAPACK is with column vectors.
In MATLAB, A\b is more efficient than b/A (the second is implemented as
(A'\b')'), but both are available. I am happy for Python to expect row
vectors if this makes the interface code simpler and more efficient.
(Sorry, I can't comment on whether LAPACK requires contiguous arrays or
not.)
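The row-vector argument above can be sketched in modern NumPy terms (assumed names; the Numeric API of 1996 differed). Solving x.a = b under the row-vector convention is the transpose of a standard column-vector solve, a.T x.T = b.T, and NumPy transposes are views, so no data is copied:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))   # coefficient matrix
b = rng.standard_normal((2, 3))   # each row of b is a row vector

# Row-vector convention: solve x @ a == b for x.
# Transposing both sides gives a.T @ x.T == b.T, which is the
# standard column-vector problem that a Fortran solver expects.
# The .T operations are views, so no copying is introduced here.
x = np.linalg.solve(a.T, b.T).T

assert np.allclose(x @ a, b)
```

This is exactly the (A'\b')' trick MATLAB uses for b/A, done explicitly.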
Carlos
=================
MATRIX-SIG - SIG on Matrix Math for Python
send messages to: matrix-sig@python.org
administrivia to: matrix-sig-request@python.org
=================