[Python-ideas] [RFC] draft PEP: Dedicated infix operators for matrix multiplication and matrix power
Sturla Molden
sturla.molden at gmail.com
Wed Mar 26 19:19:15 CET 2014
On 24/03/14 01:49, Steven D'Aprano wrote:
> No it isn't. The PEP even discusses it:
(...)
Well, even with a single dimension, matrix multiplication has a useful
interpretation (the inner product). At least the intention in NumPy is to
interpret "vector @ vector" as the inner product. Thus it would be useful
for lists and arrays in the standard library as well. A possible use case
in the standard library would be to simplify the code in the statistics
module. But the main consideration is to make @ benign to the rest of the
Python community.
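As a sketch of that 1-D interpretation (assuming @ on vectors carries numpy.dot semantics, and that the plain-Python form is the kind of sum-of-products the statistics module computes):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# For 1-D arrays, "x @ y" would mean the inner product,
# i.e. the same scalar that np.dot(x, y) returns.
inner = float(np.dot(x, y))

# For plain Python sequences the same operation is a sum of
# products -- the kind of loop "vector @ vector" could replace.
xs, ys = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
inner_py = sum(a * b for a, b in zip(xs, ys))

print(inner, inner_py)  # both 32.0
```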
> I don't think we should be looking for additional use-cases for @.
> Either the PEP stands on its own, and @ is approved for matrix
> multiplication (and possibly @@ for exponentiation),
The consensus on the NumPy list seems to be that @@ can wait. It is also
ambiguous. For example, matrix @@ 0.5 could be interpreted as the Cholesky
factorization or as the matrix square root, depending on context.
Also, an expression like Z = (X @@ -1) @ Y should, for numerical stability
(and speed), be computed as "solve X @ Z = Y", e.g. with an LU, SVD or QR
factorization. This would require calling a function like solve(X, Y).
Matlab actually has a special back-divide operator for this reason: X \ Y.
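A minimal sketch of the contrast, using numpy.linalg (the matrices here are arbitrary examples; the solve route factorizes X rather than inverting it):

```python
import numpy as np

rng = np.random.default_rng(0)
# A well-conditioned example system X @ Z = Y.
X = rng.standard_normal((4, 4)) + 4.0 * np.eye(4)
Y = rng.standard_normal((4, 2))

# Stable/fast route: solve the system directly (LU factorization
# under the hood) -- the "X \ Y" idea.
Z_solve = np.linalg.solve(X, Y)

# Naive route: what (X @@ -1) @ Y would literally spell out.
# Same answer here, but slower and less accurate when X is
# ill-conditioned.
Z_inv = np.dot(np.linalg.inv(X), Y)

assert np.allclose(Z_solve, Z_inv)
```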
This means that @@ is unambiguous only for positive integer powers or -1.
But the common case of matrix @@ -1 is not as useful as we might think.
That leaves exponentiation to positive integer powers, which is not
nearly common enough to justify a special operator.
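For what it's worth, that remaining case is already covered by a function in NumPy (repeated squaring for integer exponents), which is part of why a dedicated operator is hard to justify:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])

# What "A @@ 3" would mean for a positive integer power:
# A multiplied by itself three times, computed by repeated squaring.
A3 = np.linalg.matrix_power(A, 3)
print(A3)  # [[1 3]
           #  [0 1]]
```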
Sturla