[Python-ideas] [RFC] draft PEP: Dedicated infix operators for matrix multiplication and matrix power

Oscar Benjamin oscar.j.benjamin at gmail.com
Sat Mar 15 13:20:42 CET 2014


On 15 March 2014 12:06, Antoine Pitrou <solipsis at pitrou.net> wrote:
>> > The possible reason given in the PEP is very weak and amounts to
>> > premature optimization:
>>
>> I don't think it's just a matter of optimization. Often,
>> matrix @ vector represents a linear operator acting on an
>> element of a vector space. When you chain them,
>>
>>     A @ B @ C @ v
>>
>> conceptually represents acting on v with C, then B,
>> then A.
>
> It can just as well represent "acting" on v with (A @ B @ C).
>
> Of course, mathematically it shouldn't make a difference, but in
> computer programming right-associative operators are always a special
> case, and therefore an additional cognitive burden.

I don't think it's a premature optimisation. It's a significant
algorithmic optimisation.

A @ (B @ (C @ v))    # 3 matrix-vector multiplies
(((A @ B) @ C) @ v)  # 2 matrix-matrix multiplies and one matrix-vector multiply

If the matrices are NxN and the vector has length N then each
matrix-vector multiplication can be performed with the asymptotically
optimal N**2 operations, but each matrix-matrix multiplication requires
something like N**3 (naive) or N**2.8 (Strassen) operations.
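To make the gap concrete, here is a minimal sketch (the helper names
and the naive N**2 / N**3 cost model are my own, not anything from
numpy) counting scalar multiplications for the two parenthesisations:

```python
# Hypothetical cost sketch: count scalar multiplications for the two
# groupings of A @ B @ C @ v, with NxN matrices and a length-N vector,
# assuming the naive N**2 matrix-vector and N**3 matrix-matrix algorithms.

def right_assoc_cost(n):
    """A @ (B @ (C @ v)): three matrix-vector products, N**2 each."""
    return 3 * n**2

def left_assoc_cost(n):
    """((A @ B) @ C) @ v: two matrix-matrix products plus one matrix-vector."""
    return 2 * n**3 + n**2

for n in (10, 100, 1000):
    print(n, right_assoc_cost(n), left_assoc_cost(n))
```

For N = 1000 the right-associative grouping needs on the order of
3 million multiplications versus roughly 2 billion for the
left-associative one, so the choice of associativity changes the
asymptotic cost by a factor of about N.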

It is possible, though unusual, to write this the other way round (in
which case the optimisation would favour left-associativity). In any
case, many users of these operators will not know the difference
between left and right associativity and will either use brackets or
just write the expression out and hope that Python/numpy do the right
thing.


Oscar

