On Sat, Mar 15, 2014 at 12:27:33PM +0100, Antoine Pitrou wrote:
> The real question is why @ would be right-associative.
That's a good question, but I don't think that should be up to us to decide. Guido clearly stated that if it is more useful to define @ as right-associative, we shouldn't let the fact that most operators are left-associative get in the way of doing the right thing here. The only people who are in a position to decide that are the users of matrix multiplication. [...]
> The possible reason given in the PEP is very weak and amounts to
> premature optimization:
I don't think it's premature optimization. Sometimes we do know ahead of time that a calculation done one way will be faster than doing it another way: you don't have to "try it and see" to realise that repeatedly adding strings in a for-loop can be O(N**2) versus O(N) for using str.join(). [Aside: if you do try it, the string-concat reference-counting optimization of CPython may fool you into thinking that concatenation is generally fast. It isn't.]

Likewise there is nothing premature about the fact that "matrix-vector multiplication is much cheaper than matrix-matrix multiplication". The only question is whether it is more common to write:

    Matrix @ Matrix @ Column_Vector

or:

    Row_Vector @ Matrix @ Matrix

I'll leave it to those who do matrix maths to decide which they use more often, but personally I've never come across the second case except in schoolbook exercises.
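The cost asymmetry is easy to make concrete with a back-of-the-envelope flop count (the sizes and the simple cost model below are illustrative assumptions on my part, not figures from the PEP):

```python
# Cost model (assumption for illustration): a (p x q) @ (q x r) product
# takes p*q*r scalar multiplications.

def matmul_cost(p, q, r):
    """Scalar multiplications for a (p x q) @ (q x r) product."""
    return p * q * r

n = 1000  # hypothetical square-matrix dimension

# Left-associative: (Mat @ Mat) @ vec
# -- one n x n by n x n product, then one n x n by n x 1 product.
left = matmul_cost(n, n, n) + matmul_cost(n, n, 1)

# Right-associative: Mat @ (Mat @ vec)
# -- two matrix-vector products, each n x n by n x 1.
right = matmul_cost(n, n, 1) + matmul_cost(n, n, 1)

print(left)   # 1,001,000,000 multiplications
print(right)  # 2,000,000 multiplications
```

Under this model the right-associative grouping is roughly 500x cheaper for Mat @ Mat @ vec, which is the whole of the PEP's argument; the left-associative grouping wins by the same margin for Row_Vector @ Mat @ Mat.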
"""It's been suggested that @ should be right-associative, on the grounds that for expressions like Mat @ Mat @ vec, the two different evaluation orders produce the same result, but the right-associative order Mat @ (Mat @ vec) will be faster and use less memory than the left-associative order (Mat @ Mat) @ vec. (Matrix-vector multiplication is much cheaper than matrix-matrix multiplication)."""
> If that's the only reason, then I'd like @ to be left-associative.
I'm not sure that premature pessimization is much of an improvement over premature optimization. *wink*

-- 
Steven