On 2014-03-14 17:53, Guido van Rossum wrote:
> I have now read the PEP, and I think it's good. I think it's a waste of time to keep bikeshedding on the choice of operator -- @ is the best compromise. I do have a few specific notes:
> - Right associativity is not unheard of in Python. E.g. **. If you think that for other reasons @ should be right associative, don't let Python's tradition stop you. But then you need to decide which of * and @ binds more tightly -- e.g. does a*b@c mean a*(b@c) or (a*b)@c? And if you choose the latter, it follows that a@b*c means a@(b*c) -- is that okay? (And similar examples exist for the other choice.)
I *think* either works out fine in practice, but I have a preference for @ binding tighter than *. `scalar * matrix @ vector` does fewer flops that way, since the scalar then scales only the length-n result vector rather than the full n-by-n matrix, and most current expressions are written with this binding anyway: `scalar * np.dot(matrix, vector)`. It just feels right to me.
> - Did you consider a duck-typing (is that the word?) attribute?
Facade?
> E.g. a*b is elementwise multiplication; a.M*b must be used for matrix multiplication. (Your use of .T as "transpose" made me think of this.) Of course the question is, can you get those packages that currently use * for matrix multiply to comply? (I don't consider this a serious counter-proposal. But you list a bunch of rejected alternatives; this could be in that list.)
It seems to me that the left-associativity of * makes this less useful than a dedicated operator if we consider chains of matrix multiplications:

    A.M * B.M * C == (A.M * B.M) * C

We probably could make the rule that if the right operand is one of these matmul-facades, then the result should also be a matmul-facade, but otherwise would be a plain numpy array. That would take care of this case, but it's not obvious to me that it's unambiguously the right thing to do in all cases.
> - Is @@ really necessary? It seems you are adding it mostly because it's cute and because of the parallel with **, not because it is actually important enough to add new syntax. And then later you use it as an argument for @, which seems a bit circular. Also, if we were to make @ right-associative, the parallel with ** is already imperfect.
In my personal experience, I would use a matrix power operator much less often than a matrix multiplication operator. I, at least, am content to continue to use a function call for that.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
  -- Umberto Eco