On 2014-03-15 12:06, Antoine Pitrou wrote:
> On Sun, 16 Mar 2014 00:55:09 +1300 Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
>> Antoine Pitrou wrote:
>>> The possible reason given in the PEP is very weak and amounts to premature optimization:
>>
>> I don't think it's just a matter of optimization. Often, matrix @ vector represents a linear operator acting on an element of a vector space. When you chain them,
>>
>>     A @ B @ C @ v
>>
>> conceptually represents acting on v with C, then B, then A.
>
> It can just as well represent "acting" on v with (A @ B @ C).
>
> Of course, mathematically it shouldn't make a difference, but in computer programming right-associative operators are always a special case, and therefore an additional cognitive burden.
I think his point was that people doing linear algebra tend to read these expressions "right-to-left" anyway because of that conceptual model. I'm not entirely sure how supportable that is in the general population, but it's certainly the way I think about these expressions. For me, left-associative causes an additional cognitive burden, but it's a minor one either way.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
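The associativity point above can be sketched concretely. The following is a minimal, self-contained illustration using a toy `Matrix` class (defined here purely for the example, not from any library) rather than numpy: Python's `@` is left-associative, so `A @ B @ C @ v` parses as `((A @ B) @ C) @ v`, yet the "act on v with C, then B, then A" grouping gives the same result.

```python
class Matrix:
    """Tiny dense matrix over nested lists, just for illustration."""

    def __init__(self, rows):
        self.rows = [list(r) for r in rows]

    def __matmul__(self, other):
        # Standard triple-loop matrix product: (n x m) @ (m x k) -> (n x k).
        n = len(self.rows)
        m = len(other.rows)
        k = len(other.rows[0])
        return Matrix([[sum(self.rows[i][t] * other.rows[t][j]
                            for t in range(m))
                        for j in range(k)]
                       for i in range(n)])

    def __eq__(self, other):
        return self.rows == other.rows


A = Matrix([[1, 2], [3, 4]])
B = Matrix([[0, 1], [1, 0]])
C = Matrix([[2, 0], [0, 2]])
v = Matrix([[1], [1]])  # a column vector, as a 2x1 matrix

left = ((A @ B) @ C) @ v    # Python's actual (left-associative) grouping
right = A @ (B @ (C @ v))   # the "act on v first" reading

assert left == right            # same result either way
assert A @ B @ C @ v == left    # bare chain groups to the left
```

Note that while the result is the same, the right-to-left grouping only ever forms matrix-vector products (the intermediates stay vectors), whereas the left-to-right grouping forms matrix-matrix products first. That cost difference is the "optimization" argument the thread refers to, separate from the readability question.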