Different people work on different code and have different experiences here -- yours may or may not be typical. Pauli did some quick checks on scikit-learn & nipy & scipy, and found that in their test suites, uses of np.dot and uses of elementwise multiplication are ~equally common:
https://github.com/numpy/numpy/pull/4351#issuecomment-37717330
My impression from the other thread is that @@ probably won't end up existing, so you're safe here ;-).
I know; my point is that the same objections apply to @, albeit in weaker form.
Einstein notation is coming up on its 100th birthday and is just as blackboard-friendly as matrix product notation. Yet there's still a huge number of domains where the matrix notation dominates. It's cool if you aren't one of the people who find it useful, but I don't think it's going anywhere soon.
Einstein notation is just as blackboard-friendly, but also much more computer-future-proof. I am not saying matrix multiplication is going anywhere soon; but as far as I can tell that is all inertia: historical circumstance has not accidentally prepared it well for numerical needs.
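To make that concrete, here is a quick NumPy sketch of the two notations (the arrays are arbitrary made-up examples, and np.dot stands in for the proposed @ operator):

    import numpy as np

    # A plain matrix product reads fine either way:
    A = np.random.rand(2, 3)
    B = np.random.rand(3, 4)
    C1 = np.dot(A, B)                  # matrix notation (what @ would spell)
    C2 = np.einsum('ij,jk->ik', A, B)  # Einstein notation: C_ik = A_ij B_jk
    assert np.allclose(C1, C2)

    # einsum also spells contractions that matrix notation only reaches
    # via reshapes, e.g. R_il = sum over j,k of T_ijk * U_jkl:
    T = np.random.rand(2, 3, 4)
    U = np.random.rand(3, 4, 5)
    R1 = np.einsum('ijk,jkl->il', T, U)
    R2 = np.dot(T.reshape(2, 12), U.reshape(12, 5))
    assert np.allclose(R1, R2)

The second contraction is the kind of case where index notation keeps working while matrix notation needs bookkeeping.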
The analysis in the PEP found ~780 calls to np.dot, just in the two projects I happened to look at. @ will get tons of use in the real world. Maybe all those people who will be using it would be happier if they were using einsum instead, I dunno, but it's an argument you'll have to convince them of, not me :-).
780 calls is not tons of use, and these projects are outliers, I'd argue.
I just read for the first time two journal articles in econometrics that use einsum notation. I have no idea what their formulas are supposed to mean: no sum signs and no matrix algebra.
If they could have been expressed more clearly otherwise, of course that is what they should have done; but could they? b_i = A_ij x_j isn't exactly hard to read, but if it was some form of complicated product, tensor notation was probably their best bet.
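For the record, that b_i = A_ij x_j example in NumPy (a minimal sketch with made-up inputs):

    import numpy as np

    A = np.arange(6.0).reshape(2, 3)
    x = np.array([1.0, 2.0, 3.0])

    # b_i = A_ij x_j, three equivalent spellings:
    b1 = np.dot(A, x)                # matrix notation
    b2 = np.einsum('ij,j->i', A, x)  # Einstein notation, summing over j
    b3 = (A * x).sum(axis=1)         # explicit elementwise product + sum
    assert np.allclose(b1, b2) and np.allclose(b2, b3)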