For the dot product I can convince myself this is a math-definition thing and accept the conjugation. But for "vecmat", why the complex conjugate of the vector? Are we assuming that 1-D things are always columns? I am also a bit lost on the difference between dot, vdot and vecdot.
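
For concreteness, here is how I understand the three to behave on complex inputs (just a sketch; the vecdot line assumes the conjugate-first-argument definition mentioned below):

    import numpy as np

    x = np.array([1 + 2j, 3 - 1j])
    y = np.array([2 - 1j, 1 + 4j])

    # np.dot: plain sum of products, no conjugation.
    print(np.dot(x, y))       # sum(x * y)

    # np.vdot: conjugates its *first* argument (and flattens its inputs).
    print(np.vdot(x, y))      # sum(conj(x) * y)

    # np.vecdot: also conjugates the first argument, but works on the
    # last axis and broadcasts over leading "stack" axes.
    xs = np.stack([x, x])     # shape (2, 2): a stack of two vectors
    ys = np.stack([y, y])
    print(np.vecdot(xs, ys))  # shape (2,): conj(x) . y for each pair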

Also, if __matmul__ and np.matmul give different results, I think you will enjoy many fun tickets. Personally, I would agree with the reporters, no matter what the reasoning was at the time of divergence.

On Tue, Jan 23, 2024 at 11:17 PM Marten van Kerkwijk <mhvk@astro.utoronto.ca> wrote:
Hi All,

I have a PR [1] that adds `np.matvec` and `np.vecmat` gufuncs for
matrix-vector and vector-matrix calculations, to add to plain
matrix-matrix multiplication with `np.matmul` and the inner vector
product with `np.vecdot`.  They call BLAS where possible for speed.
I'd like to hear whether these are good additions.
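
Roughly, in terms of einsum, the intended semantics are as follows (an illustrative sketch only; it assumes matvec is the plain A x with no conjugation, and the actual gufuncs call BLAS and broadcast over leading stack dimensions):

    import numpy as np

    A = np.arange(6.0).reshape(2, 3) + 1j   # a (2, 3) matrix
    x = np.array([1 + 1j, 2 - 1j, 0 + 3j])  # length-3 vector
    y = np.array([1 - 2j, 2 + 0j])          # length-2 vector

    # matvec: ordinary matrix-vector product, A x, no conjugation.
    matvec_ref = np.einsum('...ij,...j->...i', A, x)

    # vecmat: vector-matrix product with the vector conjugated, y† A.
    vecmat_ref = np.einsum('...i,...ij->...j', np.conj(y), A)

    # vecdot: inner product with the first argument conjugated, x† x.
    vecdot_ref = np.einsum('...i,...i->...', np.conj(x), x)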

I also note that for complex numbers, `vecmat` is defined as `x†A`,
i.e., the complex conjugate of the vector is taken. This seems to be the
standard and is what we used for `vecdot` too (`x†x`). However, it is
*not* what `matmul` does for vector-matrix or indeed vector-vector
products (remember that those are possible only if the vector is
one-dimensional, i.e., not with a stack of vectors). I think this is a
bug in matmul, which I'm happy to fix. But I'm posting here in part to
get feedback on that.
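
To make the difference concrete (a small illustration; I spell out the conjugation with np.conj rather than rely on the new gufunc):

    import numpy as np

    A = np.eye(2) * (1 + 0j)
    x = np.array([1j, 2 + 1j])

    # matmul with a 1-D vector on the left does *not* conjugate:
    print(x @ A)           # [0.+1.j, 2.+1.j]  -- plain x A

    # the vecmat definition conjugates the vector, i.e. x† A:
    print(np.conj(x) @ A)  # [0.-1.j, 2.-1.j]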

Thanks!

Marten

[1] https://github.com/numpy/numpy/pull/25675

p.s. Separately, with these functions available, they could in principle be used in `__matmul__` (and thus for `@`), and the specializations in `np.matmul` removed. But that can be a separate PR (if it is wanted at all).