On May 22, 2015 11:00 AM, "Alexander Belopolsky" <ndarray@mac.com> wrote:
>
>
> On Thu, May 21, 2015 at 9:37 PM, Nathaniel Smith <njs@pobox.com> wrote:
> >
> > .. there's been some discussion of the possibility of
>
> > adding specialized gufuncs for broadcasted vector-vector,
> > vector-matrix, matrix-vector multiplication, which wouldn't do the
> > magic vector promotion that dot and @ do.
>
>
> This would be nice.  What I would like to see is some consistency between multi-matrix
> support in linalg methods and dot.
>
> For example, when A is a matrix and b is a vector and
>
> a = linalg.solve(A, b)
>
> then
>
> dot(A, a) returns b, but if either or both of A and b are stacks, this invariant does not hold.  I would like
> to see a function (say xdot) that I can use instead of dot, so that xdot(A, a) returns b whenever a = linalg.solve(A, b).

I believe this equivalence holds if xdot(x, y) = x @ y, because solve() does follow the PEP 465 semantics for shape handling. Or at least, it's intended to. Of course we will also expose PEP 465 matmul semantics under some name that doesn't require the new syntax (probably not "xdot" though ;-)).
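
Quick sketch of what I mean for the stacked case (untested as I type this, and assuming a Python/NumPy new enough to have the @ operator):

    import numpy as np

    # A stack of three independent 4x4 systems.  Keeping b as explicit
    # (3, 4, 1) column matrices sidesteps any question of how stacked
    # 1-D right-hand sides get promoted.
    A = np.random.randn(3, 4, 4)
    b = np.random.randn(3, 4, 1)

    x = np.linalg.solve(A, b)          # one solve per matrix, shape (3, 4, 1)
    np.testing.assert_allclose(A @ x, b, atol=1e-10)

(The column-matrix shape for b is deliberate; the stacked-1-D-vector case is the murkier one, so I'm not claiming anything about it here.)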

> Similarly, if w, v = linalg.eig(A), then dot(A, v) returns w * v, but only if A is 2d.

Again, I believe A @ v does the right thing, though I'm not positive -- you might need a swapaxes or matvec or something. Let us know if you work it out :-).

Note that it still won't be equivalent to w * v because w * v doesn't broadcast the way you want :-). You need w[..., np.newaxis, :] * v, I think.
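
Something along these lines is what I'd try for the eig case (again untested here, same @-capable NumPy assumption):

    import numpy as np

    A = np.random.randn(3, 4, 4)        # a stack of three 4x4 matrices

    w, v = np.linalg.eig(A)             # w: (3, 4) eigenvalues, v: (3, 4, 4)

    # Each column of v[i] is an eigenvector of A[i], so A @ v scales the
    # columns of each matrix in the stack; the elementwise comparison needs
    # the eigenvalues broadcast across the row axis.
    np.testing.assert_allclose(A @ v, w[..., np.newaxis, :] * v, atol=1e-10)

(For a random real A, w and v will generally come back complex, which is fine for the comparison.)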

-n
