[Numpy-discussion] Dot/inner products with broadcasting?
jaakko.luttinen at aalto.fi
Thu Mar 14 07:54:06 EDT 2013
Answering my own question: this pull request seems to implement an inner
product with broadcasting (inner1d) and many other useful functions:
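As an illustration of the semantics inner1d is meant to provide, the broadcasting inner product can be sketched with np.einsum (which the original question already mentions); this is only a sketch of the desired behavior, not the implementation from the pull request:

```python
import numpy as np

A = np.random.randn(2, 3, 4)
B = np.random.randn(3, 4)

# Sum over the last axis, broadcasting all leading axes:
out = np.einsum('...i,...i->...', A, B)

# Matches the elementwise multiply-and-sum formulation:
assert out.shape == (2, 3)
assert np.allclose(out, np.sum(A * B, axis=-1))
```

The ellipsis in the einsum subscripts is what makes the leading axes broadcast against each other, exactly as in `A * B`.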
On 03/13/2013 04:21 PM, Jaakko Luttinen wrote:
> How can I compute dot product (or similar multiply&sum operations)
> efficiently so that broadcasting is utilized?
> For multi-dimensional arrays, NumPy's inner and dot functions do not
> match the leading axes and use broadcasting, but instead the result has
> first the leading axes of the first input array and then the leading
> axes of the second input array.
> For instance, I would like to compute the following inner-product:
> np.sum(A*B, axis=-1)
> But numpy.inner gives:
> A = np.random.randn(2,3,4)
> B = np.random.randn(3,4)
> np.inner(A,B).shape
> # -> (2, 3, 3) instead of (2, 3)
> Similarly for dot product, I would like to compute for instance:
> np.sum(A[...,:,:,np.newaxis]*B[...,np.newaxis,:,:], axis=-2)
> But numpy.dot gives:
> In : A = np.random.randn(2,3,4); B = np.random.randn(2,4,5)
> In : np.dot(A,B).shape
> # -> (2, 3, 2, 5) instead of (2, 3, 5)
> I could use einsum for these operations, but I'm not sure whether that's
> as efficient as using some BLAS-supported(?) dot products.
> I couldn't find any function which could perform this kind of
> operation. NumPy's functions seem to either flatten the input arrays
> (vdot, outer) or just use the axes of the input arrays separately (dot,
> inner, tensordot).
> Any help?
> Best regards,
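The broadcasting dot product asked for above can likewise be sketched with np.einsum, the fallback the question itself mentions; a minimal example matching the shapes from the question:

```python
import numpy as np

A = np.random.randn(2, 3, 4)
B = np.random.randn(2, 4, 5)

# Matrix-multiply over the last two axes, broadcasting the leading axes:
C = np.einsum('...ij,...jk->...ik', A, B)

# Matches the explicit multiply-and-sum formulation from the question:
ref = np.sum(A[..., :, :, np.newaxis] * B[..., np.newaxis, :, :], axis=-2)
assert C.shape == (2, 3, 5)
assert np.allclose(C, ref)
```

Whether einsum here reaches BLAS-level performance is a separate question, as the original post notes; this only demonstrates that the desired (2, 3, 5) result shape is expressible.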