[Numpy-discussion] Dot/inner products with broadcasting?

Jaakko Luttinen jaakko.luttinen at aalto.fi
Wed Mar 13 10:21:13 EDT 2013


Hi!

How can I compute a dot product (or a similar multiply-and-sum
operation) efficiently so that broadcasting is utilized?
For multi-dimensional arrays, NumPy's inner and dot functions do not
broadcast over matching leading axes; instead, the result carries first
the leading axes of the first input array and then the leading axes of
the second input array.

For instance, I would like to compute the following inner product:
np.sum(A*B, axis=-1)

But numpy.inner gives:
A = np.random.randn(2,3,4)
B = np.random.randn(3,4)
np.inner(A,B).shape
# -> (2, 3, 3) instead of (2, 3)
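
For reference, einsum with ellipsis broadcasting does give the shape I
want here (just a sketch, reusing A and B from above):

C = np.einsum('...i,...i->...', A, B)  # sum of products over the last axis
C.shape
# -> (2, 3)
np.allclose(C, np.sum(A*B, axis=-1))
# -> True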

Similarly, for the dot product I would like to compute, for instance:
np.sum(A[...,:,:,np.newaxis]*B[...,np.newaxis,:,:], axis=-2)

But numpy.dot gives:
A = np.random.randn(2,3,4)
B = np.random.randn(2,4,5)
np.dot(A,B).shape
# -> (2, 3, 2, 5) instead of (2, 3, 5)
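
Again, einsum can express this (a sketch, with A and B as above):

# Broadcast the leading axes, matrix-multiply over the last two axes.
C = np.einsum('...ij,...jk->...ik', A, B)
C.shape
# -> (2, 3, 5)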

I could use einsum for these operations, but I'm not sure whether it is
as efficient as a BLAS-backed(?) dot product.
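
For what it's worth, the two can be timed directly; here is a minimal
sketch (array sizes and repeat count are arbitrary) comparing np.dot
with the equivalent einsum call on 2-D inputs:

import timeit

setup = "import numpy as np; A = np.random.randn(200, 200); B = np.random.randn(200, 200)"
# Time the (presumably BLAS-backed) matrix product against einsum.
t_dot = timeit.timeit("np.dot(A, B)", setup=setup, number=100)
t_einsum = timeit.timeit("np.einsum('ij,jk->ik', A, B)", setup=setup, number=100)
print("dot:    %.4f s" % t_dot)
print("einsum: %.4f s" % t_einsum)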

I couldn't find any function that performs this kind of operation.
NumPy's functions seem to either flatten the input arrays (vdot, outer)
or handle the axes of the two inputs separately instead of broadcasting
them (dot, inner, tensordot).

Any help?

Best regards,
Jaakko


