[Numpy-discussion] Optimized sum of squares

josef.pktd at gmail.com
Tue Oct 20 13:16:18 EDT 2009


On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben <gruben at bigpond.net.au> wrote:
> Hi Gaël,
>
> If you've got a 1D array/vector called "a", I think the normal idiom is
>
> np.dot(a,a)
>
> For the more general case, I think
> np.tensordot(a, a, axes=something_else)
> should do it, where you should be able to figure out something_else for
> your particular case.
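
For the 1d case that does give the same thing, e.g. (a quick check with a made-up vector):

import numpy as np

a = np.arange(5.)    # made-up 1d example
np.dot(a, a)         # 30.0
np.sum(a * a)        # 30.0, same value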

Is it really possible to get the same as np.sum(a*a, axis) with
tensordot if a.ndim == 2?
However I choose the "something_else" axes, I get extra cross terms, as in np.dot(a.T, a).
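
For example, with a made-up 3x2 array:

import numpy as np

a = np.arange(6.).reshape(3, 2)   # made-up 2d example
np.sum(a * a, axis=0)             # array([ 20.,  35.])   per-column sum of squares
np.tensordot(a, a, axes=(0, 0))   # [[ 20.,  26.],
                                  #  [ 26.,  35.]]   same as np.dot(a.T, a);
                                  # the off-diagonal 26. are the extra terms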

Josef

>
> Gary R.
>
> Gael Varoquaux wrote:
>> On Sat, Oct 17, 2009 at 07:27:55PM -0400, josef.pktd at gmail.com wrote:
>>>>>> Why aren't you using logaddexp ufunc from numpy?
>>
>>>>> Maybe because it is difficult to find; it doesn't have its own docs entry.
>>
>> Speaking of which...
>>
>> I thought that there was a readily-written, optimized function (or ufunc)
>> in numpy or scipy that calculated the sum of squares for an array
>> (possibly along an axis). However, I cannot find it.
>>
>> Is there something similar? If not, it is not the end of the world; the
>> operation is trivial to write.
>>
>> Cheers,
>>
>> Gaël
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>


