[Numpy-discussion] Add a function to broadcast arrays to a given shape to numpy's stride_tricks?
Jaime Fernández del Río
jaime.frio at gmail.com
Fri Dec 12 08:48:03 EST 2014
On Thu, Dec 11, 2014 at 10:53 AM, Stephan Hoyer <shoyer at gmail.com> wrote:
> On Thu, Dec 11, 2014 at 8:17 AM, Sebastian Berg <
> sebastian at sipsolutions.net> wrote:
>> One option
>> would also be to have something like:
>> np.broadcast_to(array, shape)
>> # (though I would like many arrays too)
>> and then broadcast_arrays could be implemented in terms of these two.
> It looks like np.broadcast lets us write the common_shape function very easily:
> def common_shape(*args):
> return np.broadcast(*args).shape
> And it's also very fast:
> 1000000 loops, best of 3: 1.04 µs per loop
> So that does seem like a feasible refactor/simplification for
> Sebastian -- if you're up for writing np.broadcast_to in C, that's great!
> If you're not sure if you'll be able to get around to that in the near
> future, I'll submit my PR with a Python implementation (which will have
> tests that will be useful in any case).
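For reference, the quoted one-liner works as advertised; here is a minimal, self-contained illustration of the suggested common_shape (the example arrays are my own, chosen only to show the broadcasting):

```python
import numpy as np

def common_shape(*args):
    # np.broadcast computes the broadcast result shape without
    # materializing any data, as in the quoted sketch.
    return np.broadcast(*args).shape

# A (5, 1) array against a length-3 array broadcasts to (5, 3):
print(common_shape(np.ones((5, 1)), np.ones(3)))  # -> (5, 3)
```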
np.broadcast is the Python object of the old iterator. It may be a better
idea to write all of these functions using the new one, np.nditer:
return np.nditer(args).shape[::-1] # Yes, you do need to reverse it!
And in writing 'broadcast_to', rather than rewriting the broadcasting
logic, you could check the compatibility of the shape with something like:
np.nditer((arr,), itershape=shape)  # raises ValueError if shapes are incompatible
After that, all that would be left is prepending some zero strides for the new
leading dimensions, zeroing the strides of length-1 dimensions, and calling as_strided.
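Putting those steps together, a sketch of that approach might look like the
following (the function name is hypothetical, and the nditer compatibility
check is used exactly as suggested above):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

def broadcast_to_sketch(arr, shape):
    """Sketch of broadcast_to via nditer validation + as_strided."""
    arr = np.asarray(arr)
    # Let nditer check broadcast compatibility; it raises ValueError
    # if arr cannot be broadcast to the requested shape.
    np.nditer((arr,), itershape=shape)
    # Prepend zero strides for the new leading dimensions, and zero the
    # strides of the original length-1 dimensions so they repeat in place.
    n_extra = len(shape) - arr.ndim
    strides = (0,) * n_extra + tuple(
        0 if size == 1 else stride
        for size, stride in zip(arr.shape, arr.strides))
    return as_strided(arr, shape=shape, strides=strides)
```

Note that the result is a read-only-in-spirit view sharing memory with the
input, so writing to it would write to overlapping locations.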
( > <) This is Bunny. Copy Bunny into your signature and help him with his
plans for world domination.