[Numpy-discussion] Adding to the non-dispatched implementation of NumPy methods
shoyer at gmail.com
Thu Apr 25 19:02:07 EDT 2019
On Thu, Apr 25, 2019 at 3:39 PM Ralf Gommers <ralf.gommers at gmail.com> wrote:
> On Fri, Apr 26, 2019 at 12:04 AM Stephan Hoyer <shoyer at gmail.com> wrote:
>> I do like the look of this, but keep in mind that there is a downside to
>> exposing the implementation of NumPy functions -- now the implementation
>> details become part of NumPy's API. I suspect we do not want to commit
>> ourselves to never changing the implementation of NumPy functions, so at
>> the least this will need careful disclaimers about non-guarantees of
>> backwards compatibility.
> I honestly still am missing the point of claiming this. There is no change
> either way to what we've done for the last decade. If we change anything in
> the numpy implementation of any function, we use deprecation warnings etc.
> What am I missing here?
Hypothetically, suppose we rewrite np.stack() in terms of np.block()
instead of np.concatenate(), because it turns out to be faster.
As long as we're coercing with np.asarray(), users won't notice any
material difference -- their code just gets a little faster.
But this could be problematic if we support duck typing. For example,
suppose dask arrays rely on NumPy's definition of np.stack in terms of
np.concatenate, but never bothered to implement np.block. Now
upgrading NumPy breaks dask.
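The failure mode above can be sketched with a minimal duck array. This is an illustrative toy, not dask's actual code: `DuckArray`, `stack_via_concatenate`, and `stack_via_block` are hypothetical names, the two stack functions are not NumPy's real implementation, and the example assumes `__array_function__` dispatch is enabled (on by default in NumPy 1.17+). The duck array handles np.concatenate but not np.block, so the choice of internal implementation is observable:

```python
import numpy as np

# Hypothetical duck array: implements the __array_function__ protocol
# for np.concatenate only, like a library that never needed np.block.
class DuckArray:
    def __init__(self, data):
        self.data = np.asarray(data)

    def __getitem__(self, key):
        return DuckArray(self.data[key])

    def __array_function__(self, func, types, args, kwargs):
        if func is np.concatenate:
            seq = [x.data if isinstance(x, DuckArray) else np.asarray(x)
                   for x in args[0]]
            return DuckArray(np.concatenate(seq, *args[1:], **kwargs))
        return NotImplemented  # np.block (and everything else) unsupported

# Two observationally equivalent ways to stack 1-D arrays along axis 0
# (illustrative only -- not NumPy's actual source):
def stack_via_concatenate(arrays):
    return np.concatenate([a[np.newaxis] for a in arrays], axis=0)

def stack_via_block(arrays):
    return np.block([[a] for a in arrays])

a, b = np.array([1, 2, 3]), np.array([4, 5, 6])
# For plain ndarrays, the two implementations are indistinguishable:
assert np.array_equal(stack_via_concatenate([a, b]), stack_via_block([a, b]))

ducks = [DuckArray(a), DuckArray(b)]
stacked = stack_via_concatenate(ducks)  # works: dispatches to np.concatenate
try:
    stack_via_block(ducks)              # breaks: np.block is unhandled
except TypeError as e:
    print("np.block version failed:", e)
```

Swapping the internal implementation is invisible to ndarray users but raises TypeError for the duck array, which is exactly why exposing (or depending on) the non-dispatched implementation turns such details into API.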
This is basically the same reason why subclass support has been hard to
maintain in NumPy. Apparently safe internal changes to NumPy functions can
break other array types in surprising ways, even if they do not
intentionally deviate from NumPy's semantics.