[Numpy-discussion] Changed behavior of np.gradient
njs at pobox.com
Fri Oct 17 10:11:58 EDT 2014
On 17 Oct 2014 02:38, "Benjamin Root" <ben.root at ou.edu> wrote:
> That isn't what I meant. Higher order doesn't "necessarily" mean more
accurate. The results simply have different properties. The user needs to
choose the differentiation order that they need. One interesting effect in
data assimilation/modeling is that even-order differentiation can often
have detrimental effects while higher odd-order differentiation is better,
but it is highly dependent upon the model.
To be clear, we aren't talking about different degrees of differentiation,
we're talking about different approximations to the first derivative. I
just looked up the original pull request, and it contains a pretty
convincing graph in which the old code has large systematic errors and the
new code doesn't.
I think the claim is that the old code had approximation error that grows
like O(1/n), while the new code's error grows like O(1/n**2). (Don't ask
me what n is, though.)
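To make the order-of-accuracy distinction concrete, here is a small sketch (not from the thread) comparing the two kinds of boundary formula under discussion: the first-order forward difference the old code used at the edges, and a second-order one-sided stencil like the one the new code uses. The function, point, and step sizes are arbitrary choices for illustration.

```python
import numpy as np

def edge_errors(h):
    """Error of two one-sided approximations to f'(x) at a boundary point.

    First-order forward difference:   (f(x+h) - f(x)) / h              -> O(h) error
    Second-order one-sided stencil:   (-3f(x) + 4f(x+h) - f(x+2h)) / (2h) -> O(h**2) error
    """
    f, fprime, x = np.sin, np.cos, 0.5  # arbitrary smooth test function and point
    first = (f(x + h) - f(x)) / h
    second = (-3*f(x) + 4*f(x + h) - f(x + 2*h)) / (2*h)
    return abs(first - fprime(x)), abs(second - fprime(x))

# Shrinking h by 10x should shrink the first-order error ~10x
# and the second-order error ~100x.
e1a, e2a = edge_errors(1e-2)
e1b, e2b = edge_errors(1e-3)
```

Running this shows the first-order error scaling linearly in h while the second-order error scales quadratically, which is the systematic edge error the pull request's graph was about.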
> This change in gradient broke a unit test in matplotlib (for a new
feature, so it isn't *that* critical). We didn't notice it at first because
we weren't testing numpy 1.9 at the time. I want the feature (I have need
for it elsewhere), but I don't want the change in default behavior.
You say it's bad, the original poster says it's good, how are we poor
maintainers to know what to do? :-) Can you say any more about why you
prefer the so-called lower-accuracy approximation here by default?
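As an aside not in the original mail: NumPy releases after this thread resolved the dispute by exposing the choice through an `edge_order` keyword on `np.gradient`, with the old first-order edge behavior as the default. A quick sketch of the difference, using a quadratic (whose derivative the second-order edge stencil reproduces exactly):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 11)
y = x**2                      # exact derivative is 2*x
dx = x[1] - x[0]

g1 = np.gradient(y, dx, edge_order=1)  # first-order one-sided differences at the edges
g2 = np.gradient(y, dx, edge_order=2)  # second-order one-sided differences at the edges
```

With `edge_order=2` the result matches `2*x` to floating-point precision everywhere; with `edge_order=1` the endpoints carry an O(dx) error while the interior (central differences) is still exact.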