On Fri, Nov 11, 2016 at 6:57 PM, Scott Sievert <sievert.scott@gmail.com> wrote:
There’s also scipy.misc.derivative, which computes the n-th derivative of a function at a point.
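For example (a minimal sketch; by default it uses a central-difference formula):

    import numpy as np
    from scipy.misc import derivative

    # n-th derivative of a scalar function at a point, via central differences
    print(derivative(np.sin, 0.0, dx=1e-6))   # ~1.0 == cos(0)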

There’s not only the method of finite differences; there’s also automatic differentiation. In this, you keep track of what operations are performed on the input, then use the chain rule to find the derivative. Because it only records the operations actually performed, this method supports if-statements, while-loops, and recursion.
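As a toy illustration of the idea (my own dual-number sketch, not any particular library’s API):

    class Dual:
        # toy forward-mode AD value: carries (value, derivative) together
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __mul__(self, other):
            if not isinstance(other, Dual):
                other = Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)
        __rmul__ = __mul__
        def __gt__(self, other):
            return self.val > (other.val if isinstance(other, Dual) else other)

    def f(x):
        if x > 0:                 # branching works: we only ever
            return 3 * x * x      # differentiate the path actually taken
        return x * x

    print(f(Dual(2.0, 1.0)).der)  # 12.0 == d/dx(3*x**2) at x=2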

Here’s a package that implements this method as a thin wrapper around NumPy: https://github.com/HIPS/autograd (though it looks like numdifftools also supports this)
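For example (a minimal sketch with autograd; the function and the loop bound are made up):

    import autograd.numpy as np   # thin wrapper around numpy
    from autograd import grad

    def f(x):
        # a while-loop: autograd traces only the operations actually executed
        while x < 100.0:
            x = x * x
        return x

    print(grad(f)(3.0))   # 17496.0 == 8 * 3**7, since f squares x three times here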

Numerical Optimization, 2nd edition, by Nocedal and Wright also has more detail: chapter 8 covers methods to compute derivatives (and chapter 9 is on derivative-free optimization!).

Scott

On November 11, 2016 at 5:04:12 PM, per.brodtkorb@ffi.no (per.brodtkorb@ffi.no) wrote:

Maybe this addresses Robert's needs:

http://www.scholarpedia.org/article/Finite_difference_method#FD_formulas_in_higher-D

https://github.com/pbrod/numdifftools/blob/master/numdifftools/fornberg.py

Per A.

-----Original Message-----
From: SciPy-Dev [mailto:scipy-dev-bounces@scipy.org] On Behalf Of Jonathan Stickel
Sent: 10. november 2016 18:32
To: scipy-dev@scipy.org
Subject: Re: [SciPy-Dev] Differentiate function

On 11/10/16 01:19, Thomas Haslwanter wrote:
> The current discussion lacks a reference to the existing
> Savitzky-Golay filter
> https://scipy.github.io/devdocs/generated/scipy.signal.savgol_filter.html
> which - to my understanding - should solve most of Robert's problems.
>
> thomas
>


No, I don't think this addresses Robert's needs. That is simply a data smoother (and arguably inferior to other data-smoothing methods).
Although it does have an option to provide a derivative, it presumes the data are equally spaced.
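For example, the derivative option takes a single uniform spacing through the delta argument (a minimal sketch):

    import numpy as np
    from scipy.signal import savgol_filter

    x = np.linspace(0, 2 * np.pi, 100)        # must be equally spaced
    y = np.sin(x) + 0.01 * np.random.randn(100)

    # deriv=1 differentiates the local polynomial fit; delta is assumed
    # to be the (uniform) sample spacing
    dydx = savgol_filter(y, window_length=11, polyorder=3, deriv=1,
                         delta=x[1] - x[0])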


> On Thu, Nov 10, 2016 at 8:10 AM, Ralf Gommers <ralf.gommers@gmail.com
> <mailto:ralf.gommers@gmail.com>> wrote:
>
>
>
> On Wed, Nov 9, 2016 at 8:01 AM, Pauli Virtanen <pav@iki.fi
> <mailto:pav@iki.fi>> wrote:
>
> Mon, 07 Nov 2016 19:52:09 +0300, Evgeni Burovski wrote:
> > Note that `approx_derivative` implements several finite-difference
> > schemes,
>
> In addition, I'd remind of
>
> https://pypi.python.org/pypi/Numdifftools
>
>
> And
> https://github.com/scipy/scipy/wiki/Proposal:-add-finite-difference-numerical-derivatives-as-scipy.diff
>
> Ralf


These are tools for finite differences of a known function. Robert (and I) are interested in finite differences of y-vs.-x data vectors, whether obtained from experiment or produced as part of a higher-level numerical method.
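For concreteness, here is roughly what that looks like on an unequally spaced grid (my own sketch: three-point central differences in the interior, one-sided at the ends; not an existing scipy function):

    import numpy as np

    def deriv_nonuniform(y, x):
        # second-order finite differences on a possibly unequally spaced grid
        y, x = np.asarray(y, float), np.asarray(x, float)
        d = np.empty_like(y)
        h1 = x[1:-1] - x[:-2]          # left spacings
        h2 = x[2:] - x[1:-1]           # right spacings
        # three-point central formula, exact for quadratics
        d[1:-1] = (-h2 / (h1 * (h1 + h2)) * y[:-2]
                   + (h2 - h1) / (h1 * h2) * y[1:-1]
                   + h1 / (h2 * (h1 + h2)) * y[2:])
        d[0] = (y[1] - y[0]) / (x[1] - x[0])        # one-sided at the ends
        d[-1] = (y[-1] - y[-2]) / (x[-1] - x[-2])
        return d

    x = np.sort(np.random.rand(50)) * 2 * np.pi     # unequally spaced
    print(np.max(np.abs(deriv_nonuniform(np.sin(x), x) - np.cos(x))))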
_______________________________________________
SciPy-Dev mailing list
SciPy-Dev@scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev


To throw one more in here:

Local polynomial regression (local kernel or windowed fits) would be another way. AFAIR, it would be similar to Savitzky-Golay but for unequally spaced grids. IIRC, the slope coefficient of the local linear regression is the derivative of the smoothed function, and it has relatively good edge behavior because of the local linear structure.
I think higher-order derivatives would follow from the coefficients of a higher-order polynomial regression.
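A rough sketch of that idea (Gaussian kernel, local degree-1 fit; the function name and bandwidth are made up):

    import numpy as np

    def local_linear_deriv(x, y, x0, bandwidth):
        # Gaussian kernel weights centered at the evaluation point
        w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
        # np.polyfit squares the weighted residuals, so pass sqrt of the kernel
        slope, _ = np.polyfit(x - x0, y, 1, w=np.sqrt(w))
        return slope          # slope of the local fit = derivative estimate at x0

    x = np.sort(np.random.rand(200)) * 2 * np.pi    # unequally spaced
    y = np.sin(x) + 0.05 * np.random.randn(200)
    print(local_linear_deriv(x, y, np.pi, bandwidth=0.3))   # ~ -1 == cos(pi)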

That would be the kernel-regression analog of the smoothing-spline version.

Josef