numerical gradient, Jacobian, and Hessian

A while back there were good signs that SciPy would end up with a `diff` module: https://github.com/scipy/scipy/issues/2035 Is this still moving forward? It would certainly be nice for SciPy to have intuitive numerical gradients, Jacobians, and Hessians. The last two are, I think, missing altogether. The first exists as scipy.optimize.approx_fprime. `approx_fprime` seems to work fine, but I suggest it has the following drawbacks:

- it is hard to find (e.g., try doing a Google search on "scipy gradient" or "scipy numerical gradient")
- relatedly, it is in the wrong location (scipy.optimize)
- the signature is odd: (x, f, dx) instead of (f, x, dx); this matters for ease of recall and for teaching (see the sketch below)

In any case, as I understand it, the authors of numdifftools http://code.google.com/p/numdifftools/ expressed willingness to have their code moved into SciPy. This seems like an excellent way forward. There was talk of making this a Summer of Code project, but that seems to have sputtered.

Alan Isaac
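For concreteness, a minimal sketch of the current approx_fprime call, illustrating the point-first argument order noted above; the test function is made up for illustration:

import numpy as np
from scipy.optimize import approx_fprime

def f(x):
    # simple scalar test function: f(x) = x0**2 + 3*x1**2
    return x[0]**2 + 3.0 * x[1]**2

x0 = np.array([1.0, 2.0])
eps = np.sqrt(np.finfo(float).eps)  # step size for the forward differences

# note the signature: the evaluation point comes first, then the function
grad = approx_fprime(x0, f, eps)
print(grad)  # approximately [2., 12.]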

I was going to suggest numdifftools; it's a very capable package in my experience. Indeed, it would be nice to have it integrated into SciPy. Also, in case trying to calculate a numerical gradient is a case of "the math getting too bothersome" rather than no closed-form gradient actually existing, Theano may be your best bet; I have very good experiences with it as well. As far as I can tell, it is actually the only tensor/ndarray-aware differentiator out there (Maple and Mathematica don't appear to support this).
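A minimal sketch of the numdifftools interface being recommended here, assuming its Gradient, Jacobian, and Hessian entry points; the test functions are illustrative:

import numpy as np
import numdifftools as nd

def f(x):
    # scalar field: f(x) = sin(x0) * x1**2
    return np.sin(x[0]) * x[1]**2

x0 = np.array([np.pi / 4, 2.0])

grad = nd.Gradient(f)  # callable that returns the gradient vector
hess = nd.Hessian(f)   # callable that returns the Hessian matrix
print(grad(x0))
print(hess(x0))

def g(x):
    # vector-valued function, for a Jacobian
    return np.array([x[0] * x[1], x[0] + x[1]])

jac = nd.Jacobian(g)
print(jac(x0))  # 2x2 Jacobian matrix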
On Sun, Apr 20, 2014 at 4:55 PM, Alan G Isaac <alan.isaac@gmail.com> wrote:
A while back there were good signs that SciPy would end up with a `diff` module: https://github.com/scipy/scipy/issues/2035 Is this still moving forward?

On Mon, Apr 21, 2014 at 3:13 AM, Eelco Hoogendoorn <hoogendoorn.eelco@gmail.com> wrote:
As far as I can tell, [Theano] is actually the only tensor/ndarray-aware differentiator out there
And AlgoPy, a tensor/ndarray-aware, arbitrary-order automatic differentiator (https://pythonhosted.org/algopy/)
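A minimal AlgoPy sketch, assuming its documented forward-mode UTPM.init_jacobian / UTPM.extract_jacobian interface; the test function is illustrative:

from algopy import UTPM

def f(x):
    # scalar test function: f(x) = x0*x1*x2 + 7*x0
    return x[0] * x[1] * x[2] + 7 * x[0]

# propagate truncated Taylor polynomials through f (forward-mode AD)
x = UTPM.init_jacobian([3.0, 5.0, 7.0])
y = f(x)
grad = UTPM.extract_jacobian(y)
print(grad)  # [42., 21., 15.]

Being automatic rather than numerical differentiation, the result is exact up to floating-point rounding, with no step size to tune.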

alex wrote:
And AlgoPy, a tensor/ndarray-aware, arbitrary-order automatic differentiator (https://pythonhosted.org/algopy/)
I noticed Julia seems to have a package for this as well.
participants (4)
- Alan G Isaac
- alex
- Eelco Hoogendoorn
- Neal Becker