
On Tue, May 4, 2010 at 8:23 PM, Guilherme P. de Freitas <guilherme@gpfreitas.com> wrote:
On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter <sebastian.walter@gmail.com> wrote:
Playing devil's advocate, I'd say: use Algorithmic Differentiation instead of finite differences ;) That would probably speed things up quite a lot.
I would suggest that too, but aside from FuncDesigner[0] (reference at the end), I couldn't find any Automatic Differentiation tool that was easy to install for Python.
To stay with simple solutions, I think the "complex step" approximation gives you a very good compromise between ease of use, performance and accuracy. Here is an implementation (you can also take ideas from it to get rid of the "for" loops in your original code):
import numpy as np
def complex_step_grad(f, x, h=1.0e-20):
    dim = np.size(x)
    increments = np.identity(dim) * 1j * h
    partials = [f(x + ih).imag / h for ih in increments]
    return np.array(partials)
**Warning**: you must convert your original real-valued function f: R^n -> R to the corresponding complex function f: C^n -> C. Use functions from the 'cmath' module.
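As a quick sanity check, here is a small usage sketch (illustrative, not from the original mail) that applies complex_step_grad as defined above to a test function built from numpy operations, which already accept complex input, and compares the result to the analytical gradient. The test function, test point and expected tolerance are my own choices.

import numpy as np

# Test function: f(x) = sum(exp(x_i) * sin(x_i)); numpy's exp and sin
# handle complex arguments, so no rewrite is needed for the complex step.
def f(x):
    return np.sum(np.exp(x) * np.sin(x))

def analytical_grad(x):
    # d/dx_i [exp(x_i) * sin(x_i)] = exp(x_i) * (sin(x_i) + cos(x_i))
    return np.exp(x) * (np.sin(x) + np.cos(x))

x = np.array([0.3, -1.2, 2.5])
err = np.max(np.abs(complex_step_grad(f, x) - analytical_grad(x)))
print(err)  # should be near machine precision, roughly 1e-15 or smaller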
Interesting idea. I tried it with some silly function that has special.gammaln and np.dot in it, and it seems to work without any adjustments to the code. The precision is much better than simple forward differentiation, which only gets me about 1e-5 accuracy.

Simple timing: about 40% slower than simple forward differentiation, and 50%-80% faster than forward and backward differentiation.

In [2] I didn't see anything about higher derivatives, so to get the Hessian I still had to do a finite difference (Jacobian) on complex_step_grad. Even then the results look pretty good.

Another recommendation, especially for checking whether the results are correct: http://pypi.python.org/pypi/Numdifftools/ is a pure-Python, (optionally adaptive) finite difference method for the Gradient, Jacobian and Hessian.

And related: I was surprised that sympy knows the analytical derivative of the gamma function, and that the function is available in scipy.special, so that's one less reason not to use analytical derivatives.

I will include complex_step_grad in the toolbox for Maximum Likelihood Estimation.

Thanks for the information,

Josef
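For reference, here is one way the Hessian step described above could look: a forward-difference Jacobian of complex_step_grad, symmetrized at the end. This is my own sketch, not Josef's actual code; the step size h_fd is an arbitrary illustrative choice, and the commented lines show the Numdifftools cross-check mentioned above (assuming the package is installed).

import numpy as np

def hessian_via_complex_step(f, x, h_fd=1.0e-6):
    # Column i is the forward difference of the complex-step gradient
    # along coordinate i: H[:, i] ~ (grad(x + h_fd*e_i) - grad(x)) / h_fd
    dim = np.size(x)
    g0 = complex_step_grad(f, x)
    H = np.empty((dim, dim))
    for i in range(dim):
        e = np.zeros(dim)
        e[i] = h_fd
        H[:, i] = (complex_step_grad(f, x + e) - g0) / h_fd
    # Symmetrize to reduce finite-difference noise.
    return 0.5 * (H + H.T)

# Cross-check against Numdifftools (nd.Hessian(f) returns a callable
# that evaluates the Hessian at a point):
# import numdifftools as nd
# H_nd = nd.Hessian(f)(x)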
I strongly suggest that you take a look at the AutoDiff website[1] and at some references about the complex step [2][3] (or just Google "complex step" and "differentiation", both on normal Google and Google Scholar).
[0] http://openopt.org/FuncDesigner
[1] http://www.autodiff.org/
[2] http://doi.acm.org/10.1145/838250.838251
[3] http://mdolab.utias.utoronto.ca/resources/complex-step
--
Guilherme P. de Freitas
http://www.gpfreitas.com