On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter sebastian.walter@gmail.com wrote:

> playing devil's advocate I'd say use Algorithmic Differentiation instead of finite differences ;) that would probably speed things up quite a lot.

I would suggest that too, but aside from FuncDesigner[0] (reference at the end), I couldn't find any Automatic Differentiation tool for Python that was easy to install.

To stay with simple solutions, I think the "complex step" approximation gives you a very good compromise between ease of use, performance, and accuracy. The trick: for an analytic function, f(x + i*h) = f(x) + i*h*f'(x) - h^2*f''(x)/2 + ..., so Im[f(x + i*h)]/h approximates f'(x) with O(h^2) error and, unlike finite differences, no subtractive cancellation; that is why h can be taken as small as 1e-20. Here is an implementation (and you can take ideas from it to get rid of the "for" loops in your original code):

```python
import numpy as np

def complex_step_grad(f, x, h=1.0e-20):
    """Approximate the gradient of f at x with the complex-step method."""
    dim = np.size(x)
    # One perturbation vector per coordinate: i*h along each axis.
    increments = np.identity(dim) * 1j * h
    # Im[f(x + i*h*e_k)] / h approximates the k-th partial derivative.
    partials = [f(x + ih).imag / h for ih in increments]
    return np.array(partials)
```
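
If your function can evaluate several points at once, even that list comprehension disappears. This is only a sketch under an assumption the code above does not make: here f must accept a 2-D array of points (one point per row) and return one value per row.

```python
def complex_step_grad_batched(f, x, h=1.0e-20):
    # Assumes f maps an (dim, dim) complex array to dim values, one per row.
    dim = np.size(x)
    points = x + 1j * h * np.identity(dim)  # row k perturbs coordinate k
    return f(points).imag / h
```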

**Warning**: you must convert your original real-valued function f: R^n -> R into the corresponding complex function f: C^n -> C. In practice, use functions that accept complex arguments: the 'cmath' module instead of 'math' for scalar code, or numpy's ufuncs (np.exp, np.sin, ...), which already handle complex input. Also avoid non-analytic operations such as abs(), which destroy the imaginary part and break the approximation.
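
For instance (a made-up test function, not from the original post), the conversion and a check against the analytic gradient might look like this:

```python
import cmath
import math

import numpy as np

# Hypothetical example: f(x0, x1) = exp(x0) * sin(x1),
# written with cmath so that complex inputs propagate.
def f(x):
    return cmath.exp(x[0]) * cmath.sin(x[1])

x = np.array([1.0, 0.5])
grad = complex_step_grad(f, x)

# Analytic gradient for comparison:
# df/dx0 = exp(x0)*sin(x1), df/dx1 = exp(x0)*cos(x1)
exact = np.array([math.exp(1.0) * math.sin(0.5),
                  math.exp(1.0) * math.cos(0.5)])
print(np.allclose(grad, exact))  # True, to machine precision
```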

I strongly suggest that you take a look at the AutoDiff website[1] and at some references about the complex step [2][3] (or just search for "complex step" and "differentiation", both on regular Google and on Google Scholar).

[0] http://openopt.org/FuncDesigner
[1] http://www.autodiff.org/
[2] http://doi.acm.org/10.1145/838250.838251
[3] http://mdolab.utias.utoronto.ca/resources/complex-step