
I forgot to mention one thing: if you are doing optimization, a good solution is a modeling package like AMPL (or GAMS or AIMMS, but I only know AMPL, so I will restrict my attention to it). AMPL has a natural modeling language and provides automatic differentiation. It's not free, but there are trial licenses (60 days) and a student edition (unlimited time, maximum of 300 variables). It is hooked to many great solvers, including KNITRO (commercial) and IPOPT (free), both for nonlinear programs.

http://www.ampl.com

And as this is a Python list, there is NLPy. I haven't tried it, but it seems to let you read model files written in AMPL, use AMPL's automatic differentiation capabilities, and still roll your own optimization algorithm, all in Python. It looks like it requires a few packages beyond Python, NumPy, and AMPL, such as a sparse linear solver. Worth taking a look.

http://nlpy.sourceforge.net/how.html

Best,

Guilherme

On Tue, May 4, 2010 at 5:23 PM, Guilherme P. de Freitas
<guilherme@gpfreitas.com> wrote:
On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter <sebastian.walter@gmail.com> wrote:
Playing devil's advocate, I'd say: use Algorithmic Differentiation instead of finite differences ;) That would probably speed things up quite a lot.
I would suggest that too, but aside from FuncDesigner [0] (references at the end), I couldn't find any Automatic Differentiation tool that was easy to install for Python.
To stay with simple solutions, I think the "complex step" approximation gives you a very good compromise between ease of use, performance, and accuracy. Here is an implementation (you can also take ideas from it to get rid of the "for" loops in your original code):
import numpy as np

def complex_step_grad(f, x, h=1.0e-20):
    """Approximate the gradient of f at x via the complex step method."""
    dim = np.size(x)
    # One purely imaginary perturbation of size h along each coordinate.
    increments = np.identity(dim) * 1j * h
    # Im[f(x + i*h*e_k)] / h approximates the k-th partial derivative.
    partials = [f(x + ih).imag / h for ih in increments]
    return np.array(partials)
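As a quick sanity check, here is a small usage example. The test function and evaluation point are mine, purely for illustration; note that NumPy ufuncs like np.exp and np.sin already accept complex input, so this f needs no conversion:

import numpy as np

def f(z):
    # Test function f(x, y) = exp(x) * sin(y); np.exp/np.sin handle complex z.
    return np.exp(z[0]) * np.sin(z[1])

x = np.array([1.0, 0.5])
print(complex_step_grad(f, x))
# Analytic gradient for comparison:
print(np.array([np.exp(1.0) * np.sin(0.5), np.exp(1.0) * np.cos(0.5)]))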
**Warning**: you must convert your original real-valued function f: R^n -> R to the corresponding complex function f: C^n -> C. Use functions from the 'cmath' module.
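To illustrate the warning with a minimal (made-up) scalar example: functions from the 'math' module reject complex arguments, while their 'cmath' counterparts accept them, which is what the complex step needs:

import math, cmath

def f_real(x):
    return math.exp(x)     # raises TypeError on complex input

def f_complex(z):
    return cmath.exp(z)    # works for complex z, so the complex step applies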
I strongly suggest that you take a look at the AutoDiff website [1] and at some references on the complex step [2][3] (or just Google "complex step" and "differentiation", both on regular Google and Google Scholar).
[0] http://openopt.org/FuncDesigner
[1] http://www.autodiff.org/
[2] http://doi.acm.org/10.1145/838250.838251
[3] http://mdolab.utias.utoronto.ca/resources/complex-step
-- Guilherme P. de Freitas http://www.gpfreitas.com
-- Guilherme P. de Freitas http://www.gpfreitas.com