Does scipy provide any functions to calculate the Jacobian of a function? I noticed that minpack (used by scipy.optimize) approximates a Jacobian numerically, but scipy doesn't seem to provide a wrapper for these minpack functions. Any ideas? Thanks, -Tony
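For the scalar-valued case, scipy.optimize.approx_fprime already provides a forward-difference gradient; below is a minimal usage sketch with a purely illustrative test function. The vector-valued (Jacobian) case is what the replies below cover.

{{{
import numpy as np
from scipy.optimize import approx_fprime

# Illustrative scalar-valued function of two variables
def g(x):
    return x[0]**2 + np.sin(x[1])

x0 = np.array([1.0, 0.5])
eps = np.sqrt(np.finfo(float).eps)

# Forward-difference gradient of g at x0
grad = approx_fprime(x0, g, eps)
print(grad)   # roughly [2.0, cos(0.5)] ~= [2.0, 0.8776]
}}}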
Fri, 14 Nov 2008 13:33:45 -0500, Tony S Yu wrote:
Does scipy provide any functions to calculate the Jacobian of a function? I noticed that minpack (used by scipy.optimize) approximates a Jacobian numerically, but scipy doesn't seem to provide a wrapper for these minpack functions. Any ideas?
If you're happy with naive differentiation (as you may well be -- optimization and root finding typically aren't too picky), something like this may work:

{{{
import numpy as np

def jacobian(f, u, eps=1e-6):
    """Evaluate partial derivatives of f(u) numerically.

    :note: This routine is currently naive and could be improved.

    :returns: (*f.shape, *u.shape) array ``df``, where
              df[i, j] ~= (d f_i / d u_j)(u)
    """
    f0 = np.asarray(f(u))  # asarray: because of matrices
    u_shape = u.shape
    nu = int(np.prod(u_shape))
    f_shape = f0.shape
    nf = int(np.prod(f_shape))
    df = np.empty([nf, nu])
    for k in range(nu):
        du = np.zeros(nu)
        du[k] = max(eps*abs(u.flat[k]), eps)
        f1 = np.asarray(f(u + np.reshape(du, u_shape)))
        df[:, k] = np.reshape((f1 - f0) / du[k], [nf])
    df.shape = f_shape + u_shape
    return df
}}}

Requires the function to be vectorized.

--
Pauli Virtanen
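A quick sanity check of the routine above, as a sketch: the test function, point, and tolerance below are purely illustrative, and the jacobian function defined above is assumed to be in scope.

{{{
import numpy as np

# Illustrative vector-valued test function, R^2 -> R^2
def f(u):
    return np.array([u[0]**2 * u[1], 5*u[0] + np.sin(u[1])])

# Analytic Jacobian for comparison
def jac_exact(u):
    return np.array([[2*u[0]*u[1], u[0]**2],
                     [5.0,         np.cos(u[1])]])

u = np.array([1.5, 0.3])
J_num = jacobian(f, u)    # forward-difference approximation
J_ana = jac_exact(u)      # exact partial derivatives

print(np.allclose(J_num, J_ana, atol=1e-4))   # should print True
}}}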
When I added scipy.optimize.fmin_slsqp for Scipy 0.7.0 I included a function approx_jacobian that just does a basic forward finite difference like approx_fprime.

http://www.scipy.org/scipy/scipy/attachment/ticket/570/slsqp.py

If you don't yet have 0.7.0, here's the code:

from numpy import sqrt, finfo, asfarray, zeros

_epsilon = sqrt(finfo(float).eps)

def approx_jacobian(x, func, epsilon, *args):
    """Approximate the Jacobian matrix of a callable function func

    * Parameters
        x       - The state vector at which the Jacobian matrix is desired
        func    - A vector-valued function of the form f(x, *args)
        epsilon - The perturbation used to determine the partial derivatives
        *args   - Additional arguments passed to func

    * Returns
        An array of dimensions (lenf, lenx), where lenf is the length of the
        outputs of func and lenx is the number of elements in x

    * Notes
        The approximation is done using forward differences
    """
    x0 = asfarray(x)
    f0 = func(*((x0,) + args))
    jac = zeros([len(x0), len(f0)])
    dx = zeros(len(x0))
    for i in range(len(x0)):
        dx[i] = epsilon
        jac[i] = (func(*((x0 + dx,) + args)) - f0) / epsilon
        dx[i] = 0.0
    return jac.transpose()
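A usage sketch of approx_jacobian, assuming the definitions above are in scope; the test function and comparison point are purely illustrative.

{{{
from numpy import array, sin, cos, allclose

# Illustrative vector-valued test function, R^2 -> R^2
def f(x):
    return array([x[0]**2 * x[1], 5*x[0] + sin(x[1])])

x = array([1.5, 0.3])
J_num = approx_jacobian(x, f, _epsilon)      # forward-difference approximation
J_ana = array([[2*x[0]*x[1], x[0]**2],
               [5.0,         cos(x[1])]])    # analytic Jacobian for comparison

print(allclose(J_num, J_ana, atol=1e-5))     # should print True
}}}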
--
- Rob Falck
participants (3):
- Pauli Virtanen
- Rob Falck
- Tony S Yu