[SciPy-User] scipy.optimize.root with method = 'lm' problem

William Heymann immudzen at gmail.com
Tue Dec 27 09:13:19 EST 2016


I am using scipy.optimize.root with method='lm' and jac=True, and the
descent algorithm looks like it is taking steps that are far too large.

I have 10 variables, and my objective function returns a vector of 4374
residuals along with the 4374x10 Jacobian. The initial Jacobian matches the
one I generate when I use lsqnonlin in MATLAB, and that works just fine.
With scipy, however, I get no change at all for the first two iterations and
then suddenly a huge jump to basically the upper end of the double range,
which seems pretty extreme.
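
For what it's worth, here is a rough sketch of the kind of instrumentation I
can add to watch the step sizes; residual, large, small, weight and data are
the same names as in the call further down, and the wrapper is only a sketch:

import numpy as np

def residual_logged(x, large, small, weight, data):
    # Wrap the existing residual function (which returns the residual
    # vector and the Jacobian as a tuple) and log the norms per call.
    r, J = residual(x, large, small, weight, data)
    print("||x|| = %.3e   ||r|| = %.3e"
          % (np.linalg.norm(x), np.linalg.norm(r)))
    return r, J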

I would really like to make this work with scipy. I have my function along
with the exact derivatives for the Jacobian computed with AD. I have also
looked at the singular values of the Jacobian, and they are all strictly
positive, so the system should at least be locally convex.
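
The singular value check is along these lines (again only a sketch, using
the same residual and start names as in the call below):

import numpy as np

# Evaluate the residual and the 4374x10 Jacobian at the starting point,
# then inspect the 10 singular values; they are all strictly positive.
r0, J0 = residual(start, large, small, weight, data)
s = np.linalg.svd(J0, compute_uv=False)
print("smallest singular value:", s.min())
print("condition number:", s.max() / s.min())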

This is the call I make to scipy, and it seems reasonable:

sol = scipy.optimize.root(residual, start, args=(large, small, weight, data),
                          method='lm', jac=True,
                          options={'ftol': 1e-6, 'xtol': 1e-6})
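
For comparison, here is a self-contained toy version of the same calling
convention (jac=True, so the callable returns the residual vector and the
exact Jacobian together); the exponential model is just a stand-in and is
not my actual problem:

import numpy as np
import scipy.optimize

# Toy stand-in: fit a 2-parameter exponential to noiseless data, with the
# callable returning (residuals, exact Jacobian) because jac=True.
t = np.linspace(0.0, 4.0, 50)
y_obs = 2.5 * np.exp(-1.3 * t)

def toy_residual(x, t, y_obs):
    a, b = x
    e = np.exp(-b * t)
    r = a * e - y_obs                     # residual vector, shape (50,)
    J = np.column_stack([e, -a * t * e])  # exact Jacobian, shape (50, 2)
    return r, J

sol = scipy.optimize.root(toy_residual, [1.0, 1.0], args=(t, y_obs),
                          method='lm', jac=True,
                          options={'ftol': 1e-6, 'xtol': 1e-6})
print(sol.x)  # should end up close to [2.5, 1.3] for this noiseless data

As far as I can tell from the documentation, the 'factor' option (initial
step bound, default 100) and 'diag' (variable scaling) are the 'lm' options
that most directly affect the step sizes; I have not set either of them.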

Thank you