[SciPy-User] optimization with ill-conditioned Hessian

Charles R Harris charlesr.harris at gmail.com
Fri Oct 18 23:47:46 EDT 2013


On Fri, Oct 18, 2013 at 8:16 PM, <josef.pktd at gmail.com> wrote:

> Does scipy have an optimizer besides fmin (Nelder-Mead) that is
> robust to a near-singular, high-condition-number Hessian?
>
> fmin_bfgs heads off into neverland: the values become huge until I
> get NaNs in my calculations.
>
> What would be nice is an optimizer that uses derivatives but
> regularizes, i.e. forces the Hessian (or an equivalent) to be
> positive definite.
>
>
> Background
> I'm trying to replicate a textbook example whose data and matrix
> inverses are "not nice". fmin (Nelder-Mead) gets pretty close to the
> Stata numbers. However, fmin_bfgs has been my preferred default
> optimizer for some time.
>
> Aside:
> It looks like a good test case for making my linear algebra more
> robust: np.linalg.pinv(x.T.dot(x)) doesn't seem to be robust enough
> here. I have no idea why a textbook would use an example like that,
> or whether Stata isn't just making up the numbers.
>
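
On the optimizer question: one standard derivative-based fallback is a
Levenberg-style damped Newton iteration, which adds a ridge to the
Hessian until it is positive definite and the step actually lowers the
objective. A minimal sketch, not a scipy API (the function name, the
damping schedule, and the tolerances are all illustrative):

import numpy as np

def damped_newton(f, grad, hess, x0, lam=1e-3, tol=1e-8, maxiter=200):
    # Newton iteration with a Levenberg-style ridge: solve
    # (H + lam*I) step = -g, raising lam until the shifted Hessian
    # is positive definite and the step actually decreases f.
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(maxiter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        for _ in range(50):                  # inner damping loop
            Hd = H + lam * np.eye(n)
            try:
                np.linalg.cholesky(Hd)       # doubles as a PD test
            except np.linalg.LinAlgError:
                lam *= 10.0
                continue
            step = np.linalg.solve(Hd, -g)
            if f(x + step) < f(x):
                x = x + step
                lam = max(lam / 10.0, 1e-12) # relax on success
                break
            lam *= 10.0                      # step rejected, damp harder
    return x

The Cholesky attempt is a cheap positive-definiteness test, and when H
is nearly singular the lam*I term keeps the solve well conditioned.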

The pinv computation would be better done by taking the SVD of x and
rolling your own pinv. You might also try scaling the columns of x.

Unscaled, that would be

u, d, vt = svd(x)
pinv(x.T.dot(x)) = vt.T * diag(1/d**2) * vt

(numpy's svd returns the third factor already transposed, hence vt),
where the elements of 1/d**2 are set to zero whenever the corresponding
singular value in d falls below a cutoff. Scaling the columns of x adds
diagonal factors at both ends of that product.
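
In NumPy terms, a minimal sketch of that recipe (the function names and
the rcond cutoff are illustrative, and the scaled variant assumes x has
no zero columns):

import numpy as np

def pinv_xtx(x, rcond=1e-10):
    # SVD: x = u * diag(d) * vt, so x.T.dot(x) = vt.T * diag(d**2) * vt
    u, d, vt = np.linalg.svd(x, full_matrices=False)
    # zero out 1/d**2 wherever d falls below a relative cutoff
    d2inv = np.zeros_like(d)
    big = d > rcond * d.max()
    d2inv[big] = 1.0 / d[big]**2
    # pinv(x.T.dot(x)) = vt.T * diag(1/d**2) * vt
    return (vt.T * d2inv).dot(vt)

def pinv_xtx_scaled(x, rcond=1e-10):
    # scale the columns of x to unit norm first; the scaling reappears
    # as diagonal factors at both ends of the unscaled result
    s = np.sqrt((x * x).sum(axis=0))     # column norms, assumed nonzero
    return pinv_xtx(x / s, rcond) / np.outer(s, s)

For a well-conditioned x both agree with np.linalg.pinv(x.T.dot(x)); in
the ill-conditioned case the relative cutoff keeps the small singular
values from blowing up the inverse.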

Chuck