[Matrix-SIG] Some optimization routines.
Wed, 14 Apr 1999 10:00:51 -0400 (EDT)
As you pointed out, the performance and applicability of
various optimization routines are very much problem-dependent,
so you need to look for the "best" algorithms
(and implementations!) in each category separately.
I will comment here only on unconstrained, local minimization
of general non-linear functions:
(1) derivatives not available - SIMPLEX and conjugate
direction methods (though I do not know their relative merits)
(2) first derivatives available - conjugate gradient
(3) first and second derivatives available - Newton-Raphson
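For case (3), a minimal sketch of Newton-Raphson in modern NumPy
terms (the function names grad, hess and the test function are
illustrative, not from any particular library):

```python
import numpy as np

def newton_raphson(grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize a smooth function given its gradient and Hessian
    by iterating x <- x - H(x)^-1 g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H * step = g rather than forming the inverse explicitly.
        step = np.linalg.solve(hess(x), g)
        x = x - step
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10*(y + 2)^2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
hess = lambda x: np.diag([2.0, 20.0])
x_min = newton_raphson(grad, hess, [0.0, 0.0])
# For this quadratic, a single Newton step lands on the minimum [1, -2].
```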
For least-squares problems, methods which explicitly exploit the
structure of the objective function are more effective,
e.g. the Levenberg-Marquardt method.
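A sketch of that structure exploitation: Levenberg-Marquardt damps
the Gauss-Newton step, so only residuals and their Jacobian are
needed, never the full Hessian. This toy implementation (all names
are illustrative) uses the simple multiply/divide-lambda update rule:

```python
import numpy as np

def levenberg_marquardt(residual, jac, x0, lam=1e-3, max_iter=100, tol=1e-10):
    """Minimize 0.5*||r(x)||^2 via the damped Gauss-Newton step:
    (J^T J + lam*I) step = -J^T r."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * np.dot(residual(x), residual(x))
    for _ in range(max_iter):
        r, J = residual(x), jac(x)
        A = J.T @ J + lam * np.eye(len(x))
        step = np.linalg.solve(A, -J.T @ r)
        x_new = x + step
        cost_new = 0.5 * np.dot(residual(x_new), residual(x_new))
        if cost_new < cost:        # accept step, trust the model more
            x, cost, lam = x_new, cost_new, lam * 0.1
        else:                      # reject step, increase damping
            lam *= 10.0
        if np.linalg.norm(step) < tol:
            break
    return x

# Example: fit y = a*t + b to data, residuals r_i = a*t_i + b - y_i
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * t + 1.0
residual = lambda x: x[0] * t + x[1] - y
jac = lambda x: np.column_stack([t, np.ones_like(t)])
x_fit = levenberg_marquardt(residual, jac, [0.0, 0.0])
```

For this linear least-squares example the damped step is nearly the
exact Gauss-Newton step, so the fit converges to a = 2, b = 1 in a
few iterations.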
The above list is probably not complete, even for
unconstrained, local minimization problems alone.
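For the derivative-free case (1), a compact sketch of the SIMPLEX
(Nelder-Mead) method with the standard reflect/expand/contract/shrink
moves; coefficients and the test function are illustrative:

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Derivative-free simplex minimization (Nelder-Mead sketch)."""
    n = len(x0)
    # Initial simplex: x0 plus one point offset along each coordinate.
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        xr = centroid + (centroid - worst)             # reflection
        if f(xr) < f(best):
            xe = centroid + 2.0 * (centroid - worst)   # expansion
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        else:
            xc = centroid + 0.5 * (worst - centroid)   # contraction
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:                                      # shrink toward best
                simplex = [best] + [best + 0.5 * (v - best)
                                    for v in simplex[1:]]
    return min(simplex, key=f)

# Example: minimize the quadratic bowl f(x, y) = (x-3)^2 + (y+1)^2
f = lambda x: (x[0] - 3.0)**2 + (x[1] + 1.0)**2
x_min = nelder_mead(f, [0.0, 0.0])
```

Only function values are used, which is why this class of methods is
the fallback when derivatives are unavailable or too expensive.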
There is a very nice overview in
Ryszard Czerminski phone : (617)354-3124 x 10
Moldyn, Inc. fax : (617)491-4522
955 Massachusetts Avenue e-mail: email@example.com
Cambridge MA, 02139-3180 http://www.moldyn.com
On Wed, 14 Apr 1999, Travis Oliphant wrote:
> A while back there was talk about optimization routines. I just
> downloaded a lot of them from the subdirectory /opt on netlib
> (constrained, unconstrained, nonlinear ...).  I would like to include an
> appropriate set of these (eventually) in the multipack module that I'm
> working on.  I'd like to start with one that minimizes a function of N variables and
> am interested to hear what people on this list have to say about the
> appropriateness of one algorithm versus another. I know that's a hard
> question given that optimization is so problem-dependent. That's why I
> eventually want to add a number of them.  So, what are the "good" ones?
> Matrix-SIG maillist - Matrix-SIG@python.org