[Matrix-SIG] Some optimization routines.

Andrey Kaliazin KaliazinA@cardiff.ac.uk
Wed, 14 Apr 1999 18:59:49 +0100


Hi  Travis,
you are doing a great job!

I've spent some time on these many-variable problems, with anywhere
from 1 to ~200 variables (I'd like to go higher, but my PC becomes too slow :)

When the number of variables exceeds 10-20 (depending on the problem),
there is definitely NO single best subroutine, and powerful second-order
methods like BFGS, Newton-Raphson, etc. become less and less useful.
They are only good near a local extremum, but you have to find that first.
Modern techniques for many-variable optimization are:
1. Simulated Annealing (SA) -
   ASA <http://www.ingber.com/#ASA-INFO>
   or SIMANN - both are in the OPT pack from NETLIB
   (a rough Python sketch of the basic idea follows below the list),
2. Genetic Algorithms (GA) - look, for example, at
   <http://www-xdiv.lanl.gov/XCM/research/genalg/ga.html>
3. Of course, general and specific N-dimensional Monte-Carlo methods.
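
To give a feel for how little machinery the basic SA idea needs, here is a
rough sketch in plain Python (the one-coordinate-at-a-time move, the step
size and the geometric cooling schedule are all arbitrary illustrative
choices - nothing like the adaptive machinery in ASA):

import math, random

def anneal(f, x0, step=0.1, t0=1.0, cooling=0.995, iters=10000):
    """Minimize f over a list of floats x0 by simulated annealing."""
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        # perturb one randomly chosen coordinate
        y = list(x)
        i = random.randrange(len(y))
        y[i] += random.uniform(-step, step)
        fy = f(y)
        # always accept downhill moves; accept uphill ones with
        # Boltzmann probability exp(-(fy - fx) / t)
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling   # geometric cooling
    return best, fbest

# example: a 5-variable quadratic bowl
print(anneal(lambda v: sum(xi * xi for xi in v), [1.0] * 5))

The real strength of SA/GA shows up on multi-modal functions, where the
occasional accepted uphill move lets the search escape local minima that
would trap the second-order methods.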

Andy


----- Original Message -----
From: Ryszard Czerminski <ryszard@moldyn.com>
To: Travis Oliphant <Oliphant.Travis@mayo.edu>
Cc: <matrix-sig@python.org>


>
> As you pointed out, the performance and applicability of
> various optimization routines are very much problem-dependent,
> so you need to look for the "best" algorithms
> (and implementations!) in different categories.
>
> I comment here only on unconstrained, local minimization
> problems:
>
> general non-linear functions:
>
> (1) derivatives not available - SIMPLEX and conjugate
>     direction methods (but I do not know about their relative merit)
> (2) first derivatives available - conjugate gradient
>     (Polak-Ribiere variant)
> (3) first and second derivatives available - Newton-Raphson
>
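For case (2) above, the basic Polak-Ribiere iteration is short enough to
sketch in plain Python (illustrative only, with a crude backtracking line
search; in practice you would wrap one of the Fortran codes from the NETLIB
/opt collection instead):

def cg_polak_ribiere(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f given its gradient, with Polak-Ribiere conjugate gradients."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]              # start along steepest descent
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < tol * tol:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0.0:               # not a descent direction: restart
            d = [-gi for gi in g]
            slope = -sum(gi * gi for gi in g)
        # crude backtracking (Armijo) line search along d
        fx, alpha = f(x), 1.0
        while alpha > 1e-12 and (
                f([xi + alpha * di for xi, di in zip(x, d)])
                > fx + 1e-4 * alpha * slope):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Polak-Ribiere beta, clipped at zero (i.e. restart on negative beta)
        beta = max(0.0, sum(gn * (gn - go) for gn, go in zip(g_new, g))
                        / sum(go * go for go in g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# example: 10-variable quadratic with an analytic gradient
print(cg_polak_ribiere(lambda v: sum(xi * xi for xi in v),
                       lambda v: [2.0 * xi for xi in v],
                       [1.0] * 10))
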
> For least-squares problems, methods that explicitly exploit the
> structure of the objective function are more effective,
> e.g. the Levenberg-Marquardt method.
>
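And since Levenberg-Marquardt came up: its core also fits on a page of plain
Python, sketched below with a small Gaussian-elimination helper for the
damped normal equations (the 0.3x / 10x damping updates are just one common
heuristic; here r(x) returns the residual vector and jac(x) its Jacobian):

import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [list(A[i]) + [b[i]] for i in range(n)]     # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                  # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def levenberg_marquardt(r, jac, x0, lam=1e-3, max_iter=100):
    """Minimize sum(r(x)**2) given residuals r(x) and their Jacobian jac(x)."""
    x = list(x0)
    for _ in range(max_iter):
        res, J = r(x), jac(x)
        n, m = len(x), len(res)
        # g = J^T r  and  A = J^T J + lam*I  (the damped normal equations)
        g = [sum(J[i][k] * res[i] for i in range(m)) for k in range(n)]
        A = [[sum(J[i][k] * J[i][l] for i in range(m)) + (lam if k == l else 0.0)
              for l in range(n)] for k in range(n)]
        step = solve(A, [-gk for gk in g])
        x_new = [xi + si for xi, si in zip(x, step)]
        if sum(ri * ri for ri in r(x_new)) < sum(ri * ri for ri in res):
            x, lam = x_new, lam * 0.3    # good step: accept it, relax damping
        else:
            lam = lam * 10.0             # bad step: reject it, increase damping
    return x

# example: fit y = a * exp(b * t) to a few data points
ts, ys = [0.0, 1.0, 2.0, 3.0], [2.0, 2.7, 3.6, 4.9]
resid = lambda p: [p[0] * math.exp(p[1] * t) - y for t, y in zip(ts, ys)]
jacob = lambda p: [[math.exp(p[1] * t), p[0] * t * math.exp(p[1] * t)] for t in ts]
print(levenberg_marquardt(resid, jacob, [1.0, 0.1]))
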
> The above list is probably not complete, even just
> for unconstrained, local minimization problems.
>
> There is a very nice overview in
> http://www-fp.mcs.anl.gov/otc/Guide/OptWeb.
>
> Ryszard Czerminski         phone : (617)354-3124 x 10
> Moldyn, Inc.               fax   : (617)491-4522
> 955 Massachusetts Avenue   e-mail: ryszard@moldyn.com
> Cambridge MA, 02139-3180   http://www.moldyn.com
>
> On Wed, 14 Apr 1999, Travis Oliphant wrote:
>
> >
> > A while back there was talk about optimization routines.  I just
> > downloaded a lot of them from the subdirectory /opt on netlib
> > (constrained, unconstrained, nonlinear ...).  I would like to include an
> > appropriate set of these (eventually) in the multipack module that I'm
> > constructing.
> >
> > I'd like to start with one that minimizes a function of N variables and
> > am interested to hear what people on this list have to say about the
> > appropriateness of one algorithm versus another.  I know that's a hard
> > question given that optimization is so problem-dependent.  That's why I
> > eventually want to add a number of them.  So, what are the "good" ones?
> >
> > Regards,
> >
> > Travis
> >