[Matrix-SIG] Nonlinear optimization routines anyone?
Paul F. Dubois
Mon, 15 Mar 1999 12:02:57 -0800
The conjugate gradient algorithm is about twenty lines or less of
matrix/vector statements in Python, assuming you have a preconditioner you
can express that way. So just code it up in Python. It will be fast enough;
all the hard work is in the dot products and matrix multiplies.
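For concreteness, a minimal sketch of those "twenty lines" might look like
the following (written in modern NumPy syntax rather than 1999-era Numeric;
the names `pcg` and `M_inv` are illustrative, not from any package):

```python
import numpy as np

def pcg(A, b, M_inv=None, x0=None, tol=1e-8, maxiter=200):
    """Preconditioned conjugate gradient for symmetric positive-definite A.

    A sketch, not a packaged routine. M_inv, if given, is a callable that
    applies the preconditioner's inverse to a vector.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x                        # initial residual
    z = r if M_inv is None else M_inv(r)
    p = z.copy()                         # initial search direction
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)            # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:      # converged?
            break
        z = r if M_inv is None else M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p        # next conjugate direction
        rz = rz_new
    return x
```

As the post says, nearly all the time goes into `A @ p` and the dot
products, which NumPy dispatches to compiled code. A Jacobi (diagonal)
preconditioner, for example, is just `M_inv=lambda r: r / np.diag(A)`.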
From: David Ascher <firstname.lastname@example.org>
To: Janne Sinkkonen <email@example.com>
Cc: firstname.lastname@example.org <email@example.com>
Date: Monday, March 15, 1999 9:46 AM
Subject: Re: [Matrix-SIG] Nonlinear optimization routines anyone?
>> does anybody have neatly packaged nonlinear optimization routines
>> implemented in NumPy? I'd like to have either conjugate gradients or
>> BFGS. Explicitly calculating the Hessian is out of the question (because
>> of the size and complexity of the problem).
>> I thought of translating part of the Matlab codes of C. T. Kelley
>> (http://www4.ncsu.edu/eos/users/c/ctkelley/www/matlab_darts.html) to
>> Numerical Python, unless something already implemented emerges.
>I've never found a routine to package, and I've looked a fair bit. There's
>COOOL (http://timna.mines.edu/cwpcodes/coool/) which looked interesting,
>but it seemed extremely non-portable (couldn't get it to compile on Win32,
>requires expect, etc. etc.).
>Matrix-SIG maillist - Matrix-SIG@python.org