[SciPy-dev] Derivative-based nonlinear optimization with linear inequality constraints

dmitrey openopt at ukr.net
Tue May 22 02:08:09 EDT 2007


Bill Baxter wrote:
> There don't seem to be any optimizers in SciPy that have all of these
> attributes:
> 1) nonlinear objective function
> 2) uses analytical derivatives
> 3) allows (non-)linear (in)equality constraints
>   
So your message differs quite a bit from the thread topic, "Derivative-based
nonlinear optimization with linear inequality constraints".
> Did I just miss it?  The only things I see are methods that can only
> handle simple bounds, or don't make use of derivative info.
>   
I seem to recall that scipy.optimize has solver(s) that can handle
nonlinear inequality constraints (with derivatives), but not equality
constraints. Only a very small percentage of NLP solvers are able to handle
equality constraints. Please also note that many SQP/rSQP solvers can
handle equality but not inequality constraints, so the number of NLP
solvers capable of handling both equality and inequality constraints is
very small; these are usually based on the linearization method. AFAIK
there is the free IPOPT from the COIN-OR project, written in C++ (but it
has a GPL license), and some commercial ones, for example KNITRO. There
was a Ukrainian academician, Pshenichniy, who had spent dozens of years
researching linearization-based optimization methods, but he died some
months ago.
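
(For the record: both kinds of constraints, with analytic derivatives, can
be handled from scipy.optimize in versions released after this thread, via
its SLSQP wrapper. A minimal sketch, assuming a current SciPy:

    import numpy as np
    from scipy.optimize import minimize

    # min x0^2 + x1^2  s.t.  x0 + x1 >= 1 (inequality),  x0 = x1 (equality)
    f  = lambda x: x[0]**2 + x[1]**2
    df = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])   # analytic gradient

    cons = [{'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1.0,
             'jac':  lambda x: np.array([1.0, 1.0])},
            {'type': 'eq',   'fun': lambda x: x[0] - x[1],
             'jac':  lambda x: np.array([1.0, -1.0])}]

    res = minimize(f, x0=[2.0, 0.0], jac=df, constraints=cons, method='SLSQP')
    print(res.x)   # both constraints active: approximately [0.5, 0.5]

)
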
BTW, in his book "Linearization Method" (1983) he makes a very important
observation:
"many years of practice in solving optimization problems have led
specialists to think that there cannot be a universal NLP algorithm that
solves ALL problems successfully enough" (he agreed with that statement,
and so do I). I would explain the statement with my own arguments, but
that would require a lot of English-language text, so let me omit it here.
And some lines below, Pshenichniy continues:
"Practical experience shows that there are sets of problems where an
algorithm that seems the most ineffective from the theoretical point of
view turns out to yield good results, due to the specifics of the problem.
Hence, a number of different algorithms is required."
An alternative approach (instead of constructing a single universal
solver, as some do) is the one taken by TOMLAB (and by my OpenOpt): assign
the problem to a single object, then try different solvers on it; see the
sketch below. And, of course, observe convergence pictures, because a
single output <f_opt, x_opt, time_elapsed> can be very uninformative in
many cases.
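
A minimal sketch of that pattern (hypothetical class and method names, not
TOMLAB's or OpenOpt's real API), using SciPy solvers as the back ends:

    import numpy as np
    from scipy.optimize import minimize

    class Problem:
        # One problem object that can be handed to several solvers in turn.
        def __init__(self, f, x0, jac=None, constraints=()):
            self.f = f
            self.x0 = np.asarray(x0, dtype=float)
            self.jac = jac
            self.constraints = constraints

        def try_solvers(self, methods):
            # Run each solver on the same problem and collect the outcomes,
            # so they can be compared (ideally alongside convergence plots).
            results = {}
            for m in methods:
                r = minimize(self.f, self.x0, jac=self.jac,
                             constraints=self.constraints, method=m)
                results[m] = (r.fun, r.x, r.success)
            return results

    # Rosenbrock with an inequality constraint x0 + x1 >= 1:
    p = Problem(lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2,
                x0=[0.0, 0.0],
                constraints=[{'type': 'ineq',
                              'fun': lambda x: x[0] + x[1] - 1.0}])
    print(p.try_solvers(['SLSQP', 'COBYLA']))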

Regards, D

> BFGS isn't that difficult to adapt to accept linear constraints using
> the active set approach and Lagrange multipliers.
>
> I have a BFGS routine I've been using for a while based on code from the book:
>   //    Optimization of Dynamic Systems
>   //    by Sunil Agrawal and Brian Fabien
>   //    http://abs-5.me.washington.edu/pub/fabien/book/code.tar.gz
>
> The code seems to be AWOL now, but I'm sure I have a copy I could dig
> up if anyone's interested.  Or there's my C++ adaptation of it, that I
> do have handy.
>
> Given a working BFGS implementation, the basic idea with active set is
> simple though. Either you're pinned against a constraint, and treating
> it as an equality constraint (active), or the constraint is off
> (inactive).  If your line search leads you away from an active
> constraint, you make it inactive.  If it takes you up against an
> inactive constraint, make it active.  The active constraints can be
> handled using Lagrange multipliers.
>
>
> --bb
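
The activate/deactivate bookkeeping Bill describes can be shown for the
simplest case, bound constraints x >= lb, with plain gradient steps
standing in for BFGS. A toy sketch (hypothetical names; at an active lower
bound the Lagrange multiplier is just the pinned gradient component, so a
negative component means that constraint should be released):

    import numpy as np

    def active_set_descent(grad, x0, lb, alpha=0.1, steps=200):
        # Toy active-set loop for: min f(x)  s.t.  x >= lb  (elementwise).
        x = np.asarray(x0, dtype=float).copy()
        active = np.isclose(x, lb)          # bounds we start out pinned on
        for _ in range(steps):
            g = grad(x)
            # Deactivate: release bounds whose multiplier (lambda_i = g_i)
            # would be negative, i.e. the descent direction leads away.
            active[g < 0] = False
            d = -g
            d[active] = 0.0                 # step only in the free subspace
            x_new = x + alpha * d
            # Activate: the step ran into an inactive bound -> clip and pin.
            hit = ~active & (x_new <= lb)
            x_new = np.maximum(x_new, lb)
            active |= hit
            x = x_new
        return x, active

    # min |x - (-1, 2)|^2 with x >= 0: optimum (0, 2), first bound active.
    grad = lambda x: 2.0 * (x - np.array([-1.0, 2.0]))
    print(active_set_descent(grad, x0=[1.0, 1.0], lb=[0.0, 0.0]))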