[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Michael McNeil Forbes mforbes at physics.ubc.ca
Sat Apr 14 15:45:23 EDT 2007


On 14 Apr 2007, at 8:59 AM, Matthieu Brucher wrote:

> Good point ;)
> I could make those changes, save for the f or func part... Using an
> object to optimize is, to me, better than a collection of functions,
> although a collection of functions can be made into an object if
> needed.
>
>
> For the interface, I suppose that assembling an optimizer is not
> something everybody will want to do; that is why some optimizers are
> provided out of the box in MatLab toolboxes, for instance. But being
> able to customize an optimizer rapidly can be a real advantage over
> all the other optimization packages.

And one can easily make convenience functions which take standard
arguments and package them internally.  I think the interface should
be flexible enough that users can just call the optimizers with a few
standard arguments, as they are used to, but also "build" more
complicated or more customized optimizers as they need.  It would
also be nice if an optimizer could be "tuned" to a particular problem
(i.e. have a piece of code that tries several algorithms and
parameter values to see which is fastest).
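
For example, a convenience wrapper might look something like the
following rough sketch (the names fmin_custom, fixed_step and
gradient_criterion are made up purely for illustration, not a
proposed API):

import numpy as np

def fixed_step(f, grad, x, alpha=1e-2):
    # one possible "step" strategy: a plain gradient step of fixed size
    return x - alpha * grad(x)

def gradient_criterion(grad, x, gtol):
    # one possible stopping criterion: small gradient norm
    return np.linalg.norm(grad(x)) < gtol

def fmin_custom(f, grad, x0, step=fixed_step, criterion=gradient_criterion,
                gtol=1e-6, maxiter=2000):
    # convenience wrapper: standard arguments on the outside,
    # the customizable pieces assembled (or overridden) on the inside
    x = np.asarray(x0, dtype=float)
    for _ in range(maxiter):
        if criterion(grad, x, gtol):
            break
        x = step(f, grad, x)
    return x

# called like any classic fmin-style routine:
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
print(fmin_custom(f, grad, [0.0, 0.0]))   # ~ [1. 2.]

The defaults make it behave like a plain fmin-style call, while the
step and criterion keyword arguments are the hooks for building
something more customized.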

> One of the members of the lab I am studying in told me that he did
> not see whether such modularization was pertinent. For his
> application (warping an image) he used a Levenberg-Marquardt
> optimizer with constraints, and the line search was performed with
> interval analysis. Until some days ago, I thought that he was right,
> that only some optimizers can be expressed in "my" framework. Now I
> think that even his optimization could be expressed, and if he
> wanted to modify something in the optimizer, it would be much
> simpler with this architecture, in Python, than what he has now, in
> C. He wrote some things very specific to his function, as a lot of
> people would want to do but couldn't with a fixed interface like
> MatLab's; yet in fact a lot of it could be expressed in terms of a
> specific step, a specific line search, a specific criterion and a
> specific function/set of parameters.
>
> Until some time ago, I thought that modules with criteria, steps
> and optimizers would be enough; now I think I missed the fact that
> a lot of optimizers share the line search, and that it should be
> another module.

My immediate goal is to try and get the interface and module  
structure well defined so that I know where to put the pieces of my  
Broyden code when I rip it apart.
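
To make sure we are picturing the same structure, here is a minimal
sketch of how I imagine the modules fitting together (all class and
method names are placeholders, not a proposal; a Broyden-type step
would just be another implementation of the same "step" interface,
keeping its approximate Jacobian as internal state):

import numpy as np

class Function:
    """Hypothetical "function" object: value and gradient in one place."""
    def __init__(self, f, grad):
        self.f, self.grad = f, grad
    def __call__(self, x):
        return self.f(x)
    def gradient(self, x):
        return self.grad(x)

class GradientStep:
    """One possible "step" module: the steepest-descent direction."""
    def __call__(self, fun, x):
        return -fun.gradient(x)

class BacktrackingLineSearch:
    """A simple shared "line search" module (Armijo backtracking)."""
    def __init__(self, c=1e-4, tau=0.5, alpha0=1.0):
        self.c, self.tau, self.alpha0 = c, tau, alpha0
    def __call__(self, fun, x, direction):
        alpha, fx = self.alpha0, fun(x)
        slope = np.dot(fun.gradient(x), direction)
        while fun(x + alpha * direction) > fx + self.c * alpha * slope:
            alpha *= self.tau
        return alpha

class GradientCriterion:
    """A "criterion" module: stop when the gradient is small."""
    def __init__(self, gtol=1e-6):
        self.gtol = gtol
    def __call__(self, fun, x):
        return np.linalg.norm(fun.gradient(x)) < self.gtol

class Optimizer:
    """Generic driver that only knows about the modules it is given."""
    def __init__(self, step, line_search, criterion, maxiter=1000):
        self.step, self.line_search = step, line_search
        self.criterion, self.maxiter = criterion, maxiter
    def optimize(self, fun, x0):
        x = np.asarray(x0, dtype=float)
        for _ in range(self.maxiter):
            if self.criterion(fun, x):
                break
            d = self.step(fun, x)
            alpha = self.line_search(fun, x, d)
            x = x + alpha * d
        return x

# usage on a quadratic bowl with minimum at (1, 2)
fun = Function(lambda x: (x[0] - 1)**2 + (x[1] - 2)**2,
               lambda x: np.array([2.0 * (x[0] - 1), 2.0 * (x[1] - 2)]))
opt = Optimizer(GradientStep(), BacktrackingLineSearch(), GradientCriterion())
print(opt.optimize(fun, [0.0, 0.0]))   # ~ [1. 2.]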

One question about coupling: useful criteria for globally convergent
algorithms include testing the gradients and/or curvature of the
function.  In the Broyden algorithm, for example, these would be
maintained by the "step" object, but the criterion object would need
access to them.  Likewise, if a "function" object can compute its own
derivatives, then the "criterion" object should access them from
there.

Any ideas on how to deal with these couplings?  Perhaps the
"function" object should maintain all the state (approximate
Jacobians etc.).
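
For instance, something along these lines, where the "function"
object caches its last evaluations (and could hold an approximate
Jacobian), and the criterion only reads that shared state instead of
talking to the step object directly (again, names and interface are
purely illustrative):

import numpy as np

class CachingFunction:
    """Hypothetical "function" object that owns all shared state."""
    def __init__(self, f, grad):
        self._f, self._grad = f, grad
        self.state = {}   # last x, f(x), gradient, approximate Jacobian, ...
    def __call__(self, x):
        fx = self._f(x)
        self.state.update(x=np.asarray(x, dtype=float), f=fx)
        return fx
    def gradient(self, x):
        g = self._grad(x)
        self.state["grad"] = g
        return g
    def update_jacobian(self, J):
        # a Broyden-style step object would push its running approximation here
        self.state["jacobian"] = J

class GradientNormCriterion:
    """Criterion that only reads the shared state, with no direct
    coupling to the step object."""
    def __init__(self, gtol=1e-6):
        self.gtol = gtol
    def __call__(self, fun):
        g = fun.state.get("grad")
        return g is not None and np.linalg.norm(g) < self.gtol

# usage sketch
fun = CachingFunction(lambda x: float(np.dot(x, x)),
                      lambda x: 2.0 * np.asarray(x, dtype=float))
crit = GradientNormCriterion(gtol=1e-3)
fun.gradient([0.0001, 0.0])   # a step object would normally trigger this
print(crit(fun))              # True: cached gradient norm is below gtol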

Michael.



