[Numpy-discussion] N dimensional dichotomy optimization

Gael Varoquaux gael.varoquaux at normalesup.org
Mon Nov 22 16:57:35 EST 2010


On Mon, Nov 22, 2010 at 09:12:45PM +0100, Matthieu Brucher wrote:
> Hi ;)

Hi bro

> > does anybody have, or knows where I can find some N dimensional
> > dichotomy optimization code in Python (BSD licensed, or equivalent)?

> I don't know of any code, but it shouldn't be too difficult by going
> through a KdTree.

I am not working in terribly high-dimensional spaces, so I don't really
need a KdTree (but we do happen to have a nice BallTree available in
scikit-learn, so I could use it just to play :>).
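
For the record, querying the BallTree really is just a couple of lines.
A minimal sketch, assuming the current sklearn.neighbors import path and
using random placeholder data:

    import numpy as np
    from sklearn.neighbors import BallTree

    rng = np.random.RandomState(0)
    X = rng.rand(100, 3)        # placeholder: 100 points in a 3-d space

    tree = BallTree(X)
    # distances and indices of the 5 nearest neighbours of the first point
    dist, ind = tree.query(X[:1], k=5)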

> > Worst case, it does not look too bad to code, but I am interested in
> > any advice. I haven't done my reading yet, and I don't know how
> > ill-posed a problem it is. I had in mind starting from a set of
> > points and iterating the computation of the objective function's
> > value at the barycenters of these points, and updating this list of
> > points. This does raise a few questions about what the best possible
> > updates are.
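
To fix ideas, here is one possible concrete rendering of that scheme. It
is only a rough sketch, not code I have actually run: the objective f is
a placeholder, and the pairwise-midpoint / keep-the-best update rule is
just one arbitrary choice among the possible updates mentioned above.

    import numpy as np

    def refine(points, f, n_iter=20):
        """Evaluate f at the barycenters of pairs of current points and
        keep the best candidates, shrinking the set towards a minimum."""
        points = np.asarray(points, dtype=float)
        n = len(points)
        for _ in range(n_iter):
            # barycenters (midpoints) of every pair of current points
            mids = np.array([0.5 * (points[i] + points[j])
                             for i in range(n) for j in range(i + 1, n)])
            candidates = np.vstack([points, mids])
            values = np.array([f(p) for p in candidates])
            # keep the n best candidates (assuming we are minimizing f)
            points = candidates[np.argsort(values)[:n]]
        # candidates were kept in ascending order of f, so the first
        # point is the best one found
        return points[0]

    # e.g. start from the corners of a box in a 2-parameter space
    best = refine(np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]]),
                  f=lambda p: np.sum((p - 0.3) ** 2))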

> In this case, you may want to check the Nelder-Mead algorithm (also
> known as the downhill simplex or polytope method), which is available
> in scikits.optimization, but there are other implementations out there.
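
For reference, scipy's fmin is also a Nelder-Mead (downhill simplex)
implementation, so trying it out only takes a couple of lines. A minimal
sketch, with a made-up objective and starting point standing in for the
real ones:

    import numpy as np
    from scipy.optimize import fmin

    # placeholder objective: a smooth bowl plus some local roughness
    def objective(x):
        return np.sum((x - 0.3) ** 2) + 0.01 * np.sum(np.sin(50 * x))

    x0 = np.zeros(3)                  # starting point, 3 parameters
    xmin = fmin(objective, x0, xtol=1e-4, ftol=1e-4, disp=False)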

Interesting reference. I had never looked at the Nelder-Mead algorithm.
I am wondering if it does what I want, though.

The reason I am looking at dichotomy optimization is that the objective
function I want to optimize has local roughness, but is globally pretty
much bell-shaped. Dichotomy looks like it will get quite close to the top
of the curve (and I have been told that it works well on such problems).
One thing that is nice about dichotomy for my needs is that it is not
gradient-based, and it works on a convex subset of the parameter space.

Will Nelder-Mead display such properties? It seems so to me, but I don't
trust my quick read-through of the Wikipedia page.

I realize that maybe I should rephrase my question to try and draw more
out of the common wealth of knowledge on this mailing list: what do
people suggest for tackling this problem? Guided by Matthieu's
suggestion, I have started looking at Powell's algorithm, and given the
introduction of www.damtp.cam.ac.uk/user/na/NA_papers/NA2007_03.pdf I am
wondering whether I should not investigate it. Can people provide any
insights on these problems?
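
In the meantime, scipy's fmin_powell gives a cheap way to try a
derivative-free Powell-type method (it implements Powell's classic
conjugate-direction search, not NEWUOA). Again a minimal sketch, with a
made-up objective standing in for the real cross-validation score:

    import numpy as np
    from scipy.optimize import fmin_powell

    # placeholder objective: globally bell-shaped, locally rough
    def objective(x):
        return np.sum((x - 0.3) ** 2) + 0.01 * np.sum(np.sin(50 * x))

    xmin = fmin_powell(objective, np.zeros(3), xtol=1e-4, disp=False)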

Many thanks,

Gael

PS: The reason I am looking at this optimization problem is that I got
tired of watching grid searches optimize the cross-validation score of
my 3-parameter estimator (not in scikit-learn, because it is way too
specific to my problem).


