[Numpy-discussion] N dimensional dichotomy optimization

Paul Anton Letnes paul.anton.letnes at gmail.com
Sun Nov 28 03:46:16 EST 2010


> 
> 
> On Tue, Nov 23, 2010 at 3:37 AM, Sebastian Walter <sebastian.walter at gmail.com> wrote:
> On Tue, Nov 23, 2010 at 11:17 AM, Gael Varoquaux
> <gael.varoquaux at normalesup.org> wrote:
> > On Tue, Nov 23, 2010 at 11:13:23AM +0100, Sebastian Walter wrote:
> >> I'm not familiar with dichotomy optimization.
> >> Several techniques have been proposed to solve the problem: genetic
> >> algorithms, simulated annealing, Nelder-Mead and Powell.
> >> To be honest, I find it quite confusing that these algorithms are
> >> named in the same breath.
> >
> > I am confused too. But that stems from my lack of knowledge in
> > optimization.
> >
> >> Do you have a continuous or a discrete problem?
> >
> > Both.

I would like to put in a word for genetic algorithms. In my experience, they are the most powerful of the optimization techniques mentioned here. In particular, they are good at escaping local minima, and they do not care whether your problem is discrete or continuous. As long as you can devise a good representation and good genetic operators, you should be in good shape. I have only a little experience with pyevolve myself, but it is really easy to experiment with GAs there, since you only have to make a few settings to get going.

One word of warning: GA performance is very sensitive to the parameters you choose! In particular, think about the mutation rate, the crossover rate, the selection protocol, and the number of crossover points. (This list came off the top of my head...)
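To make those knobs concrete, here is a minimal pure-Python sketch (not pyevolve; all names and parameter values are illustrative) of a GA on the toy "maximize the number of 1-bits" problem, using tournament selection, one-point crossover, per-bit mutation, and elitism:

```python
import random

def evolve_onemax(length=30, pop_size=40, generations=60,
                  crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Toy GA maximizing the number of 1-bits in a bitstring.

    Illustrative only: the parameters exposed here are exactly the
    knobs mentioned above (mutation rate, crossover rate, selection
    via tournaments). Elitism keeps the best individual each generation.
    """
    rng = random.Random(seed)
    fitness = sum  # fitness = number of 1-bits in the genome

    # Random initial population of bitstrings.
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]

    def tournament(k=3):
        # Selection protocol: best of k randomly drawn individuals.
        return max(rng.sample(pop, k), key=fitness)

    for _ in range(generations):
        best = max(pop, key=fitness)
        new_pop = [best[:]]  # elitism: best survives unchanged
        while len(new_pop) < pop_size:
            a, b = tournament(), tournament()
            if rng.random() < crossover_rate:
                point = rng.randrange(1, length)  # one crossover point
                child = a[:point] + b[point:]
            else:
                child = a[:]
            # Per-bit mutation: flip each bit with prob. mutation_rate.
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]
            new_pop.append(child)
        pop = new_pop

    return max(pop, key=fitness)

best = evolve_onemax()
print(sum(best))
```

With a fixed seed the run is reproducible, which is handy when sweeping mutation and crossover rates to see how sensitive the outcome really is.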

If you have any GA questions, ask, and perhaps I can come up with an answer.

Paul.

