[SciPy-Dev] Stochastic branch and bound optimization algorithm
Benoit Rosa
b.rosa at unistra.fr
Fri Jun 9 10:46:07 EDT 2017
Hi all,
I recently implemented a stochastic branch and bound algorithm in Python
for one of my projects (namely, 3D registration without an initial
guess, which requires global optimization over the SE(3) group).
After trying basinhopping without much success, I went for a modified
stochastic branch and bound algorithm, described in the two papers below
(disclaimer: I'm an author on one of them):
[1] C. Papazov and D. Burschka, Stochastic global optimization for
robust point set registration, Comput. Vis. Image Understand. 115 (12)
(2011) 1598-1609
[2] C. Gruijthuijsen, B. Rosa, P.T. Tran, J. Vander Sloten, E. Vander
Poorten, D. Reynaerts, An automatic registration method for
radiation-free catheter navigation guidance, J. Med. Robot. Res. 1 (3)
(2016) 1640009
Since I wanted to compare with other optimization algorithms, I
implemented it by modifying the basinhopping file, so my implementation
(git commit here:
https://github.com/benoitrosa/scipy/commit/49a2c23b74b69dc4250e20e21db75bd071dfd92d
) is already fully compatible with SciPy. I have also added a bit of
documentation in the file. To give an idea, on my project (for which I
can't share the code right now), this algorithm finds the global optimum
over the SO(3) space (i.e. rotations) in 4.5 seconds on average, whereas
basinhopping takes more than 15 seconds and doesn't necessarily converge
to the correct solution.
More about the algorithm itself: it's a standard branch and bound
algorithm (i.e. recursive traversal of a tree, bisecting leaves to find
an optimum), with two additions (a rough Python sketch of both ideas
follows the list):
1 - The tree is traversed stochastically. This is governed by a
temperature parameter, much like in simulated annealing. At each node,
there is a probability of moving towards a "less promising" branch
(i.e. one with a higher cost function value), and this probability is
controlled by the temperature. In the beginning, "bad" branches are
also selected, ensuring exploration of large portions of the search
space. As the number of iterations increases, the temperature decreases
and the algorithm moves towards the most promising branches/leaves,
achieving higher resolution in that part of the space.
2 - When a new leaf is created and the cost function is evaluated
there, the value is propagated towards the root, until reaching either
the root or a node with a lower cost. This enables a smart traversal of
the tree. Moreover, it guarantees that at any time the root node holds
the lowest cost found so far, making it easy to query for the best
solution if the search is interrupted.
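To make those two points concrete, here is a rough, self-contained
Python sketch of the idea. It is only my simplified illustration, not
the code from the commit above: the node structure, the function names
and the geometric cooling schedule are all assumptions made for the
example.

import numpy as np

class _Node:
    def __init__(self, lower, upper, parent=None):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)
        self.parent = parent
        self.children = []
        self.best_x = None       # best sample seen in this subtree
        self.best_cost = np.inf

def _propagate(node, x, cost):
    # Push the new value towards the root, stopping at a node that is
    # already better; the root therefore always holds the current optimum.
    while node is not None and cost < node.best_cost:
        node.best_x, node.best_cost = x, cost
        node = node.parent

def sbb_minimize(func, bounds, niter=1000, t_start=1.0, t_stop=1e-3, seed=None):
    rng = np.random.default_rng(seed)
    lower, upper = map(np.asarray, zip(*bounds))
    root = _Node(lower, upper)
    x0 = rng.uniform(root.lower, root.upper)   # uniform sampling by default
    _propagate(root, x0, func(x0))

    for k in range(niter):
        # Geometric cooling from t_start down to t_stop.
        temp = t_start * (t_stop / t_start) ** (k / max(niter - 1, 1))

        # 1 - Stochastic traversal: sometimes follow the "less promising"
        # child, with a probability controlled by the temperature.
        node = root
        while node.children:
            good, bad = sorted(node.children, key=lambda n: n.best_cost)
            w_bad = np.exp(-(bad.best_cost - good.best_cost) / max(temp, 1e-12))
            node = bad if rng.random() < w_bad / (1.0 + w_bad) else good

        # Bisect the reached leaf along its longest edge, sample one new
        # point in each half, and
        # 2 - propagate the new costs towards the root.
        dim = int(np.argmax(node.upper - node.lower))
        mid = 0.5 * (node.lower[dim] + node.upper[dim])
        for lo_d, hi_d in ((node.lower[dim], mid), (mid, node.upper[dim])):
            lo, hi = node.lower.copy(), node.upper.copy()
            lo[dim], hi[dim] = lo_d, hi_d
            child = _Node(lo, hi, parent=node)
            x = rng.uniform(child.lower, child.upper)
            _propagate(child, x, func(x))
            node.children.append(child)

    return root.best_x, root.best_cost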
Another advantage of this algorithm is that parameter tuning is fairly
easy. It only needs the bounds of the search space (no initial guess), a
sampling function (uniform by default) and, of course, a cost function.
The additional parameters are the number of iterations and the start and
stop temperatures (the start temperature should be high, so that many
"bad" branches are accepted early on, and the stop temperature should
tend towards zero).
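For instance, with the toy sketch above, a call could look like the
following (the Rosenbrock function is just a stand-in cost function, and
the interface of the actual commit may differ):

# Toy usage of the sbb_minimize sketch above: only bounds, a cost
# function and a temperature schedule are needed; no initial guess.
def rosenbrock(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

x_best, f_best = sbb_minimize(rosenbrock,
                              bounds=[(-5.0, 5.0), (-5.0, 5.0)],
                              niter=2000,
                              t_start=1.0,   # high: many "bad" branches accepted early
                              t_stop=1e-3,   # near zero: greedy refinement at the end
                              seed=0)
print(x_best, f_best)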
If this algorithm is of interest to the SciPy community, I'd be happy to
clean up the code and make a pull request!
Best,
Benoit