[SciPy-Dev] feedback on adding global optimiser methods to `minimize`
Stefan Endres
stefan.c.endres at gmail.com
Sun Sep 8 16:15:09 EDT 2019
I agree that a single unified function for all the routines would be neater,
since all the algorithms take black-box functions as input anyway. I
believe that any extra optional arguments to the global optimisation
functions not already defined by `minimize` can be handled with the
`options` dictionary passed to `minimize`.
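To illustrate (the `minimize(method='differential-evolution')` spelling in the comment below is the proposed, hypothetical API, not current SciPy), solver-specific parameters such as `popsize` would simply move from keyword arguments into `options`:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Simple 2-D sphere function with the minimum at (0.5, 0.5).
def sphere(x):
    return np.sum((x - 0.5) ** 2)

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

# Today: solver-specific parameters are plain keyword arguments.
res = differential_evolution(sphere, bounds, popsize=20, tol=1e-8)

# Under the proposal, the same settings could travel through `options`
# (hypothetical API, not in SciPy today):
# res = minimize(sphere, x0, bounds=bounds, method='differential-evolution',
#                options={'popsize': 20, 'tol': 1e-8})
print(res.x, res.fun)
```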
One additional thought is the possibility of adding a unified `Minimizer`
class to scipy.optimize. There are two reasons for this:
(i) Manual control of iterations and/or memoization of progress. For
example, the stopping criterion for a global optimisation problem is
sometimes unknown in advance, and the practitioner would like to continue
iterating according to some external criterion that cannot be expressed
inside the algorithm itself (something `shgo` users often request from me
by e-mail). Another example is objective functions with very long
evaluation times, where the practitioner would like to save progress,
possibly including all evaluations and the current best solution(s), and
then resume the routine later.
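A minimal sketch of what (i) could look like. The class name `StepwiseMinimizer` is hypothetical, and restarting `minimize` with `maxiter=1` is only a stand-in for true per-iteration control inside the solver (the restart discards the solver's internal state):

```python
from scipy.optimize import minimize

class StepwiseMinimizer:
    """Hypothetical sketch: advance a local solver one iteration at a
    time, so the caller can apply an external stopping rule and keep a
    record of progress.  Restarting from the last iterate loses the
    solver's internal state, so this only approximates real stepping."""

    def __init__(self, fun, x0, method='L-BFGS-B', bounds=None):
        self.fun = fun
        self.x = x0
        self.method = method
        self.bounds = bounds
        self.history = []  # objective value after each step

    def step(self):
        res = minimize(self.fun, self.x, method=self.method,
                       bounds=self.bounds, options={'maxiter': 1})
        self.x = res.x
        self.history.append(res.fun)
        return res

# External stopping criterion the algorithm itself knows nothing about:
opt = StepwiseMinimizer(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2,
                        [0.0, 0.0])
while len(opt.history) < 15 and (not opt.history or opt.history[-1] > 1e-10):
    opt.step()
print(opt.x, opt.history[-1])
```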
(ii) More importantly, to skip some of the algorithm-selection and
initialisation steps. This is useful when the function is called thousands
to millions of times and only a few arguments/parameters change between
calls while the algorithm selection etc. remains the same. I'm not sure
whether this would have a significant effect for the local routines in
`minimize`, though.
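A sketch of (ii), again with a hypothetical class name: the solver selection and fixed settings are bound once, and each of the many subsequent calls only supplies the data that changed:

```python
from functools import partial
from scipy.optimize import minimize

class ReusableMinimizer:
    """Hypothetical sketch: fix the method, bounds, and options once so
    that many subsequent solves only supply what actually changed."""

    def __init__(self, fun, method='L-BFGS-B', bounds=None, options=None):
        # The 'algorithm selection' work is frozen into one callable.
        self._solve = partial(minimize, fun, method=method,
                              bounds=bounds, options=options or {})

    def __call__(self, x0, args=()):
        return self._solve(x0, args=args)

# Fit the same model to many targets; only `x0` and `args` change per call.
fit = ReusableMinimizer(lambda x, target: (x[0] - target) ** 2)
results = [fit([0.0], args=(t,)) for t in (1.0, 2.5, -4.0)]
print([float(r.x[0]) for r in results])
```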
On Sun, Sep 8, 2019 at 5:57 AM Andrew Nelson <andyfaff at gmail.com> wrote:
> I'd like to gauge the support for adding the global minimizers
> (dual_annealing, shgo, differential_evolution, basinhopping) as new
> `minimize` methods.
>
> The problems that the 'local' and 'global' optimizers are trying to solve
> are very similar, and both are specified in the same way, so my thought is
> that it would be nice to access them via the same unified interface that
> `minimize` offers (without deprecating the `shgo` function, etc.).
>
> It's important that the users are able to understand the distinction
> between local and global optimisation and how they go about finding a
> minimum. I'm hoping that this could be made plain in the documentation.
>
> The change would allow the following:
>
> ```
> # global minimizer
> minimize(func, x0, bounds=bounds, method='differential-evolution',
>          constraints=constraints)
> # local minimizers
> minimize(func, x0, bounds=bounds, method='SLSQP', constraints=constraints)
> minimize(func, x0, bounds=bounds, method='trust-constr',
>          constraints=constraints)
> minimize(func, x0, bounds=bounds, method='L-BFGS-B')
> ```
>
> Please chip in with what your thoughts are, is it a bad idea, good idea,
> etc.
>
> --
> _____________________________________
> Dr. Andrew Nelson
>
>
> _____________________________________
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
--
Stefan Endres (MEng, AMIChemE, BEng (Hons) Chemical Engineering)
Postgraduate Student: Institute of Applied Materials
Department of Chemical Engineering, University of Pretoria
Cell: +27 (0) 82 972 42 89
E-mail: Stefan.C.Endres at gmail.com
St. Number: 11004968