[SciPy-Dev] SciPy-Dev Digest, Vol 191, Issue 6
rlucas7 at vt.edu
rlucas7 at vt.edu
Tue Sep 10 19:06:37 EDT 2019
I agree, my opinion is different optimization methods should have different functions.
I’m not sure if the current minimize method via callable() would handle global but if so all the more reason to not monolith.
I’ve answer questions on internal email threads at work about the usage of minimize(), my impression is that although I don’t find it unwieldly, others do and the others seem to be the majority.
Also, conceptually global optimizers are usually differently handled that convex optimizers. Maybe a global optimized callback handler function would work?
-Lucas Roberts
> On Sep 9, 2019, at 8:09 AM, scipy-dev-request at python.org wrote:
>
> Send SciPy-Dev mailing list submissions to
> scipy-dev at python.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://mail.python.org/mailman/listinfo/scipy-dev
> or, via email, send a message with subject or body 'help' to
> scipy-dev-request at python.org
>
> You can reach the person managing the list at
> scipy-dev-owner at python.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of SciPy-Dev digest..."
>
>
> Today's Topics:
>
> 1. Re: feedback on adding global optimiser methods to `minimize`
> (Stefan Endres)
> 2. Re: feedback on adding global optimiser methods to `minimize`
> (Andrew Nelson)
> 3. Re: feedback on adding global optimiser methods to `minimize`
> (Matt Newville)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Sun, 8 Sep 2019 22:15:09 +0200
> From: Stefan Endres <stefan.c.endres at gmail.com>
> To: SciPy Developers List <scipy-dev at python.org>
> Subject: Re: [SciPy-Dev] feedback on adding global optimiser methods
> to `minimize`
> Message-ID:
> <CALhqKPCxFavk+grbtFGf6v16WGs83eHtQCK=kiKAPW-JgkdB7Q at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> I agree that a single unified function for all the routines would be neater
> since all the algorithms have black-box functions as input anyway. I
> believe that any extra optional arguments to the global optimisation
> functions not already defined by `minimize` can be handled with the
> `options` dictionary object passed to the `minimize` function
>
> One additional thought is the possibility of adding a unified `Minimizer`
> class to scipy.optimize. There are two reasons for this:
> (i) Manual control of iterations and/or memoization of progress. For
> example, sometimes the stopping criteria for a global optimisation problem
> is unknown and the optimisation practitioner would like to continue
> iterations according to some external criteria not definable in the
> algorithm itself (this is something that is often requested by `shgo` users
> to me through e-mail correspondence). Another example is with objective
> functions that have very long evaluations and the practitioner would like
> save the progress possibly including all evaluations/current best
> solution(s) and then continue the routine.
> (ii) More importantly to skip a few initiation steps in algorithm selection
> and initiation. This is useful when the function is called thousands to
> millions of times and only a few arguments/parameters need to be changed
> while the algorithm selection etc. remain unchanged. I'm not sure if this
> will have a significant effect for the local routines in `minimize` though.
>
>
>> On Sun, Sep 8, 2019 at 5:57 AM Andrew Nelson <andyfaff at gmail.com> wrote:
>>
>> I'd like to gauge the support for adding the global minimizers
>> (dual_annealing, shgo, differential_evolution, basinhopping) as new
>> `minimize` methods.
>>
>> The problems the the 'local' and 'global' optimizers are trying to solve
>> are very similar and both are specified in the same way, so my thought is
>> that it would be nice to access them via the same unified interface that
>> `minimize` offers (but not deprecating the `shgo` function, etc).
>>
>> It's important that the users are able to understand the distinction
>> between local and global optimisation and how they go about finding a
>> minimum. I'm hoping that this could be made plain in the documentation.
>>
>> The change would allow the following:
>>
>> ```
>> # global minimizer
>> minimize(func, x0, bounds=bounds, method='differential-evolution',
>> constraints=constraints)
>> # local minimizers
>> minimize(func, x0, bounds=bounds, method='SLSQP', constraints=constraints)
>> minimize(func, x0, bounds=bounds, method='trust-constr',
>> constraints=constraints)
>> minimize(func, x0, bounds=bounds, method='L-BFGS-B')
>> ```
>>
>> Please chip in with what your thoughts are, is it a bad idea, good idea,
>> etc.
>>
>> --
>> _____________________________________
>> Dr. Andrew Nelson
>>
>>
>> _____________________________________
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>
>
>
> --
> Stefan Endres (MEng, AMIChemE, BEng (Hons) Chemical Engineering)
> Postgraduate Student: Institute of Applied Materials
> Department of Chemical Engineering, University of Pretoria
> Cell: +27 (0) 82 972 42 89
> E-mail: Stefan.C.Endres at gmail.com
> St. Number: 11004968
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://mail.python.org/pipermail/scipy-dev/attachments/20190908/dbac1d49/attachment-0001.html>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 9 Sep 2019 09:32:01 +1000
> From: Andrew Nelson <andyfaff at gmail.com>
> To: SciPy Developers List <scipy-dev at python.org>
> Subject: Re: [SciPy-Dev] feedback on adding global optimiser methods
> to `minimize`
> Message-ID:
> <CAAbtOZfgUOCe9PnqSBOw6r_zXv=aNnOtwyOjNP3MFWQOeyNpjA at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
>>
>> I agree that a single unified function for all the routines would be
>> neater since all the algorithms have black-box functions as input anyway. I
>> believe that any extra optional arguments to the global optimisation
>> functions not already defined by `minimize` can be handled with the
>> `options` dictionary object passed to the `minimize` function
>>
>
> Thank you for feedback.
>
> One additional thought is the possibility of adding a unified `Minimizer`
>> class to scipy.optimize. There are two reasons for this:
>>
>
> As you may be aware this idea has been brought forward before, for the
> reasons you suggested and more. The idea stalled unfortunately. Because it
> would be a large changeset it's necessary to get everyone on board first
> (via a SciPEP).
> https://github.com/scipy/scipy/pull/8552 (SciPEP proposing it)
> https://github.com/scipy/scipy/pull/8414 (PR exploring the idea)
> Perhaps another way to bring this about is to setup a separate
> implementation project and show it working before it could be incorporated.
> That would give time and freedom for any issues to be ironed out. The
> development of differential_evolution has certainly benefitted from the
> solver being class based, but also from being a private interface, which
> has allowed relatively large changes to be made to its functionality.
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://mail.python.org/pipermail/scipy-dev/attachments/20190909/d3630586/attachment-0001.html>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 9 Sep 2019 07:08:30 -0500
> From: Matt Newville <newville at cars.uchicago.edu>
> To: SciPy Developers List <scipy-dev at python.org>
> Subject: Re: [SciPy-Dev] feedback on adding global optimiser methods
> to `minimize`
> Message-ID:
> <CA+7ESbr22DNS2YM_Z7ku=TK4PV1VGgaH+m3Jv8Z9-jieiK-Nvg at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
>> On Sat, Sep 7, 2019 at 10:57 PM Andrew Nelson <andyfaff at gmail.com> wrote:
>>
>> I'd like to gauge the support for adding the global minimizers
>> (dual_annealing, shgo, differential_evolution, basinhopping) as new
>> `minimize` methods.
>>
>> The problems the the 'local' and 'global' optimizers are trying to solve
>> are very similar and both are specified in the same way, so my thought is
>> that it would be nice to access them via the same unified interface that
>> `minimize` offers (but not deprecating the `shgo` function, etc).
>>
>> It's important that the users are able to understand the distinction
>> between local and global optimisation and how they go about finding a
>> minimum. I'm hoping that this could be made plain in the documentation.
>>
>> The change would allow the following:
>>
>> ```
>> # global minimizer
>> minimize(func, x0, bounds=bounds, method='differential-evolution',
>> constraints=constraints)
>> # local minimizers
>> minimize(func, x0, bounds=bounds, method='SLSQP', constraints=constraints)
>> minimize(func, x0, bounds=bounds, method='trust-constr',
>> constraints=constraints)
>> minimize(func, x0, bounds=bounds, method='L-BFGS-B')
>> ```
>>
>> Please chip in with what your thoughts are, is it a bad idea, good idea,
>> etc.
>>
>> --
>>
>
>
> Personally, I find `minimize()` to be a bit unwieldy I do understand the
> desire to make things simple and uniform, but I'm not sure `minimize()` is
> really doing that. Correct me if any of this is wrong, but the interface
> for the current `minimize()` has the following features:
>
> 1. `minimize` takes 2 required arguments: func, x0. So far, so good.
> 2. There are 14 named "methods" and a "custom object" method (which,
> arguably means that the "global" methods are already supported).
> 3. There are 3 optional arguments used for all methods with sensible
> defaults: "args" , "tol" , and "callback".
> 4. The "method" argument is actually optional with the default used being
> a slightly complicated combination of other optional arguments.
> 5. There are 6 optional arguments that each depends on which "method" is
> used.
> 6. Most but not all of the 14 methods support "jac". 4 of 6 options
> ("hess", "hessp", "bounds", "constraints") are supported by fewer than half
> the methods.
> 7. One of the 6 optional arguments is called "options" and holds keyword
> options specific to each method and not explicitly listed.
> 8. Some of the options in "options" (say, "eps", "ftol") are supported by
> more methods than some of the explicitly named optional arguments (say,
> "constraints").
> 9. "constraints" is supported for 3 methods. It has 2 incompatible forms.
> 10. "bounds" can either be a tuple of (called "min", "max" in the docs)
> values or a "Bounds" object that has attributes "lb" and "ub". Are "lower"
> and "upper" are too verbose?
> 11. As if to stick a finger in your eye, "callback" requires a different
> signature for exactly one of the 14 methods.
>
> It is all very well documented. But it is also extremely complicated.
> And, honestly, not very good.
>
> It seems to me that there are really 14 (or 15) different minimization
> methods. The `minimize()` function smashes these all into one function
> with a jumble of options. I think the idea was to emphasize the
> commonality of the interfaces, but the result sort of emphasizes what is
> not common among the methods. Instead of creating a common way to say
> "Bounds", "Constraints", "Hessian", "Tolerance", "Max Number of Function
> Calls", or "Step Size" for all methods, it allows multiple variations and
> then works extra hard to document the differences.
>
> How does this help the user (or person writing a wrapping library)? Can
> the user just change the name of the method argument? In a simple case
> that might work. But, in general, No, definitely not. Too many of the
> arguments change validity when the `method` is changed. To the extent that
> the user could change "minimize(...., method='foo', ...)" to
> "minimize(...., method='bar', ...)", they could just as easily change
> "_minimize_foo(....)" to "_minimize_bar(...)". That actually is less
> typing.
>
> I guess in a fatalistic sense, I would say "Sure, why not add more
> methods? You aren't going to make it any worse". Except that you will
> want to add more options that are not common to the other methods.
> Differential evolution itself has a dozen or so "strategy" options
> ("strategy" being obviously different from "method", I guess). There are
> "seeds" and "nworkers" and "max number of iterations" (whatever
> "iteration" means for each method). I do not know if any of these have
> uniform meaning; I guess not. It looks to me like "callback" would have
> yet another variation. And, eventually, you will want to deprecate the
> current working functions, breaking downstream code for the sake of a
> uniform API that really is not uniform.
>
> The minimization methods really are different. They have different
> options. IMO, it would be better to have one function per method and avoid
> the a sprawling mess of a dispatch function. It would be nice if the
> interfaces were as similar as possible, with similar concepts using
> consistent names and values.
>
> I would definitely encourage a class-based Minimizer, but mostly in the
> sense of being able to use inheritance to make it easier to have a common
> interface across the different solvers: so not one Minimizer, but a
> BaseMinizer and then PowellMinizer, BFGSMinimizer, KrylovMinimizer, etc. I
> would suggest looking at the stats module and scikit-learn for
> inspiration. And *then* make powell_minimize(), etc functions that create
> and use these. Oh, and a Parameters class would be nice ;).
>
> I have no expectation of being listened too. We'll all be able to work
> around the added complexity, just like we can sort of handle the similarly
> infuriating complexity of `least_squares` (by basically treating it as 3
> separate functions).
>
> Finally, I will say that I am very glad to have `numpy.sin(x)`,
> `numpy.cos(x)`, etc instead of `numpy.ufunc(x, method='sin')`.
>
> Sorry if that seems like a rant.
> Cheers,
>
> --Matt
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://mail.python.org/pipermail/scipy-dev/attachments/20190909/0a157f0c/attachment.html>
>
> ------------------------------
>
> Subject: Digest Footer
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
>
> ------------------------------
>
> End of SciPy-Dev Digest, Vol 191, Issue 6
> *****************************************
More information about the SciPy-Dev
mailing list