`optimize.minimize` offers a choice of many different methods for multivariate scalar minimisation. These methods are chosen using the `method` keyword.
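For reference, a minimal sketch of how the `method` keyword selects a local minimiser today (`rosen` is just a stand-in objective):

```python
import numpy as np
from scipy.optimize import minimize, rosen

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# the same call, switching local minimisers via the `method` keyword
res_nm = minimize(rosen, x0, method='Nelder-Mead')
res_bfgs = minimize(rosen, x0, method='BFGS')
print(res_nm.x, res_bfgs.x)
```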

There are also several global minimisation routines that one can use (`differential_evolution`, `basinhopping`, `dual_annealing`, `shgo`). These minimisers have the same overall objective as `minimize`, just with a different approach to finding a minimum. At the moment the global routines are called individually and are not accessible through the `minimize` function as additional methods. A PR is open at https://github.com/scipy/scipy/pull/10778 which proposes to add a `differential-evolution` method to `minimize` so that it can be reached in the same way. This is a fairly straightforward change, as the call interfaces are almost identical and the problems are posed in similar ways.
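As a sketch of how close the two call interfaces already are (the `method='differential-evolution'` line shows the hypothetical interface the PR would add; it does not exist in released SciPy):

```python
from scipy.optimize import minimize, differential_evolution, rosen

bounds = [(-5, 5)] * 3

# today: the global routine is called as its own function
res_global = differential_evolution(rosen, bounds)

# today: a local minimiser goes through `minimize`
res_local = minimize(rosen, [1.0, 1.0, 1.0], method='L-BFGS-B', bounds=bounds)

# what the PR proposes (hypothetical until merged):
# res_global = minimize(rosen, x0, method='differential-evolution', bounds=bounds)
```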

There are obviously pros and cons to this:

Pros
------
- One could call any of the multivariate scalar minimizers through one function.
- This could simplify user code significantly: code that offers all the different minimizers currently has to use if/elif constructs to call different functions depending on the method requested (see the sketch after this list).
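
A rough sketch of the kind of dispatch code that point refers to; `fit` and its parameters are purely illustrative:

```python
from scipy.optimize import minimize, differential_evolution, dual_annealing, shgo

def fit(objective, x0, bounds, method):
    # illustrative dispatch a downstream library has to write today
    if method == 'differential_evolution':
        return differential_evolution(objective, bounds)
    elif method == 'dual_annealing':
        return dual_annealing(objective, bounds)
    elif method == 'shgo':
        return shgo(objective, bounds)
    else:
        # everything else funnels through `minimize`
        # (bounds apply to bound-constrained local methods such as 'L-BFGS-B')
        return minimize(objective, x0, method=method, bounds=bounds)
```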

Cons
-------
- A user may not appreciate the differences in how local and global minimisers work, e.g. many of the global minimisers are stochastic and some use a local minimiser to polish the end solution (see the sketch after this list).
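
As an illustration (a sketch using the existing `seed` and `polish` keywords of `differential_evolution`), a global run that is stochastic and finishes with a local polish step:

```python
from scipy.optimize import differential_evolution, rosen

bounds = [(-5, 5)] * 3

# stochastic: different seeds can give (slightly) different results;
# polish=True runs a local minimiser on the best population member at the end
res = differential_evolution(rosen, bounds, seed=1, polish=True)
print(res.x, res.fun)
```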

Could we have a discussion about whether people think this is a good or bad idea? Would it confuse users, would it make `minimize` too convoluted, etc.?

A.