[SciPy-Dev] Return residual and derivative in gradient descent methods like Newton

Andrew Nelson andyfaff at gmail.com
Mon Jun 25 18:12:20 EDT 2018


> On Mon, Jun 25, 2018 at 10:30 AM, Mark Alexander Mikofski
> <mikofski at berkeley.edu> wrote:
> >
> > Would anyone disagree or would anyone be interested in a proposal to allow
> > the derivative to be returned from the user-supplied function as an optional
> > second element in gradient search methods like Newton? E.g.
> >
> > >>> f = lambda x, a: (x**3 - a, 3*x**2)
> > >>> newton(f, x0, fprime='f2', args=(a,))
> >
> > Some simple tests show that this may give a 2X speedup in cases where the
> > derivative expression requires the value of the original function call.
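For context, the current `newton` API takes the function and its derivative as separate callables, so any shared subexpressions are evaluated twice per iteration. A minimal sketch with the cube-root example from the proposal (the value `a = 27.0` and starting point are illustrative):

```python
from scipy.optimize import newton

a = 27.0

# Current API: f and fprime are separate callables, so the work they
# share (here trivially x**3 vs 3*x**2, but often expensive common
# subexpressions) cannot be reused between the two calls.
f = lambda x, a: x**3 - a
fprime = lambda x, a: 3 * x**2

root = newton(f, x0=1.0, fprime=fprime, args=(a,))
print(root)  # converges to the real cube root of 27, i.e. ~3.0
```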
The `minimize` function does something like this for the `jac` keyword: "If
*jac* is a Boolean and is True, *fun* is assumed to return the gradient
along with the objective function."
If something like this is desirable, then I suggest using the same pattern
as `minimize`.
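The `minimize` pattern referred to above can be sketched as follows (the quadratic objective here is just a placeholder):

```python
import numpy as np
from scipy.optimize import minimize

# jac=True tells minimize that fun returns (value, gradient) as a pair,
# letting the objective and its gradient share intermediate work.
def fun_and_grad(x):
    f = np.sum(x**2)   # objective value
    g = 2 * x          # gradient, computed in the same call
    return f, g

res = minimize(fun_and_grad, x0=np.array([3.0, -4.0]), jac=True,
               method="BFGS")
print(res.x)  # converges to the minimum near [0, 0]
```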
