Beazley's Problem
Annada Behera
annada at tilde.green
Tue Sep 24 04:25:57 EDT 2024
-----Original Message-----
From: Paul Rubin <no.email at nospam.invalid>
Subject: Re: Beazley's Problem
Date: 09/24/2024 05:52:27 AM
Newsgroups: comp.lang.python
>> def f_prime(x: float) -> float:
>>     return 2*x
>
>You might enjoy implementing that with automatic differentiation (not
>to be confused with symbolic differentiation) instead.
>
>http://blog.sigfpe.com/2005/07/automatic-differentiation.html
Before I knew about automatic differentiation, I thought neural network
backpropagation was magic. Coding up reverse-mode autodiff is a little
trickier than forward-mode autodiff, though.
(a) Forward-mode autodiff takes less space (just a dual component for
every input variable) but needs more time. For a function f: R -> R^m it
computes all the derivatives in O(m^0)=O(1) passes, but it needs O(m)
passes for f: R^m -> R, one per input (see the dual-number sketch after
this list).
(b) Reverse-mode autodiff requires you to build a computation graph,
which takes space, but it is faster in the other direction: for
f: R^m -> R it runs in O(m^0)=O(1) passes, and vice versa (O(m) passes
for f: R -> R^m); a tape-based sketch follows the dual-number one below.
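For concreteness, here is a minimal forward-mode sketch in the spirit of
the dual-number article linked above, applied to the f(x) = x**2 from
Beazley's example. The Dual class and the helper f are my own names,
not taken from the linked post:

from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # primal value
    dot: float   # derivative component, propagated alongside val

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(float(other), 0.0)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def f(x):
    return x * x

# Seed the input's dot with 1.0; the output's dot is then f'(x).
y = f(Dual(3.0, 1.0))
print(y.val, y.dot)   # 9.0 6.0  ->  f(3) = 9, f'(3) = 2*3 = 6

One forward pass gives the derivative with respect to the one seeded
input; with m inputs you would need m such passes, which is the O(m)
cost in (a).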
Almost all neural network training these days uses reverse-mode autodiff,
since a loss is a scalar function of a huge number of parameters
(f: R^m -> R with very large m).
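And here is a rough reverse-mode sketch of the same function, where a
tiny computation graph is recorded during the forward pass and gradients
flow backwards through it afterwards. The Var class is my own strawman,
not how PyTorch or JAX actually implement it:

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value,
                   parents=[(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=[(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # chain rule: push seed * local gradient down to each parent.
        # Plain recursion is good enough for this tiny example; a real
        # implementation would walk the graph in reverse topological order.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = x * x         # forward pass records the graph
y.backward()      # single backward pass fills in x.grad
print(y.value, x.grad)   # 9.0 6.0

One backward pass gives the gradient with respect to every input at
once, which is why it wins for a scalar loss over millions of
parameters, at the cost of keeping the graph (the "tape") in memory.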