[Numpy-discussion] automatic differentiation with PyAutoDiff
srean.list at gmail.com
Thu Jun 14 19:18:16 EDT 2012
> I second James here, Theano does many of those optimizations. Only an
> advanced coder can do better than Theano in most cases, but that will
> take them much more time. If you find some optimization that you do
> and Theano doesn't, tell us. We want to add them :)
I am sure Theano does an excellent job on the expressions that matter.
But I think obtaining the best symbolic reduction of an arbitrary
expression is hard, as in, an AI-hard problem. Correct me if I am wrong
though.
One can come up with perverse corner cases using algebraic or
trigonometric identities, expressions that are hundreds of terms long
but whose derivatives are simple, perhaps even a constant.
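As a toy illustration of the kind of corner case I mean (my own
example, not one from Theano), take sin^2(x) + cos^2(x): the expression
is identically 1, so its derivative is identically 0, but a naive
differentiator that does not spot the identity will dutifully build
2 sin(x) cos(x) - 2 cos(x) sin(x). A quick numerical check:

```python
import math

def f(x):
    # identically 1 by the Pythagorean identity, but a naive
    # differentiator would not know that
    return math.sin(x) ** 2 + math.cos(x) ** 2

def deriv(f, x, h=1e-6):
    # central finite difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# the true derivative is the constant 0 everywhere
for x in (0.0, 0.7, 2.5):
    print(deriv(f, x))
```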
But all that matters is how well it does for the common cases, and I am
hearing that it does extremely well.
I will be happy if it can reduce simple things like the following (a
very common form in Theano's domain):
\phi(x) - \phi(y) - dot(x - y, \grad\phi(y)),
evaluated for \phi(x) = \sum_i (x_i log x_i - x_i), which reduces to
the KL divergence \sum_i x_i log(x_i / y_i) on the set
sum(x) = sum(y) = 1.
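The reduction above can at least be verified numerically. Expanding
\phi(x) - \phi(y) - dot(x - y, \grad\phi(y)) with \grad\phi(y)_i =
log y_i gives \sum_i x_i log(x_i / y_i) - x_i + y_i, and the trailing
terms cancel when sum(x) = sum(y). A small sketch (plain Python rather
than Theano, just to check the algebra):

```python
import math

def phi(x):
    # negative-entropy potential: phi(x) = sum_i (x_i log x_i - x_i)
    return sum(xi * math.log(xi) - xi for xi in x)

def grad_phi(y):
    # componentwise gradient: d phi / d y_i = log y_i
    return [math.log(yi) for yi in y]

def bregman(x, y):
    # phi(x) - phi(y) - dot(x - y, grad phi(y))
    g = grad_phi(y)
    return phi(x) - phi(y) - sum((xi - yi) * gi
                                 for xi, yi, gi in zip(x, y, g))

def kl(x, y):
    # sum_i x_i log(x_i / y_i)
    return sum(xi * math.log(xi / yi) for xi, yi in zip(x, y))

# two points on the simplex: sum(x) = sum(y) = 1
x = [0.2, 0.3, 0.5]
y = [0.1, 0.6, 0.3]
print(abs(bregman(x, y) - kl(x, y)))  # ~0: the two forms agree
```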
In any case, I think this is a digression and I would rather not
pollute this thread with peripheral (nonetheless very interesting)
issues.