Hi all,
I am trying to calculate a Hessian. I am using numdifftools for this ( https://pypi.python.org/pypi/Numdifftools).
My question is: is it possible to do this using pure numpy?
The actual code is like this:
import numdifftools as nd
import numpy as np

def log_likelihood(params):
    sum1 = 0; sum2 = 0
    mu = params[0]; sigma = params[1]; xi = params[2]
    for z in data:
        x = 1 + xi * ((z - mu) / sigma)
        sum1 += np.log(x)
        sum2 += x**(-1.0 / xi)
    return -((-len(data) * np.log(sigma)) - (1 + 1/xi) * sum1 - sum2)  # negated so we can use 'minimum'

kk = nd.Hessian(log_likelihood)
Thanks in advance.
Your function looks fairly simple to differentiate by hand, but if you have access to the gradient (or you estimate it numerically using scipy...), this function might do the job:
def hessian(x, the_func, epsilon=1e-8):
    """Numerical approximation to the Hessian.

    Parameters
    ----------
    x: array-like
        The evaluation point
    the_func: function
        The function. We assume that the function returns the function
        value and the associated gradient as the second return element
    epsilon: float
        The size of the step
    """
    N = x.size
    h = np.zeros((N, N))
    df_0 = the_func(x)[1]
    for i in range(N):
        xx0 = 1. * x[i]
        x[i] = xx0 + epsilon
        df_1 = the_func(x)[1]
        h[i, :] = (df_1 - df_0) / epsilon
        x[i] = xx0
    return h
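For what it's worth, here is a self-contained version of that sketch plus a quick sanity check on a quadratic form f(x) = 0.5 xᵀAx, whose gradient is Ax and whose Hessian is exactly A (the `quad` function and matrix `A` are made up for the test):

```python
import numpy as np

def hessian(x, the_func, epsilon=1e-8):
    """Forward-difference Hessian of a function returning (value, gradient)."""
    N = x.size
    h = np.zeros((N, N))
    df_0 = the_func(x)[1]          # gradient at the base point
    for i in range(N):
        xx0 = 1.0 * x[i]
        x[i] = xx0 + epsilon       # perturb one coordinate
        df_1 = the_func(x)[1]
        h[i, :] = (df_1 - df_0) / epsilon
        x[i] = xx0                 # restore the coordinate
    return h

# Quadratic form f(x) = 0.5 * x^T A x: gradient is A @ x, Hessian is A.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

def quad(x):
    return 0.5 * x @ A @ x, A @ x

H = hessian(np.array([1.0, 2.0]), quad)
# Since the gradient is linear, the estimate should match A to high accuracy.
```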
Jose
On 8 August 2014 08:31, Kiko kikocorreoso@gmail.com wrote:
NumPy-Discussion mailing list NumPy-Discussion@scipy.org http://mail.scipy.org/mailman/listinfo/numpy-discussion
Do it in pure numpy? How about copying the source of numdifftools?
What exactly is the obstacle to using numdifftools? There seem to be no licensing issues. In my experience, it's a crafty piece of work; and calculating a Hessian correctly, accounting for all kinds of nasty floating point issues, is no walk in the park. Even if an analytical derivative isn't too big a pain in the ass to implement, there is a good chance that what numdifftools does is more numerically stable (though in all likelihood much slower).
The only good reason for a specialized solution I can think of is speed; but be aware of what you are trading it in for. If speed is your major concern, though, you really can't go wrong with Theano.
http://deeplearning.net/software/theano/library/gradient.html#theano.gradien...
2014-08-08 16:37 GMT+02:00 Eelco Hoogendoorn hoogendoorn.eelco@gmail.com:
> Do it in pure numpy? How about copying the source of numdifftools?
Of course that is a solution. I was just wondering whether something similar exists in the numpy/scipy packages, so I do not have to use a new third-party library for this.
> What exactly is the obstacle to using numdifftools? There seem to be no licensing issues. In my experience, it's a crafty piece of work; and calculating a Hessian correctly, accounting for all kinds of nasty floating point issues, is no walk in the park. Even if an analytical derivative isn't too big a pain in the ass to implement, there is a good chance that what numdifftools does is more numerically stable (though in all likelihood much slower).
> The only good reason for a specialized solution I can think of is speed; but be aware of what you are trading it in for. If speed is your major concern, though, you really can't go wrong with Theano.
> http://deeplearning.net/software/theano/library/gradient.html#theano.gradien...
Thanks, it seems that NumDiffTools is the way to go.
2014-08-08 11:51 GMT+02:00 Jose Gomez-Dans jgomezdans@gmail.com:
Hi José,
Thanks for the answer.
My idea is to generalise the calculation of the Hessian, not just to differentiate the example I posted, and I was wondering whether NumPy/SciPy already has something similar to what numdifftools provides.
Thanks again.
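For the general case, a minimal pure-numpy sketch is a central second-difference Hessian that needs only function values (no gradient). The `hessian_fd` name and test function below are made up for illustration, and the fixed step size means it lacks the step-size adaptation and extrapolation that numdifftools does, so treat it as illustrative rather than a drop-in replacement:

```python
import numpy as np

def hessian_fd(f, x, h=1e-5):
    """Central-difference Hessian of a scalar function f at point x.

    Plain O(h^2) second differences with a fixed step; far less robust
    than numdifftools.Hessian, but pure numpy.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            ei = np.zeros(n); ei[i] = h   # step along coordinate i
            ej = np.zeros(n); ej[j] = h   # step along coordinate j
            if i == j:
                # Second difference along one axis for the diagonal.
                H[i, i] = (f(x + ei) - 2.0 * f(x) + f(x - ei)) / h**2
            else:
                # Mixed partial from the four corner evaluations.
                H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                           - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h**2)
                H[j, i] = H[i, j]         # Hessian is symmetric
    return H

# Check on f(x) = x0**2 * x1 + 3*x1, whose Hessian at (1, 2)
# is analytically [[4, 2], [2, 0]].
f = lambda x: x[0]**2 * x[1] + 3.0 * x[1]
H = hessian_fd(f, [1.0, 2.0])
```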