[scikit-learn] GPR intervals and MCMC

Michael Eickenberg michael.eickenberg at gmail.com
Tue Nov 8 10:24:01 EST 2016


Dear Alessio,

If it helps, the implementation follows quite closely what is described in
GPML (Rasmussen & Williams): http://www.gaussianprocess.org/gpml/chapters/

https://github.com/scikit-learn/scikit-learn/blob/412996f09b6756752dfd3736c306d46fca8f1aa1/sklearn/gaussian_process/gpr.py#L23
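
Regarding your first point: predict(X, return_std=True) reports the
standard deviation of the GP posterior at X given the fitted
(point-estimate) hyperparameters, and sample_y draws from that same
posterior, so the agreement you observe is expected. A quick sketch of the
comparison on toy data (my own example, not your setup):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy 1-D data, only for illustration.
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, 20)[:, np.newaxis]
y = np.sin(X).ravel() + 0.1 * rng.randn(20)

gpr = GaussianProcessRegressor(kernel=RBF(), alpha=0.1 ** 2).fit(X, y)

X_test = np.linspace(0, 5, 50)[:, np.newaxis]
mean, std = gpr.predict(X_test, return_std=True)

# sample_y draws from the posterior that return_std summarizes, so the
# empirical spread of the draws matches std up to Monte Carlo error.
draws = gpr.sample_y(X_test, n_samples=2000, random_state=0)
print(np.max(np.abs(std - draws.std(axis=1))))  # small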

Hyperparameter optimization is done by maximizing the log marginal
likelihood with a gradient-based optimizer (L-BFGS-B by default), so the
fitted hyperparameters are a point estimate, not a posterior distribution.
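
If you do want to be Bayesian about the hyperparameters, scikit-learn will
not do that for you, but the fitted model exposes
log_marginal_likelihood(theta), which you can hand to an external sampler
(PyMC, emcee, ...) as the log-target. A rough, untested sketch with a toy
random-walk Metropolis sampler and an implicit flat prior, just to show the
idea:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy data again, only for illustration.
rng = np.random.RandomState(1)
X = rng.uniform(0, 5, 20)[:, np.newaxis]
y = np.sin(X).ravel() + 0.1 * rng.randn(20)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=5).fit(X, y)
print(gpr.kernel_)  # point estimate from maximizing the log marginal likelihood

# Random-walk Metropolis over theta = log(hyperparameters). A real analysis
# would use proper priors and a mature sampler instead.
theta = gpr.kernel_.theta.copy()
logp = gpr.log_marginal_likelihood(theta)
samples = []
for _ in range(2000):
    proposal = theta + 0.1 * rng.randn(theta.shape[0])
    logp_prop = gpr.log_marginal_likelihood(proposal)
    if np.log(rng.rand()) < logp_prop - logp:  # Metropolis accept/reject
        theta, logp = proposal, logp_prop
    samples.append(theta)
samples = np.array(samples)  # draws of the log-hyperparameters

Each draw can then be turned into its own predictive distribution, e.g. with
GaussianProcessRegressor(kernel=kernel.clone_with_theta(theta), optimizer=None),
and averaging those predictions gives intervals that also reflect the
hyperparameter uncertainty.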

Michael

On Tue, Nov 8, 2016 at 4:10 PM, Quaglino Alessio <alessio.quaglino at usi.ch>
wrote:

> Hello,
>
> I am using scikit-learn 0.18 for GP regression. I really like it and
> everything works great, but I have some doubts concerning the confidence
> intervals computed by predict(X, return_std=True):
>
> - Are they true confidence intervals (i.e. of the mean / latent function),
> or are they in fact prediction intervals? I tried computing the prediction
> intervals using sample_y(X) and I get the same answer as that returned by
> predict(X, return_std=True).
>
> - My understanding is therefore that scikit-learn is not fully Bayesian,
> i.e. it does not compute probability distributions for the hyperparameters,
> but rather uses the values that maximize the likelihood?
>
> - If I want a fully Bayesian confidence interval, is my best option to use
> an external MCMC sampler such as PyMC?
>
> Thank you in advance!
>
> Regards,
> -------------------------------------------------
> Dr. Alessio Quaglino
> Postdoctoral Researcher
> Institute of Computational Science
> Università della Svizzera Italiana