[SciPy-User] OT: The standard of Bayesian statistics textbooks (rant)

Nathaniel Smith njs at pobox.com
Sat Jun 14 21:42:16 EDT 2014


On 15 Jun 2014 01:26, "Sturla Molden" <sturla.molden at gmail.com> wrote:
>
> Nathaniel Smith <njs at pobox.com> wrote:
>
> > Of course any integral can be written as an expectation, but if you have a
> > tractable general method for computing the expected value of arbitrary
> > distributions then you should publish it and collect your Fields medal.
>
> In this particular case (pseudocode):
>
> for i in range(n):
>    theta[i] ~ p(theta | M), e.g. by Markov Chain Monte Carlo
>    L[i] = p(y | theta[i] )
> p(y | M) = mean( L[burnin:] )

Sure, that looks like it ought to work great when you have a good
sampler for p(theta | M), the theta space is low-dimensional,
p(theta | M) is everywhere on the same order of magnitude as
p(theta | M) p(y | theta, M) (i.e. the likelihood doesn't swing over
many orders of magnitude across the prior, so the average isn't
dominated by a handful of rare draws), and you have a tractable method
of computing p(y | theta, M). Writing down theoretically correct but
intractable Bayesian algorithms is usually easy...
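
For concreteness, here's a minimal sketch of that estimator as I read
it, run on a toy conjugate-Gaussian model where the exact log evidence
has a closed form. The model, the data, and the number of draws are all
invented for illustration, and theta is drawn straight from the prior
rather than via MCMC with burn-in:

import numpy as np
from scipy import stats
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# Toy model M: y_i ~ N(theta, sigma^2) iid, prior theta ~ N(mu0, tau0^2).
sigma, mu0, tau0 = 1.0, 0.0, 2.0
y = rng.normal(loc=1.5, scale=sigma, size=20)   # fake data

# Exact log evidence: y ~ N(mu0 * 1, sigma^2 * I + tau0^2 * 1 1^T).
cov = sigma**2 * np.eye(len(y)) + tau0**2 * np.ones((len(y), len(y)))
exact = stats.multivariate_normal.logpdf(y, mean=np.full(len(y), mu0),
                                         cov=cov)

# Monte Carlo: theta[i] ~ p(theta | M), L[i] = p(y | theta[i]), average.
# Work in log space: the per-draw likelihoods span many orders of magnitude.
n = 100_000
theta = rng.normal(mu0, tau0, size=n)
log_L = stats.norm.logpdf(y[:, None], loc=theta, scale=sigma).sum(axis=0)
estimate = logsumexp(log_L) - np.log(n)         # log of mean(L)

print(exact, estimate)

Even here the estimate only behaves because the prior isn't much wider
than the posterior; stretch tau0 or add dimensions to theta and mean(L)
ends up dominated by a handful of lucky draws.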

Your "particular case" is AFAICT the fully general case of computing
partition functions, i.e. if this worked then all Bayesian models would be
trivial. I'm not sure I've correctly diagnosed all the problems with it,
but I'm pretty sure all Bayesian models are not trivial, so :-).

-n