[SciPy-User] SciPy.stats.kde.gaussian_kde estimation of information-theoretic measures

josef.pktd at gmail.com
Thu Sep 24 09:49:41 EDT 2009


On Thu, Sep 24, 2009 at 7:38 AM, Dan Stowell
<dan.stowell at elec.qmul.ac.uk> wrote:
> Hi -
>
> I'd like to use SciPy.stats.kde.gaussian_kde to estimate
> Kullback-Leibler divergence. In other words, given KDE estimates of two
> different distributions p(x) and q(x) I'd like to evaluate things like
>
>    integral of {  p(x) log( p(x)/q(x) )  }
>
> Is this possible using gaussian_kde? The method
> kde.integrate_kde(other_kde) gets halfway there. Or if not, are there
> other modules that can do this kind of thing?
>
> Thanks for any suggestions
> Dan

I never managed to figure out what integrate_kde and
integrate_gaussian in stats.kde are good for, so if you find any hints
or use cases, I would be very glad to hear them.
Both functions are pure Python and take a sum over the observed
points, so it might be possible to extend them to other cases. But
I'm just guessing, since I never tried to work out the theory behind
these functions without a use case.
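As far as I can tell from the source, integrate_kde computes the integral
of the product of the two density estimates, i.e. the integral of p(x)*q(x),
which is not the KL integral itself. A workaround that doesn't touch those
methods is a plain resubstitution / Monte Carlo estimate: since the KDE for
p is built from samples that (approximately) follow p, you can average
log(p(x)/q(x)) over those samples. A rough sketch (kl_divergence_mc and the
toy data are just made up for illustration):

import numpy as np
from scipy.stats import gaussian_kde

def kl_divergence_mc(p_samples, q_samples):
    # Resubstitution estimate of KL(p || q): fit a gaussian_kde to each
    # sample set and average log(p(x)/q(x)) over the samples defining p.
    p_kde = gaussian_kde(p_samples)
    q_kde = gaussian_kde(q_samples)
    p_vals = p_kde(p_samples)   # p-hat evaluated at its own sample points
    q_vals = q_kde(p_samples)   # q-hat evaluated at the same points
    return np.mean(np.log(p_vals / q_vals))

# toy 1-d check: KL(N(0,1) || N(0.5,1)) should come out near 0.5**2 / 2 = 0.125
x_p = np.random.normal(0.0, 1.0, size=1000)
x_q = np.random.normal(0.5, 1.0, size=1000)
print(kl_divergence_mc(x_p, x_q))

The same idea works in more dimensions by passing (d, n)-shaped arrays to
gaussian_kde, but the bias of the density estimates grows quickly with the
dimension.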

What is the dimension of your x? If it's small enough, numerical
integration might also work.
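In 1-d, for example, something like this (the samples and the integration
limits here are only an illustration):

import numpy as np
from scipy import integrate
from scipy.stats import gaussian_kde

x_p = np.random.normal(0.0, 1.0, size=500)   # samples defining p
x_q = np.random.normal(0.5, 1.2, size=500)   # samples defining q
p_kde = gaussian_kde(x_p)
q_kde = gaussian_kde(x_q)

def integrand(x):
    # p(x) * log(p(x)/q(x)), treating points where p underflows to 0 as 0
    px = p_kde(x)[0]
    qx = q_kde(x)[0]
    return px * np.log(px / qx) if px > 0 else 0.0

# integrate over a range that covers both sample sets
lo = min(x_p.min(), x_q.min()) - 3.0
hi = max(x_p.max(), x_q.max()) + 3.0
kl, abserr = integrate.quad(integrand, lo, hi)
print(kl, abserr)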

scipy.maxentropy might also have useful functions for this, but it's
another package that I haven't looked at in detail yet, and without
more background I didn't understand much when I did look at it.

I hope someone has a better answer, I would also be interested in it.

Josef


>
> --
> Dan Stowell
> Centre for Digital Music
> School of Electronic Engineering and Computer Science
> Queen Mary, University of London
> Mile End Road, London E1 4NS
> http://www.elec.qmul.ac.uk/department/staff/research/dans.htm
> http://www.mcld.co.uk/


