# Mutual information ?

Mikael Olofsson mikael at isy.liu.se
Tue Jan 21 16:04:58 CET 2003

On Tue, 21 Jan 2003 09:08:06 -0500
Peter Hansen <peter at engcorp.com> wrote:
> Let us know what you find.  Better yet, let us know what the heck
> "mutual information" actually is!  ;-)

The mutual information that I am aware of is a concept from information
theory, related to (information) entropy and channel capacity.

A quote from Cover & Thomas, "Elements of Information Theory", p 18:

> [Mutual information] is a measure of the amount of information that
> one random variable contains about another random variable.

Definition:

    I(X;Y) = sum_{x in X} sum_{y in Y} p(x,y) log( p(x,y) / (p(x)p(y)) )

where the log is taken to base 2, so that I(X;Y) is measured in bits,
and where p(*) is the probability of *. If X is the input to a noisy
channel and Y is the corresponding output of that channel, then the
maximum of I(X;Y) - taken over all possible probability distributions
of X - is the capacity of that channel.
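The double sum above is easy to evaluate directly. Here is a small
illustrative sketch (not from the original post) that computes I(X;Y)
in bits from a joint distribution given as a dict of (x, y) -> p(x,y);
the function name and representation are my own choices:

```python
import math

def mutual_information(p_xy):
    """I(X;Y) in bits, given a joint pmf as a dict mapping
    (x, y) pairs to probabilities. Illustrative sketch only."""
    # Compute the marginals p(x) and p(y) from the joint pmf.
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Sum p(x,y) log2( p(x,y) / (p(x)p(y)) ) over pairs with p(x,y) > 0.
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

# X and Y are identical fair bits, so knowing Y tells you X exactly:
# I(X;Y) = H(X) = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # -> 1.0
```

If X and Y are independent, every term has p(x,y) = p(x)p(y), the log
is zero, and I(X;Y) = 0, as one would hope.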

That means that if we try to communicate more independent bits per
channel use than that number (the capacity), then it is impossible to
do so free of errors. Conversely, it is possible to communicate at any
rate below the capacity with a probability of error that is arbitrarily
low. That is the essence of the so-called channel coding theorem.
However, the theorem only states that such codes exist; it does not say
how to construct them. And that is what keeps the scientific fields of
information theory, coding theory, and other related areas alive.
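For a concrete case, the binary symmetric channel (which flips each bit
with probability eps) has the well-known capacity C = 1 - H2(eps),
where H2 is the binary entropy function. A sketch (my own illustration,
not from the post) that also confirms numerically that the uniform
input distribution attains the maximum:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover eps."""
    return 1.0 - h2(eps)

def bsc_mutual_info(q, eps):
    """I(X;Y) for input P(X=1)=q over a BSC: I = H(Y) - H(Y|X)."""
    p1 = q * (1 - eps) + (1 - q) * eps  # P(Y=1)
    return h2(p1) - h2(eps)            # H(Y|X) = h2(eps) for every input

print(bsc_capacity(0.11))  # about 0.5 bits per channel use

# Coarse grid search over input distributions: the maximum of I(X;Y)
# sits at the uniform input q = 0.5, and equals the capacity.
best = max(range(101), key=lambda i: bsc_mutual_info(i / 100, 0.11))
print(best / 100)  # -> 0.5
```

So at 11% bit-flip probability you can still get roughly half a bit of
reliable information through per channel use - the theorem promises a
code achieving that, it just does not hand you one.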

It does take some time to wrap one's mind around that...

/Mikael

-----------------------------------------------------------------------
E-Mail:  mikael at isy.liu.se
WWW:     http://www.dtr.isy.liu.se/dtr/staff/mikael
Phone:   +46 - (0)13 - 28 1343
Telefax: +46 - (0)13 - 28 1339

-----------------------------------------------------------------------