Compression of random binary data
Gregory Ewing
greg.ewing at canterbury.ac.nz
Sat Oct 28 23:31:52 EDT 2017
Steve D'Aprano wrote:
> I don't think that's right. The entropy of a single message is a well-defined
> quantity, formally called the self-information.
>
> https://en.wikipedia.org/wiki/Self-information
True, but it still depends on knowing (or assuming) the
probability of getting that particular message out of
the set of all possible messages.
This is *not* what danceswithnumbers did when he
calculated the "entropy" of his example bit sequences.
He didn't define the set they were drawn from or
what their probabilities were.
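
To make the point concrete: self-information is I(x) = -log2 p(x), so the
"entropy" of a message changes entirely with the assumed distribution. A
minimal sketch (my own illustration, not danceswithnumbers' calculation):

```python
import math

def self_information(p):
    """Self-information (in bits) of an outcome with probability p."""
    return -math.log2(p)

# Under a uniform model over all 256 possible byte values, one
# particular byte carries exactly 8 bits:
print(self_information(1 / 256))   # 8.0

# Under a different model where that same byte has probability 0.5,
# it carries only 1 bit -- same message, different "entropy":
print(self_information(0.5))       # 1.0
```

Without stating which model the bit sequences were drawn from, the
number you compute is arbitrary.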
--
Greg