Dr. Dobb's Python-URL! - weekly Python news and links (Mar 17)
robin at jessikat.fsnet.co.uk
Wed Mar 19 20:10:51 CET 2003
In article <mailman.1048094954.14424.python-list at python.org>, Tim Peters
<tim.one at comcast.net> writes
>I think because attempts to define what random means lead naturally to
>normality. If, e.g., pi is shown *not* to be normal, then it would be very
>hard to believe it's random in any reasonable sense. If it is shown to be
>absolutely normal, I expect reasonable people would disagree about whether
>that's strong enough to conclude it's random, though.
I assume absolute normality excludes the degenerate case where one
expresses the number in a base built from itself (in base pi, pi is just
"10"), or am I being more than usually stupid?
Also I suppose that being non-random implies finiteness (in some sense),
so are we just talking about 'symbol' count or information content? After
all, there are very small symbolic representations of pi, but are they
smaller in information content, etc etc.
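For concreteness (my addition, not something computed in the thread): a
real number is normal in base 10 if every digit 0-9 appears in its
expansion with limiting frequency 1/10 (and every length-k block with
frequency 1/10^k); "absolutely normal" means normal in every integer base
b >= 2. A quick tally of pi's leading digits, generated here with Gibbons'
unbounded spigot algorithm, looks consistent with base-10 normality,
though of course a finite prefix proves nothing either way:

```python
from collections import Counter

def pi_digits(n):
    """Yield the first n decimal digits of pi using Gibbons'
    unbounded spigot algorithm (exact integer arithmetic only)."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    produced = 0
    while produced < n:
        if 4 * q + r - t < m * t:
            # The next digit is now determined; emit it and rescale.
            yield m
            produced += 1
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            # Consume another term of the series to pin down the digit.
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x,
                                k + 1, (q * (7 * k + 2) + r * x) // (t * x), x + 2)

# Tally how often each digit occurs among the first 2000 digits;
# for a base-10 normal number each count would tend toward n/10.
counts = Counter(pi_digits(2000))
for d in range(10):
    print(d, counts[d])
```

Each digit comes out within a few percent of the expected 200, which is
the kind of statistical evidence people cite for pi's conjectured
normality; proving it is another matter entirely.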