David C. Ullrich
ullrich at math.okstate.edu
Fri Jun 1 21:40:53 CEST 2001
On Fri, 1 Jun 2001 18:16:19 +0200, "Alex Martelli" <aleaxit at yahoo.com> wrote:
>"David C. Ullrich" <ullrich at math.okstate.edu> wrote in message
>news:3b17a2f1.411766 at nntp.sprynet.com...
>> Could you give us a hint regarding exactly what work
>> of Chaitin's shows we can get "truly random" numbers from
>> arithmetic algorithms?
>I think his latest book, "Exploring Randomness" published
>by Springer-Verlag, is probably 'it'... gotta order it one
>of these days (it's been out a month and I yet haven't...
>sigh, my sloth's getting worse... to my excuse, the UK
>branch of Amazon claims it's not there yet, and Chaitin's
>page mentions a Springer-Verlag special offer on all 3
>of his books but Springer-Verlag seem to never have heard
>of it, and...).
>> A physical RNG can have the property that one cannot make
>> any prediction about the value of the next bit even given
>> _complete_ information about how the numbers are being
>> generated.
>"From impossibilia sequitur quodlibet". *IF* you could
>have complete information, *THEN* you could predict all
>you want. Problem is, you CANNOT have complete info due
>to that Heisenberg fellow, therefore you cannot predict.
Huh? You cannot have complete information about a
_physical_ RNG, which is why you _can_ get "really
random" numbers from a physical RNG. But you certainly
can have complete information regarding the state
of an _arithmetic_ _algorithm_.
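To make that concrete, here's a minimal sketch (a toy linear
congruential generator of my own choosing, not anyone's production
RNG): the algorithm's "complete information" is a single integer of
state, and anyone who learns that integer predicts the rest of the
stream exactly.

```python
def lcg(state):
    # Toy linear congruential generator; constants are the familiar
    # Numerical Recipes pair, but any full-period choice works.
    return (1664525 * state + 1013904223) % 2**32

# Generate a short stream of "random" numbers.
state = 42
stream = []
for _ in range(5):
    state = lcg(state)
    stream.append(state)

# An observer who learns the state after the first output
# reproduces the remainder of the stream perfectly.
observer_state = stream[0]
predicted = []
for _ in range(4):
    observer_state = lcg(observer_state)
    predicted.append(observer_state)

assert predicted == stream[1:]
```

That's the whole point: for an arithmetic algorithm, complete
information about the state is a finite, obtainable thing.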
> From what I know of Chaitin's life work,
>it started with just the idea of defining randomness in
>terms of minimal amount of information (in bits) you need
>for prediction. If YOUR definition of 'randomness' is
>something that needs INFINITE number of bits to predict,
>go ahead and have fun, but you're unlikely to get some
>constructive results out of it.
It seems fairly likely that there's a point here, but
I don't see what it is. Given an arithmetic algorithm
it does _not_ take an infinite number of bits to do
the prediction: the algorithm's state is finite, and
that finite state determines every future output.
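For instance (a sketch using Python's standard library, my
illustration rather than anything from the thread): the generator
behind Python's `random` module carries its entire state in one
finite tuple, and getstate()/setstate() let you replay its whole
future from that snapshot.

```python
import random

# A seeded generator; its complete state is one finite tuple.
rng = random.Random(1234)
snapshot = rng.getstate()          # finitely many bits, fully capturing rng

original = [rng.random() for _ in range(10)]

# Hand the snapshot to a second generator: the "complete information".
clone = random.Random()
clone.setstate(snapshot)
replay = [clone.random() for _ in range(10)]

assert replay == original          # the entire future was predictable
```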
> Chaitin, Martin-Loef,
>and Solovay, do have various definitions that ARE of some
>use, and apparently they all turn out to be equivalent
>(and "Exploring Randomness" includes, inter alia, proofs
>of the equivalence that a layman DOES have a chance to
>follow -- I'm told).
>A physical system that is macroscopic enough to escape
There is no such system.
> could also be describable in
>a finite number of bits, in terms of enabling somebody
>to predict the 'next outcome'. That was, if I'm not
>mistaken, Laplace's take on probability (and not only
>his), all the way up to Einstein's in this century --
>that it's all about hidden variables, information that
>we do not happen to have, rather than information that
>intrinsically CANNOT be had.
I continue to miss your point. See, Einstein was
wrong about quantum mechanics (or so the people best
qualified to know believe). I forget whose theorem
it is (Von Neumann's, I think) that you
_cannot_ describe "classical quantum mechanics" in
terms of hidden variables.
>Measuring randomness in terms of amount of information
>seems like it works nicely.
I'm certain it does. I don't see how saying that
says anything about von Neumann being wrong in his
comment that arithmetic RNGs cannot be perfect.
> More when I _have_ managed
>to get and study "Exploring Randomness", if it's not
>too far over my head...!
David C. Ullrich
"Sometimes you can have access violations all the
time and the program still works." (Michael Caracena,