random

David C. Ullrich ullrich at math.okstate.edu
Sat Jun 2 16:07:15 CEST 2001


On Sat, 2 Jun 2001 00:55:15 +0200, "Alex Martelli" <aleaxit at yahoo.com>
wrote:

>"David C. Ullrich" <ullrich at math.okstate.edu> wrote in message
>news:3b17ed86.19508127 at nntp.sprynet.com...
>    ...
>> >> A physical RNG can have the property that one cannot make
>> >> any prediction about the value of the next bit even given
>> >> _complete_ information about how the numbers are being
>> >
>> >"From impossibilia sequitur quodlibet".  *IF* you could
>> >have complete information, *THEN* you could predict all
>> >you want.  Problem is, you CANNOT have complete info due
>> >to that Heisenberg fellow, therefore you cannot predict
>> >(precisely).
>>
>> Huh? You cannot have complete information about a
>> _physical_ RNG, which is why you _can_ get "really
>> random" numbers from a physical RNG. But you certainly
>
>The quote above contains the premise "even given _complete_
>information".  Now you say "you cannot have complete
>information".  Well, if this latter statement is true, then
>the premise is always false, and from a false premise
>any conclusion follows, as Pseudo-Scotus proved and I
>recalled in his very words -- except I used 'from' instead
>of 'ab' as a hint.

Yeah, well, thanks for explaining how logic works.

I said you cannot have complete information about
a physical system. You can't. An algorithm is not
a physical system.
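The contrast can be made concrete. A minimal sketch (the generator and its constants below are illustrative, not any particular library's): once you know the complete state of an arithmetic generator, every future output is determined exactly.

```python
def lcg(state, a=1103515245, c=12345, m=2**31):
    """One step of a toy linear congruential generator.
    Returns (new_state, output_bit)."""
    state = (a * state + c) % m
    return state, state >> 30  # top bit as the "random" bit

# Two observers who share the same state predict identical streams --
# complete information about the algorithm's state means perfect prediction.
s1 = s2 = 42
bits1, bits2 = [], []
for _ in range(20):
    s1, b = lcg(s1)
    bits1.append(b)
for _ in range(20):
    s2, b = lcg(s2)
    bits2.append(b)
assert bits1 == bits2
```

No physical measurement is involved; the state is a finite integer, and knowing it is all it takes.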

>So, to recap: the post I was replying to states that given
>[what you now state is] an impossibility, 

No. I never stated that it was impossible to have
complete information about the state of an algorithm.

>yet a conclusion
>[predictability of next bit] does NOT follow.  Pseudo-Scotus
>says that given an impossibility ANYTHING follows (you
>can prove any theorem and its contrary if your set of
>axioms include a contradiction).
>
>
>> can have complete information regarding the state
>> of an _arithmetic_ _algorithm_.
>
>I assume that is your definition of "an algorithm"? 

It's not the _definition_ (and to the extent that
it is part of a definition, it's not _my_ definition).

> That
>its state be finite?  While you are absolutely certain that
>NO physical system can be described in finite terms,
>EVER?  Wow, last time I met anybody so sure of things,
>it was a priest:-).

Take it up with Heisenberg. If it's possible to have
complete information about a physical system then
physics is simply _wrong_.

>> > From what I know of Chaitin's life work,
>> >it started with just the idea of defining randomness in
>> >terms of minimal amount of information (in bits) you need
>> >for prediction.  If YOUR definition of 'randomness' is
>> >something that needs INFINITE number of bits to predict,
>> >go ahead and have fun, but you're unlikely to get some
>> >constructive results out of it.
>>
>> It seems fairly likely that there's a point here, but
>> I don't see what it is. Given an arithmetic algorithm
>> it does _not_ take an infinite number of bits to do
>> any predicting.
>
>Agreed (given a definition of algorithm that requires it
>to be finite, this is a tautology).  So, the amount of
>randomness (which is a finite amount of randomness)
>is described as the number of bits (a finite one) needed
>for said prediction.  (Over-simplifying a bit here:-).
>
>If you think interesting and fruitful mathematical theories
>can be constructed around measures of randomness which
>only consider infinities, go right ahead and don't let the
>mathematicians scoffing at you put you off -- if Feynman
>could simplify away infinities and come up with stuff,
>why not you, after all?

I really don't have any idea what you're getting at here.
I never said anything about what's fruitful or interesting
or useful. JVN said anyone using arithmetic methods to
get random numbers is in a state of sin. It was stated
that Chaitin has shown that this is wrong. I didn't see
how, but I'm not familiar with his work, so I asked 
how Chaitin's work showed that JVN was wrong. So far
I haven't heard the explanation.

Probably "is in a state of sin" is not well-defined.
I take the comment to mean that there is no "perfect"
arithmetical RNG. It seems extremely clear to me that
there isn't, and I haven't heard anything that explains
how Chaitin's work _does_ show that there exists
a perfect arithmetical RNG.

I didn't claim that his work was irrelevant. I didn't
claim it had no practical importance. I didn't claim
that it did not lead to an RNG that's perfect for all
practical purposes.

In fact I didn't claim _anything_ except that I didn't
see how there can be a _perfect_ arithmetic RNG,
Chaitin or no Chaitin. That's a _true_ statement
about what I don't see.

>Meanwhile, some of us working with finite machines,
>communication channels of finite capacity, and so on,
>are very interested in the currently accepted, operational
>definitions of randomness in finite terms given by Chaitin,
>Solovay, and so on.  It's about mathematics rather than
>physics, and rather 'solid' maths in a sense -- it seems
>Chaitin's books are chock full of working Lisp code that
>exemplifies what he's saying.

Indeed. Hanging out on sci.math, the things that
people said about Chaitin's work made him sound like
a total crackpot. I was relieved when I saw a little
bit of his stuff once - the ridiculous aspects of
what people had said about what he'd done were 
not due to him, they were due to people misquoting
and oversimplifying his stuff. (Not referring to
anything you've said here. Things like statements
about measuring the complexity of an arithmetic
function in some _absolute_ sense, without first
saying what formal system (or computer language)
is being used to specify the algorithm. Great
relief to see Chaitin _beginning_ by saying
of course all his statements about computational
complexity must be relative to a specific language,
then _defining_ that language _specifically_
before moving on.)
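That language-relativity can be illustrated crudely. The sketch below is NOT Chaitin's actual measure, just a stand-in: a compressor's output length is an upper bound on "description length" relative to that compressor, a regular string gets a short description while patternless bytes get essentially none, and different compressors (different "languages") assign the same string different lengths.

```python
import bz2
import os
import zlib

rep = b"ab" * 5000       # highly regular: a tiny program prints it
rnd = os.urandom(10000)  # no visible structure: essentially incompressible

# Regular data compresses to a fraction of its size; random data doesn't.
print(len(zlib.compress(rep)), len(zlib.compress(rnd)))

# And the number is relative to the "language" (here, the compressor):
print(len(zlib.compress(rep)), len(bz2.compress(rep)))
```

The absolute numbers mean nothing on their own; only "length relative to this fixed scheme" does, which is exactly the point about specifying the formal system first.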

>> >A physical system that is macroscopic enough to escape
>> >Heisenbergian conundrums
>>
>> There is no such system.
>
>In this case, no physical computer is predictable, and all
>we study about algorithms are abstractions 

Well of _course_ algorithms are abstractions.

>-- since a
>computer system IS a physical system, then ALL bits
>coming out from it must be random in your sense of
>the word. 

I've never defined the word. Yes, of course the bits
you get out of a computer are always _somewhat_
random.

> Defining things in terms of infinities, absolutes
>and utter unshakable certainties does tend to have this
>risk -- discussion is trivialized rather than concrete and
>fruitful.  But, hey, if that's what floats your boat...

"Fruitful" in what sense? Even for someone who cares
about nothing but what he can actually do with a machine
knowing abstract things about what can and cannot
be done is useful. If something is provably impossible
then that can save people time trying to do it, for
example.

>If, on the other hand, you maintain that the output
>from a physical system CAN be predictable if that
>system is a computer running your program,

Assuming that we're taking "predictable" in an
absolute sense, why in the world would I maintain
anything so utterly ridiculous as insisting that
the output of an actual physical computer is
predictable? I've been talking about _algorithms_.
A machine cannot implement an algorithm perfectly.

>then
>what makes you SO certain that NO other physical
>systems can also be predictable in the same sense?-)

Well, if I'd claimed that a physical computer was
predictable you'd have a point. 

But claiming that no physical system is predictable
_except_ for computers is (i) not something I said,
(ii) so obviously ridiculous that I don't see why you
would think that's what I was claiming from
anything I _have_ said.

>> > could also be describable in
>> >a finite number of bits, in terms of enabling somebody
>> >to predict the 'next outcome'.  That was, if I'm not
>> >mistaken, Laplace's take on probability (and not his
>> >only), all the way up to Einstein's in this century --
>> >that it's all about hidden variables, information that
>> >we do not happen to have, rather than information that
>> >intrinsically CANNOT be had.
>>
>> I continue to miss your point. See, Einstein was
>> wrong about quantum mechanics (or so the people best
>> qualified to know believe). I forget whose theorem
>> it is (Von Neumann's, I think maybe!) that you
>> _cannot_ describe "classical quantum mechanics" in
>> terms of hidden variables.
>
>A couple centuries ago all scientists were certain of
>one thing, now they're all becoming certain of another,
>and I won't be around in a couple more centuries to
>laugh at the likely-different certainties they'll have
>then.  Meanwhile, mathematics is far from exempt
>from paradigm shifts, of course, but those never do
>'contradict' the previous certainties (META-maths is
>different that way of course -- cf. Russell's well
>known demolition of Frege's lifework:-), rather
>extend and generalize on them.
>
>I'm currently more interested in what a mathematician
>like Chaitin has to say about randomness, particularly
>of the finite kind which I can handle in a computer (if
>and only if it's a physically predictable system:-) than
>in what physicists or other philosophers have to say
>about randomness, particularly if that requires dabbling
>with infinities.  Being an engineer myself, I do place a
>premium on USEFUL theories, after all.

Right. But I didn't actually ask what you were interested
in. I was puzzled how Chaitin's work had shown that
JVN was wrong - I don't see how the question of what
you in particular are or are not interested in can
possibly have any bearing on this.

>> >Measuring randomness in terms of amount of information
>> >seems like it works nicely.
>>
>> I'm certain it does. I don't see how saying that
>> says anything about VN being wrong with his comment
>> about how arithmetic RNG's cannot be perfect.
>
>If by "perfect" randomness VN meant an infinite amount
>thereof (by Chaitin measure), he might have said that
>(were it not for the little detail that Chaitin was probably
>not even in primary school back then, I guess).  The
>quote talks generically about "random", without using
>the word "perfect" you now insert, and without any
>specification of infiniteness as a prerequisite, and, as
>such and without the needed qualifications, it is not
>correct.  _with_ the qualifications you want to place,
>it seems to become tautological ("I define X as being
>something that requires infinities: no finite system can
>reach X, it can only approach it")

Another thing that's been puzzling me is where I've
said anything about these infinities that you continue
to quote me about.

> and therefore sterile
>and uninteresting,

Sigh. Totally useless abstractions with no possible
practical importance can be of great practical importance.
An example (consisting of two examples):

(i) "Perfect chess is trivial." That's a mathematical
fact: Chess is a finite tree - to figure out how to
play perfect chess you start at the bottom of the
tree and mark each node as a win for White, a win
for Black or a draw, depending on the value of the
subnodes.

Now, that's silly, because of course it would take
much longer than the age of the universe to do this.

But it's not so silly, because the fact that perfect
chess is trivial in this abstract sense means that
it's theoretically possible to build a machine to
play perfect chess, although very slowly, and so
it follows that it's not totally pointless to try
to build a machine that will play fairly good chess
at a reasonable speed. As opposed to:

(ii) There is no algorithm that will determine whether
or not a "statement" in some formal language is a
theorem.

That bit of abstraction _does_ indicate that it is
a waste of time to try to build a machine that
will determine whether a given statement is a
theorem (and which will always return the right
answer).


My point being that knowledge about useless
abstractions can be useful. It's the useless
abstractions, what is and what is not
possible, that I've been meaning to ask about
here.
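The game-tree argument in (i) can be sketched on a toy game, since the full chess tree is out of reach: a one-pile take-away game stands in for chess, and the "mark each node from the bottom up" idea becomes a few lines of backward induction.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(n):
    """True if the player to move wins with n stones left, under
    perfect play.  Each move removes 1 or 2 stones; whoever takes
    the last stone wins."""
    if n == 0:
        return False  # no move: the previous player took the last stone
    # A position is winning iff some move leaves the opponent losing --
    # this is exactly the bottom-up marking of the (finite) game tree.
    return any(not wins(n - take) for take in (1, 2) if take <= n)

# Perfect play solved: positions divisible by 3 are lost for the mover.
print([n for n in range(10) if not wins(n)])  # → [0, 3, 6, 9]
```

The same induction works in principle for chess; only the size of the tree, not the logic, is what makes chess hard.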

> unless there are hidden layers of
>exoteric meaning nesting in it, I guess.

Um. When he says that anyone doing something is
"in a state of sin", it's silly to assume
that he's said something with a precisely defined
meaning; it's clear that some sort of interpretation
is required.

>Alex



David C. Ullrich
*********************
"Sometimes you can have access violations all the 
time and the program still works." (Michael Caracena, 
comp.lang.pascal.delphi.misc 5/1/01)


