random

David C. Ullrich ullrich at math.okstate.edu
Wed Jun 6 11:15:48 EDT 2001


On 5 Jun 2001 16:54:34 GMT, urpala at ampeeri.ee.tut.fi (Uoti Urpala)
wrote:

>In article <3b1cf00f.5441109 at nntp.sprynet.com>, David C. Ullrich wrote:
>>Indeed the exact value of Omega depends on the details of the
>>encodings. What I'm wondering today is how many of his conclusions
>>also depend on the details of the encoding.
>
>>Suppose I have a syntax where a valid program is precisely
>>a sequence of 0's terminated by a 1. Any such string is a
>>valid program. Does this count as "self-delimiting"? 
>
>That probably does count as self-delimiting, but it doesn't fill other
>requirements of the results. The encoding must allow "arbitrary binary
>data" to be included. 

I didn't bother mentioning the semantics before getting straight
whether the syntax counted as "self-delimiting". It is very
easy to design a Turing-complete programming language where
the syntactically valid programs are exactly the bitstrings
consisting of all 0's terminated by a 1. Assuming that
"arbitrary binary data" means only finitely much binary
data, that's no problem.

(It's not going to be an easy programming language to read or
write<g> but a syntax checker is easy...)
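Just to make the point concrete, here's the syntax checker, plus one guess at semantics that would make the language Turing-complete (the enumeration is hypothetical - any fixed effective enumeration of machines would do):

```python
def is_valid(program: str) -> bool:
    # A valid program is any run of 0's followed by a single terminating 1.
    return program.endswith("1") and set(program[:-1]) <= {"0"}

def decode(program: str) -> int:
    # One way to get Turing-completeness: read the number of 0's as an
    # index into some fixed enumeration of Turing machines (unary coding).
    # That's why the language is hopeless to write in: interesting machines
    # sit at astronomical indices.
    assert is_valid(program)
    return program.count("0")
```

So `is_valid("0001")` holds and `decode("0001")` names machine number 3 in whatever enumeration you fixed.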

>The proof basically says "you need n bits of
>program to calculate n bits of Omega". This is a strong result if the
>encoding is efficient, but the above encoding needs a lot of bits to
>calculate anything, so in that case the result doesn't say much about
>the associated Omega (and the same proof wouldn't work, since it
>depends on the ability to give an arbitrary program as an argument to
>another with limited overhead).

What I'm trying to track down is what does make the proof work.
In particular what I'm curious about is whether the requirements
that make the proof of the randomness work are really all that
natural, or whether they were just chosen to make the randomness
come out right in the end...

I can easily give an arbitrary program to another in the silly
0000....1 syntax, but not with limited overhead. (I think.)
Looking through more of Chaitin yesterday it seemed like the
bit about how H((X,Y)) <= H(X) + H(Y) + O(1) was probably
kind of important (the complexity of a program generating the
concatenation of two lists is bounded by the sum of the
complexities, more or less.)
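The reason self-delimiting matters for that inequality, as far as I can tell: if each program announces its own length, you can splice two programs together and a fixed decoder can tell where one ends and the next begins. A crude sketch (unary length headers - the header costs |p| extra bits rather than the O(1)-ish overhead of real prefix-free codes, but the splice-and-split idea is the same):

```python
def encode(p: str) -> str:
    # Crude prefix-free header: the length of p in unary 1's, then a 0,
    # then p itself.  No codeword is a prefix of another.
    return "1" * len(p) + "0" + p

def decode_one(s: str) -> tuple[str, str]:
    # Read the unary header, peel off one program, return (program, rest).
    n = s.index("0")
    return s[n + 1 : n + 1 + n], s[n + 1 + n :]
```

So `encode(px) + encode(py)` is decodable by a constant-size piece of glue, which is the kind of thing that makes "run both and combine the outputs" cost only H(X) + H(Y) + O(1).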

I should probably either just give up on extracting an
understanding of this or give up and dive into the actual
stuff, as has been suggested. Not really motivated enough
to learn LISP - also I don't care much about results that
are valid for one particular LISP encoding. You know a 
place to start that's going to be proving things based
on _assumptions_ about an arbitrary encoding (as opposed
to proving things about a specific encoding, where one
then has to dig in and figure out what aspects of the
encoding are relevant)?

>Proving the randomness of Omega basically works like this:
>You can create a program that, given the first n bits of Omega, solves
>the halting problem for all programs of size <= n. (Details below)
>If you can calculate the first m bits of Omega with a program of size
>p, you can add a bit of extra code to create a program of size p+e (e
>constant) that solves the halting problem for itself and then does the
>opposite (the standard contradiction for a program solving the halting
>problem for itself). So p+e must be greater than m to avoid a
>contradiction, and a program to calculate the first m bits of Omega
>must have size at least m-e. (You can generalize this to calculating
>_any_ bits of Omega in whatever position - if you can calculate some
>with a short program, you can hardcode the missing ones to get a full
>initial segment).
>
>Solving the halting problem for a program P of size s given the first
>s bits of Omega:
>Calculate lower bounds for Omega (as mentioned in other posts) until
>the first s bits agree with the known bits. Now you have either found
>that program P halts, or you know that it halting would raise the
>lower bound so much that it could no longer match the known bits, so P
>cannot halt.
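That lower-bound procedure is easy to sketch in a toy model - cheating, I just stipulate each program's halting time instead of actually dovetailing simulations, but the bookkeeping is the same:

```python
from fractions import Fraction

def omega_lower_bounds(programs, max_steps):
    """Yield increasing lower bounds for a toy Omega.

    `programs` maps prefix-free bitstrings to their halting time
    (None = never halts).  In reality halting times aren't known in
    advance -- you'd dovetail actual simulations -- but the rule is
    the same: whenever a program is seen to halt, add 2**-len(p).
    """
    bound = Fraction(0)
    halted = set()
    for t in range(1, max_steps + 1):
        for p, halt_time in programs.items():
            if p not in halted and halt_time is not None and halt_time <= t:
                halted.add(p)
                bound += Fraction(1, 2 ** len(p))
        yield bound
```

With the toy table {"0": 3, "10": None, "11": 1} the bounds climb to 3/4, which is this toy Omega exactly; and once the first bits of the bound match the known bits of Omega, any further halt would push the sum past them - hence the halting decisions.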
>
>Exercise: Show that the first bit of Omega whose value is independent
>of ZFC is a 0.

Obvious nonsense. No wait, not nonsense. Trivial. No, there's a
little error there... Is there a due date?

>Uoti Urpala



David C. Ullrich
*********************
"Sometimes you can have access violations all the 
time and the program still works." (Michael Caracena, 
comp.lang.pascal.delphi.misc 5/1/01)


