how can I convert invalid ASCII string to Unicode?

Tim Peters
Wed May 9 04:09:54 EDT 2001

> otoh-7-bit-ascii-was-good-enough-for-god-to-write-the-bible-ly y'rs

> surely he wrote that before 1968?
> are you sure he didn't use 5-bit baudot encoding (iirc, the
> seventh bit used to be sanctified, before that hollerith guy
> messed things up)

God isn't limited by your Swedish notion of time, /F!  1968 is just another
integer to Him, and since it's outside the range of the 7-bit ASCII ordinals
He invented, not even an interesting one.  Indeed, Kronecker amended his
famous "God created the integers, all else is the work of man" on his
deathbed, to "God created the integers, starting at 0 up to but not including
128, all else is the work of ...".  He died before completing it.  Historians
still debate whether he would have amended "man" to "standards committees,
among other servants of Satan".

    "range(128)"-error-msg-ly y'rs  - tim
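[The sign-off alludes to Python's real decode error, which answers the subject line's question. A minimal sketch in modern Python 3, using a hypothetical Latin-1 byte string as the "invalid ASCII" input: decoding it as ASCII raises the error whose message ends in "ordinal not in range(128)", and the fix is to decode with the bytes' actual encoding, or to handle stray bytes explicitly.]

```python
raw = b"caf\xe9"  # hypothetical sample: "café" encoded as Latin-1, not valid ASCII

# Decoding as ASCII fails, producing the error message the sign-off jokes about.
try:
    raw.decode("ascii")
except UnicodeDecodeError as exc:
    print(exc)  # ... ordinal not in range(128)

# Fix 1: name the encoding the bytes are actually in.
text = raw.decode("latin-1")
print(text)  # café

# Fix 2: decode as ASCII but replace undecodable bytes with U+FFFD.
lossy = raw.decode("ascii", errors="replace")
print(lossy)  # caf�
```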
