Martin v. Loewis
Fri, 7 Jul 2000 22:24:03 +0200
>> This sounds very bad. I thought we agreed that attempting to
>> compare (or add) a Unicode string and an 8-bit string containing
>> non-ASCII characters (as in your example) should raise an exception.
>Only if the default encoding is ASCII.
That is not my understanding of the agreement, and I think the whole
idea of a default encoding is a stupid one. I strongly suggest that
the behaviour that Ping just explained and that Guido had suggested
originally be implemented: mixing Unicode and string objects is only
allowed if the string objects are pure ASCII. Everything else (including a
"default" encoding of some kind) is an application issue.
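The proposed rule can be sketched with a small helper (hypothetical
function name, not actual interpreter code): an 8-bit string may be
promoted to Unicode only if every byte is ASCII; anything else raises.

```python
def coerce_to_unicode(s):
    """Promote an 8-bit string to Unicode, but only if it is pure ASCII."""
    try:
        return s.decode("ascii")
    except UnicodeDecodeError:
        # Non-ASCII bytes have no well-defined meaning without an
        # encoding, which is the application's business, not ours.
        raise ValueError("cannot mix unicode and non-ASCII string: %r" % (s,))

print(coerce_to_unicode(b"hello") == u"hello")  # ASCII mixes fine: True
try:
    coerce_to_unicode(b"\xe9")  # a non-ASCII byte is rejected
except ValueError as exc:
    print("rejected:", exc)
```

Under this rule, comparison and concatenation of mixed operands are
defined purely in terms of ASCII promotion, with no configurable state.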
Otherwise, you'll just go on fielding endless questions about how things
should behave, and changing the interpreter back and forth.
The meaning of mixing both kinds of objects is then very easy to explain,
and it also allows an efficient implementation - which you won't get if
you have to invoke a character conversion just to compute the hash value
of a Unicode object (!)
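The efficiency point can be illustrated with a toy hash (hypothetical,
not CPython's actual algorithm): when only ASCII mixing is allowed, the
same hash can be computed directly over byte values or code points, so
hashing a Unicode object needs no character-set conversion at all.

```python
def simple_hash(ints):
    # Toy multiplicative hash over a sequence of integers, masked to
    # 32 bits; stands in for whatever hash the interpreter uses.
    h = 0
    for c in ints:
        h = (h * 1000003 + c) & 0xFFFFFFFF
    return h

def hash_bytes(b):
    return simple_hash(b)            # iterate raw byte values

def hash_text(s):
    return simple_hash(map(ord, s))  # iterate code points directly

# For ASCII data, byte values and code points coincide, so equal
# strings hash equal with no conversion step.
assert hash_bytes(b"spam") == hash_text(u"spam")
```

With a mutable default encoding, by contrast, the interpreter would
have to decode the 8-bit string (or encode the Unicode one) before the
two hashes could be made to agree.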