[Tutor] Defining "bit" type -- why not '!' ?
lie.1296 at gmail.com
Fri Jan 30 07:09:08 CET 2009
On Thu, 29 Jan 2009 12:22:29 +0000, Alan Gauld wrote:
> "spir" <denis.spir at free.fr> wrote
>> Here is an overview and an attempt to present my view on this topic.
>> * The mix of "extended logic" on non-logical types and treating them
>> as bit sequences provokes a kind of conceptual collision.
>> * As a solution, bitwise operations might apply only to a type (byte or
>> bit sequence) on which "extended logic" raises a TypeError.
> You are probably correct but it would break a ton of working code!
> Especially when you recall that bool types were only introduced
> relatively recently so a lot of old code relies on the fact that
> True/False were until then literally 1/0.
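The point above can be checked directly. A minimal sketch (behaviour as in CPython since Python 2.3, where PEP 285 made bool a subclass of int), showing why old code that treated True/False as 1/0 still works:

```python
# bool is a subclass of int, so booleans behave as the
# integers 1 and 0 in arithmetic and bitwise contexts.
print(isinstance(True, int))   # True: bool subclasses int
print(True + True)             # 2 -- arithmetic treats True as 1
print(True & False)            # False -- & on two bools stays a bool
print(~True)                   # -2 -- ~ falls back to int semantics
```

Note the last line: `~True` does not give `False` but `-2`, which is exactly the "conceptual collision" between logical negation and bitwise inversion that started this thread.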
> Maybe it is "improved" in Python 3. As you say the ~ operator makes most
> sense if applied to bytes only. But for compatibility reasons I suspect
> they have kept it as-is...
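For what it's worth, a quick check of what Python 3 actually does (a sketch, reflecting standard CPython semantics): `~` still applies to ints, and it has never been defined for the bytes type, so applying it there raises a TypeError rather than inverting the bits.

```python
print(~10)            # -11: unary ~ on an int
try:
    ~b'\x0a'          # ~ is not defined for bytes objects
except TypeError:
    print("~ is not defined for bytes")
```

So the operator was kept on integers as-is; a bytes-only `~` would have required new behaviour, not just a restriction.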
I think it's an unnecessary complication. In programming we have to
remember that the decimal numbers we see are really just a user-friendly
view of the bit patterns used in the CPU. Those bit patterns are what
arithmetic and bitwise operations act on (logical operations on truth
values are a much higher-level concept; AFAIK no CPU implements them
directly in hardware, only bitwise instructions).
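To illustrate the point about decimal being just a view of the bit pattern, here is a small sketch: the same integer can be written in binary, and the bitwise operators work directly on that pattern.

```python
x = 0b1010              # the same value as decimal 10
print(bin(x))           # '0b1010' -- another view of the same bits
print(bin(x & 0b0110))  # '0b10'   -- & masks bits
print(bin(x | 0b0001))  # '0b1011' -- | sets the low bit
print(~x)               # -11: Python ints are unbounded,
                        # so ~x is defined as -x - 1 (two's complement)
```

The last line shows why `~` is awkward on Python ints: with no fixed width there is no finite bit pattern to flip, so the result is defined arithmetically.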