[Tutor] Defining "bit" type -- why not '!' ?

A.T.Hofkamp a.t.hofkamp at tue.nl
Thu Jan 29 10:45:12 CET 2009


spir wrote:
> On Tue, 27 Jan 2009 11:26:06 -0200,
> Ricardo Aráoz <ricaraoz at gmail.com> wrote:
>> Operation   Result                             Notes
>> x | y       bitwise or of x and y
>> x ^ y       bitwise exclusive or of x and y
>> x & y       bitwise and of x and y
>> x << n      x shifted left by n bits           (1), (2)
>> x >> n      x shifted right by n bits          (1), (3)
>> ~x          the bits of x inverted
> 
> Why not '!' for not, instead of '~'? I mean, '!' is used in logic, in many
> languages and even in Python (!=). On the other hand, I have never encountered
> '~' meaning not.

Watch out here: the operations above work on integer values, not on single
bits. In that context, 'not' and '~' are two different operations.

'not v' inverts the truth value of v (that is, it computes 'not (v != 0)').
'~v', on the other hand, flips every bit of the integer value; since Python
integers behave as two's-complement numbers, ~v equals -v - 1.

print ~1    # gives '-2' as result
print not 1 # gives 'False' as result
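
To make the difference concrete, here is a small sketch (in Python 3 syntax,
unlike the Python 2 prints above) that checks both identities for a few values:

```python
# Bitwise inversion (~) vs. logical negation (not) on integers.
# Python integers use two's-complement semantics for bitwise
# operators, so ~x == -x - 1 for every integer x.
for x in (0, 1, 5, -3):
    print(x, ~x, not x)
    assert ~x == -x - 1           # bitwise: every bit flipped
    assert (not x) == (x == 0)    # logical: only truthiness matters
```

So '~' is only a logical 'not' when you restrict yourself to single bits
(0 and 1 stored as such), which is not what Python's int type gives you.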


Sincerely,
Albert


