[Tutor] Defining "bit" type -- why not '!' ?

Alan Gauld alan.gauld at btinternet.com
Thu Jan 29 10:47:22 CET 2009


"Andre Engels" <andreengels at gmail.com> wrote

>> Why not '!' for not, instead of '~'? I mean, '!' is used in logic,
>> in many languages and even in python (!=). On the other hand,
>> I had never encountered '~' meaning not.
>
> Although ! is indeed usual in computer languages, I disagree when you
> say it is used in logic. There, to my knowledge, the standard is ¬,
> with ~ being used if one wants to remain within easily-accessible
> character sets.

And both statements are correct. The symbols trace directly back
to C, where ! meant logical not and ~ meant bitwise not. The latter
symbol was chosen because it was sometimes used in maths for
not... (Why they chose ! for logical not I don't know!)

Python (and many other languages since) simply passes on its C heritage.
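
For anyone following along, here is a quick sketch of how that
heritage shows up at the Python prompt (plain standard behaviour,
using made-up values just for illustration):

    x = 5

    # Logical not: Python spells it with the keyword 'not', not '!'.
    print(not x)              # False -- any non-zero int is "truthy"

    # Bitwise not: '~' flips every bit, so on Python ints ~x == -(x + 1).
    print(~x)                 # -6
    print(~0b0101 & 0b1111)   # 10, i.e. 0b1010 once masked to 4 bits

    # '!' survives only inside '!=' (not equal):
    print(x != 4)             # True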


-- 
Alan Gauld
Author of the Learn to Program web site
http://www.freenetpages.co.uk/hp/alan.gauld 

