[Tutor] Defining "bit" type -- why not '!' ?

Andre Engels andreengels at gmail.com
Thu Jan 29 10:26:32 CET 2009


On Thu, Jan 29, 2009 at 10:19 AM, spir <denis.spir at free.fr> wrote:

> Why not '!' for not, instead of '~'? I mean, '!' is used in logic, in many languages and even in python (!=). On the other hand, I had never encountered '~' meaning not.

Although ! is indeed common in computer languages, I disagree that it
is used in logic. There, to my knowledge, the standard is ¬, with
~ being used when one wants to remain within easily accessible
character sets.
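For context, a short sketch (my own example, not from the thread) of how
Python itself already assigns these operators: ~ is bitwise NOT on
integers, logical negation is spelled with the keyword not, and ! appears
only as part of !=.

```python
x = 0b1010  # 10

# Bitwise NOT on a Python int follows two's complement: ~x == -x - 1.
assert ~x == -11

# Logical negation uses the keyword `not`, not a symbol.
assert (not True) is False

# `!` occurs only inside the inequality operator `!=`.
assert x != 11

# Masking to a fixed width recovers the familiar "flipped bits" view.
assert (~x) & 0b1111 == 0b0101
```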


-- 
André Engels, andreengels at gmail.com
