Obsolescence of <> (fwd)
Nick Perkins
nperkins7 at home.com
Wed Jun 6 05:33:42 CEST 2001
"Jonathan Gardner" <gardner at cardomain.com> wrote in message news:9fjgjl$rp0
> But, in the end we are limited by ASCII. Hopefully, when everyone uses
> unicode editors, and when they figure out how to get their keyboard to
> work properly, we can adopt a more mathematical terminology, and include
> several more operators that aren't in ASCII (like the dot for dot
> product). In the end, shouldn't we try to make what is written in the
> program as legible to mathematicians as possible? I mean, the closer the
> language stays to math, the more logical it will be.
Speaking of mathematical notation...
I have been thinking for years that mathematicians should start to use
notations that are more writable (or writable at all) in ASCII. Computer
languages can express exactly the same terms that mathematicians would
write with their seemingly endless special characters: Greek letters,
subscripts, sub-subscripts, things written over and under other things,
and so on.
Is there in fact any standard way of converting something from
'mathematical notation' to something in ASCII? I guess there is one for
every computer language, but is there one for human consumption? When two
mathematicians exchange e-mail, how do they write a formula that would
otherwise use such 'mathematical notation'?
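To make the question concrete, here is one way the dot product mentioned above, written mathematically with a raised dot and subscripts (sum over i of x_i * y_i), might be spelled out in plain ASCII Python. The function name `dot` is just my own choice for illustration, not any standard:

```python
# ASCII rendering of the dot product: sum over i of x[i] * y[i].
# In mathematical notation this uses a raised dot and subscripts;
# in Python it is an ordinary function over two sequences.
def dot(x, y):
    """Dot product of two equal-length sequences."""
    return sum(a * b for a, b in zip(x, y))

print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```

Everything here is typeable on any keyboard, at the cost of being wordier than the one-symbol mathematical original.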
More information about the Python-list mailing list