[Python-ideas] Making stars optional? (was: Making colons optional?)

Steven D'Aprano steve at pearwood.info
Thu Feb 5 23:03:42 CET 2009


Bruce Leban wrote:
> In algebra, you don't have to put a multiplication sign in between two 
> quantities that you want to multiply. I've seen beginning programmers 
> write things like
> 
>     x = 3a + 4(b-c)
> 
> instead of
> 
>     x = 3*a + 4*(b-c)
> 
> Why should we require the stars when it's unambiguous what the first 
> statement means?

Because it's never unambiguous.

If I write x = 3a, does that mean I've accidentally left out the + 
sign I intended, or that I meant a multiplication?

Worse, if I write x = a3, is that an assignment of a*3 or of a variable 
named "a3"? The same goes for x = ab. The tradition in mathematics is to 
use one-letter variable names, so a variable called "ab" is so rare as 
to be virtually non-existent, but that obviously doesn't hold for 
programming.
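
As a concrete illustration (just a quick sketch in today's Python; the 
names a, b and ab here are arbitrary examples), the interpreter already 
treats these spellings in ways that implicit multiplication would have 
to break:

    a, b, ab = 2, 5, 99

    x = ab            # binds 99: "ab" is one name, not a*b (which is 10)
    print(x)          # 99

    try:
        compile("x = 3a", "<test>", "exec")
    except SyntaxError as err:
        print("x = 3a is rejected:", err.msg)  # not quietly a multiplication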

Mathematicians get away with this sort of ambiguity because they are 
writing for other mathematicians, not for a computer. And because 
mathematical proofs rely on a sequence of equations rather than a 
single statement, the ambiguity can usually be resolved from context:

y = a(b+c) - ac   # is this a+(b+c), a*(b+c), or something else?
y = ab + ac - ac  # the expansion shows it must have been a*(b+c)
y = ab
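
And Python makes this worse than mathematics does, because a(b+c) is 
already legal syntax meaning a call. A minimal sketch (the function a 
here is just an illustrative stand-in):

    def a(x):
        return x + 1

    b, c = 3, 4
    print(a(b + c))    # 8 -- this is a call: a applied to (b+c)
    # If juxtaposition meant multiplication, a(b+c) would be ambiguous
    # between the call above and a * (b + c).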



-- 
Steven



