Re: [Python-ideas] Is there a good reason to use * for multiplication?
On Fri, Oct 12, 2012 at 10:40 PM, Blake Hyde wrote:
Is anything gained from this addition?
To give a practical answer, I could say that for newbies it's one small confusion that could be removed from the language. You and I have been programming for a long time so we take it for granted that * means multiplication, but for any other person that's just another weird idiosyncrasy that further alienates programming. Also, I think that using * for multiplication is ugly.
On Fri, Oct 12, 2012 at 4:37 PM, Ram Rachum wrote:
On Fri, Oct 12, 2012 at 10:34 PM, Mike Graham wrote:
On Fri, Oct 12, 2012 at 4:27 PM, Ram Rachum wrote:
Hi everybody,
Today a funny thought occurred to me. Ever since I learned to program when I was a child, I've taken for granted that when programming, the sign used for multiplication is *. But now that I think about it, why? Now that we have Unicode, why not use · ?
Do you think that we can make Python support · in addition to *?
I can think of a couple of problems, but none of them seem like deal-breakers:
- Backward compatibility: Python already uses *, but I don't see a backward compatibility problem with supporting · additionally. Let people use whichever they want, like spaces and tabs.
- Input methods: I personally use an IDE that could be easily set to automatically convert * to · where appropriate and to allow manual input of ·. People on Linux can type Alt-. . Anyone else can set up a script that'll let them type · using whichever keyboard combination they want. I admit this is pretty annoying, but since you can always use * if you want to, I figure that anyone who cares enough about using · instead of * (I bet that people in scientific computing would like that) would be willing to take the time to set it up.
What do you think?
Ram
Python should not expect characters that are hard for most people to type.
No one will be forced to type it. If you can't type it, use *.
Python should not expect characters that are still hard to display on many common platforms.
We allow people to have Unicode variable names, if they wish, don't we? So why not allow them to use a Unicode operator, if they wish, as a completely optional thing?
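[Editor's note: the asymmetry being debated here is real and easy to demonstrate. A minimal sketch in Python 3, which accepts Unicode identifiers per PEP 3131 but rejects · (U+00B7) as an operator:]

```python
# Python 3 allows Unicode identifiers (PEP 3131)...
π = 3.14159  # a Unicode variable name: perfectly legal
radius = 2
area = π * radius ** 2  # multiplication still spelled with *

# ...but the middle dot used as an operator is rejected by the tokenizer:
try:
    eval("2 · 3")
except SyntaxError:
    print("· is not a valid operator in Python")
```

So the proposal would require a tokenizer change, not merely an identifier-policy change.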
I think you'll find strong opposition to adding any non-ASCII characters or characters that don't occur on almost all keyboards as part of the language.
Mike
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
http://mail.python.org/mailman/listinfo/python-ideas
On Fri, Oct 12, 2012 at 4:45 PM, Ram Rachum wrote:
On Fri, Oct 12, 2012 at 10:40 PM, Blake Hyde wrote:
Is anything gained from this addition?
To give a practical answer, I could say that for newbies it's one small confusion that could be removed from the language. You and I have been programming for a long time so we take it for granted that * means multiplication, but for any other person that's just another weird idiosyncrasy that further alienates programming.
Also, I think that using * for multiplication is ugly.
You're emphatically not getting rid of *, though, which means 1) you're only making it harder for new people to learn and deal with, and 2) you're at best not eliminating any perceived ugliness, and in reality probably compounding it.
Mike
Ram Rachum wrote:
I could say that for newbies it's one small confusion that could be removed from the language. You and I have been programming for a long time so we take it for granted that * means multiplication, but for any other person that's just another weird idiosyncrasy that further alienates programming.
Do you have any evidence that a substantial number of beginners are confused by * for multiplication, or that they have trouble remembering what it means once they've been told? If you do, is there further evidence that they would find a dot to be any clearer?

The use of a raised dot to indicate multiplication of numbers is actually quite rare even in mathematics, and I would not expect anyone without a mathematical background to even be aware of it.

In primary school we're taught that 'x' means multiplication. Later when we come to algebra, we're taught not to use any symbol at all, just write things next to each other. A dot is only used in rare cases where there would otherwise be ambiguity -- and even then it's often preferred to parenthesise things instead.

And don't forget there's great potential for confusion with the decimal point.

-- Greg
On 10/13/2012 07:37 AM, Greg Ewing wrote:
Ram Rachum wrote:
I could say that for newbies it's one small confusion that could be removed from the language. You and I have been programming for a long time so we take it for granted that * means multiplication, but for any other person that's just another weird idiosyncrasy that further alienates programming.
Do you have any evidence that a substantial number of beginners are confused by * for multiplication, or that they have trouble remembering what it means once they've been told?
If you do, is there further evidence that they would find a dot to be any clearer?
The use of a raised dot to indicate multiplication of numbers is actually quite rare even in mathematics, and I would not expect anyone without a mathematical background to even be aware of it.
In primary school we're taught that 'x' means multiplication. Later when we come to algebra, we're taught not to use any symbol at all, just write things next to each other. A dot is only used in rare cases where there would otherwise be ambiguity -- and even then it's often preferred to parenthesise things instead.
And don't forget there's great potential for confusion with the decimal point.
I'm -1 on the whole idea. Also, why use · and not ×? I think Unicode in source code is a bad idea.
Ram Rachum writes:
On Fri, Oct 12, 2012 at 10:40 PM, Blake Hyde wrote:
Is anything gained from this addition?
To give a practical answer, I could say that for newbies it's one small confusion that could be removed from the language.
Get Microsoft to agree and implement it in Excel and you might have a point. But as long as Excel uses * for multiplication, I don't think anybody who uses computers is going to have trouble learning this. Anyway, Python believes in TOOWTDI ("the one old way to do it").[1]

Footnotes:
[1] With apologies to Tim Peters.
participants (5)
- Greg Ewing
- Mathias Panzenböck
- Mike Graham
- Ram Rachum
- Stephen J. Turnbull