[Python-ideas] Is there a good reason to use * for multiplication?

Bruce Leban bruce at leapyear.org
Sat Oct 13 06:20:30 CEST 2012


Well, I learned x as a multiplication symbol long before I learned either ·
or *, and in many fonts you can barely see the middle dot. Is there a good
reason we can't just write foo x bar instead of foo * bar? If that's
confusing, we could use × instead. No one would ever confuse × and x.

Or for that matter how about (~R∊R∘.×R)/R←1↓⍳R

Seriously: learning that * means multiplication is a very small thing. You
also need to learn what /, // and % do, and the difference between 'and'
and &, and between =, ==, != and /=.
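
For example, here's how those differ in practice (a quick interpreter
sketch, assuming Python 3 semantics, where / is true division):

    >>> 7 / 2            # true division
    3.5
    >>> 7 // 2           # floor division
    3
    >>> 7 % 2            # remainder
    1
    >>> 1 and 2          # boolean 'and' returns an operand (here: 2)
    2
    >>> 1 & 2            # bitwise AND of the integer bits
    0
    >>> x = 3            # '=' binds a name
    >>> x == 3, x != 4   # '==' and '!=' compare
    (True, True)
    >>> x /= 2           # '/=' is augmented (in-place) division
    >>> x
    1.5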

--- Bruce



On Fri, Oct 12, 2012 at 7:41 PM, Steven D'Aprano <steve at pearwood.info> wrote:

> On 13/10/12 07:27, Ram Rachum wrote:
>
>> Hi everybody,
>>
>> Today a funny thought occurred to me. Ever since I've learned to program
>> when I was a child, I've taken for granted that when programming, the sign
>> used for multiplication is *. But now that I think about it, why? Now that
>> we have Unicode, why not use · ?
>>
> 25 or so years ago, I used to do some programming in Apple's Hypertalk
> language, which accepted ÷ in place of / for division. The use of two
> symbols for the same operation didn't cause any problem for users. But then
> Apple had the advantage that there was a single, system-wide, highly
> discoverable way of typing non-ASCII characters at the keyboard, and Apple
> users tended to pride themselves on using them.
>
> I'm not entirely sure about MIDDLE DOT though: especially in small font
> sizes, it falls foul of the design principle:
>
> "syntax should not look like a speck of dust on Tim's monitor"
>
> (paraphrasing... can anyone locate the original quote?)
>
> and may be too easily confused with FULL STOP. Another problem is that
> MIDDLE DOT is currently valid in identifiers, so that a·b would count as
> a single name. Fixing this would require some fairly heavy lifting (a
> period of deprecation and warnings for any identifier using MIDDLE DOT)
> before introducing it as an operator. So that's a lot of effort for very
> little gain.
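
(To illustrate that point: in a current Python 3 interpreter, MIDDLE DOT is
accepted as an identifier-continuation character under the Unicode identifier
rules Python follows, so a·b really does tokenize as one name, e.g.:)

    >>> import unicodedata
    >>> unicodedata.name('\u00b7')
    'MIDDLE DOT'
    >>> 'a·b'.isidentifier()   # one identifier, not 'a' times 'b'
    True
    >>> a·b = 42               # legal today: binds the single name a·b
    >>> a·b
    42
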
>
> If I were designing a language from scratch today, with full Unicode
> support from the beginning, I would support a rich set of operators
> possibly even including MIDDLE DOT and × MULTIPLICATION SIGN, and leave it
> up to the user to use them wisely or not at all. But I don't think it would
> be appropriate for Python to add them, at least not before Python 4: too
> much effort for too little gain. Maybe in another ten years people will be
> less resistant to Unicode operators.
>
>
>
> [...]
>
>  ·. People on Linux can type Alt-. .
>>
>
> For what it is worth, I'm using Linux and that does not work for me. I am
> yet to find a decent method of entering non-ASCII characters.
>
>
>
> --
> Steven
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>

