Time we switched to unicode? (was Explanation of this Python language feature?)

Antoon Pardon antoon.pardon at rece.vub.ac.be
Tue Mar 25 11:38:38 CET 2014


On 25-03-14 10:54, Chris Angelico wrote:
> On Tue, Mar 25, 2014 at 8:43 PM, Antoon Pardon
> <antoon.pardon at rece.vub.ac.be> wrote:
>> I thought programs were read more than written. So if writing is made
>> a bit more problematic but the result is more readable because we are
>> able to use symbols that are already familiar from other contexts, I
>> would say it is worth it.
> It's a matter of extent. If code is read ten times for every time
> it's written, making it twenty times harder to write and a little bit
> easier to read is still a bad tradeoff.
>
> Also: To what extent IS that symbol familiar from some other context?
> Are you using Python as a programming language, or should you perhaps
> be using a mathematical front-end? Not everything needs to perfectly
> match what anyone from any other context will expect. This is, first
> and foremost, a *programming* language.

So? We do use + and -, so why shouldn't we use × for multiplication?
Would such a use already indicate that I should use a mathematical front-end?

When a programming language is borrowing concepts from mathematics,
I see no reason not to borrow the symbols used too.
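For context, a quick sketch of where Python 3 currently stands: the MULTIPLICATION SIGN × (U+00D7) is not a recognized operator and produces a SyntaxError, while Unicode letters are perfectly legal in identifiers (PEP 3131). The snippet below just probes the compiler; it is an illustration, not a proposal.

```python
# × (U+00D7) is not an operator token in Python 3: compiling it fails.
try:
    compile("2 × 3", "<string>", "eval")
    times_sign_allowed = True
except SyntaxError:
    times_sign_allowed = False

print(times_sign_allowed)  # False: only * means multiplication

# Unicode *identifiers*, by contrast, are accepted (PEP 3131):
π = 3.14159
print(π * 2)
```

So the language already borrows mathematical names, just not mathematical operator symbols.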

-- 
Antoon Pardon. 
