[Python-ideas] Proposal for default character representation

Greg Ewing greg.ewing at canterbury.ac.nz
Sat Oct 15 20:58:08 EDT 2016


Mikhail V wrote:
> Also I can hardly imagine that the special purpose
> of some language can ignore readability,

Readability is not something absolute that stands on its
own. It depends a great deal on what is being expressed.

> even if it is assembler or whatever,
> it can be made readable without much effort.

You seem to be focused on a very narrow aspect of
readability, i.e. fine details of individual character
glyphs. That's not what we mean when we talk about
readability of programs.

> So I just look for some other solution for the same
> task, even if it takes 10 times more code.

Then it will take you 10 times longer to write, and
will on average contain 10 times as many bugs. Is
that really worth some small, probably mostly
theoretical advantage at the typographical level?

> That is because that person, from the beginning,
> (blindly) follows the convention.

What you seem to be missing is that there are
*reasons* for those conventions. They were not
arbitrary choices.

Ultimately they can be traced back to the fact that
our computers are built from two-state electronic
devices. That's definitely not an arbitrary choice --
there are excellent physical reasons for it.

Base 10, on the other hand, *is* an arbitrary
choice. Due to an accident of evolution, we ended
up with 10 easily accessible appendages to count
on, and that became the basis of the counting
system most widely used by everyday people.

So, if anything, *you're* the one who is "blindly
following tradition" by wanting to use base 10.
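
As a concrete (if minimal) sketch of that point: each
hexadecimal digit covers exactly four bits, so converting
between hex and binary is just a regrouping of bits,
while decimal digits don't line up with bit boundaries
at all. In Python, for example:

    # Each hex digit corresponds to one 4-bit group (nibble),
    # so hex <-> binary conversion is pure regrouping of bits.
    n = 0b110101110010    # 1101 0111 0010
    print(hex(n))         # 0xd72 -- d=1101, 7=0111, 2=0010
    print(bin(n))         # 0b110101110010

    # Decimal digits don't align with bit boundaries: producing
    # base 10 takes repeated division, not regrouping.
    print(n)              # 3442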

> 2. A better option would be to choose letters and
> possibly other glyphs to build up a more readable
> set. E.g. drop the letter "c" and keep "e" due to
> their optical collision, and drop some other weak
> glyphs, like "l" and "h". That of course would raise
> many further questions, like why do you drop this
> glyph and not that one, and so on, so it will surely end up in a quarrel.

Well, that's the thing. If there were large, objective,
easily measurable differences between different possible
sets of glyphs, then there would be no room for such
arguments.

The fact that you anticipate such arguments suggests
that any differences are likely to be small, hard
to measure and/or subjective.

> But I can bravely claim that it is better than *any*
> hex notation; it just follows from what I have here
> on paper on my table,

I think "on paper" is the important thing here. I
suspect you are looking at the published results from
some study or other and greatly overestimating the
size of the effects compared to other considerations.

-- 
Greg


