[Python-ideas] Proposal for default character representation

Mikhail V mikhailwas at gmail.com
Wed Oct 12 19:13:12 EDT 2016


On 12 October 2016 at 23:50, Thomas Nyberg <tomuxiong at gmail.com> wrote:
> Since when was decimal notation "standard"?
That depends on what planet you live on. I live on planet Earth. And you?

> opposite. For unicode representations, byte notation seems standard.
How does this make it a good idea?
Consider the Unicode table as an array of glyphs.
Now the index into that array is suddenly represented in some
obscure character set. How is this index any different from the
index of any other array, or from a natural number? Think about it...
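As an illustration of the "array index" view (this example is mine, not from the original post): in Python, `ord()` returns the code point as a plain integer, which prints in decimal, while the escape notation for the very same character uses hex.

```python
# The Unicode "table" seen as an array of glyphs: ord() gives the
# integer index of a character in that array.
ch = "я"                  # CYRILLIC SMALL LETTER YA
index = ord(ch)           # a plain integer index

print(index)              # decimal form of the index: 1103
print(hex(index))         # hexadecimal form: 0x44f
print("\u044f" == ch)     # the \u escape uses the hex form: True
```

Whichever base is used for display, the underlying value is the same natural number; the argument is only about which textual representation should be the default.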

>> 2. Mixing of two notations (hex and decimal) is a _very_ bad idea,
>> I hope no need to explain why.
>
> Still not sure which "mixing" you refer to.

Still not sure? The two systems named in parentheses above: hex and
decimal. Mixing those two systems.
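To make the complaint concrete (my own sketch, not from the original message): Python itself shows both notations side by side for the same character, which is presumably the "mixing" being objected to.

```python
# The same code point rendered in both notations by ordinary built-ins.
s = "€"                   # EURO SIGN, U+20AC

print(ord(s))             # decimal:  8364
print(ascii(s))           # hex escape in the repr: '\u20ac'
```

A reader looking up a character therefore has to convert between bases depending on which tool produced the number.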


More information about the Python-ideas mailing list