On 10/12/2016 07:13 PM, Mikhail V wrote:
On 12 October 2016 at 23:50, Thomas Nyberg firstname.lastname@example.org wrote:
Since when was decimal notation "standard"?
Depends on what planet you live on. I live on planet Earth. And you?
If you mean that decimal notation is the standard people use for _counting_, then yes, of course that is standard. But decimal notation is certainly not the standard in this domain; in fact, quite the opposite. For unicode representations, hexadecimal notation seems to be the standard.
How does this make it a good idea? Consider the unicode table as an array of glyphs. Now the index into the array is suddenly represented in some obscure character set. How is this index any different from the index of any other array, or from a natural number? Think about it...
Hexadecimal notation is hardly "obscure", but yes, I understand that fewer people understand it than decimal notation. Regardless, hexadecimal notation seems to be the standard for unicode, and unless you can convince the unicode community at large to switch, I don't think it makes any sense for Python to switch. Sometimes it's better to go with the flow even if you don't want to.
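To make that concrete, here is a minimal sketch of how the hex convention already shows up in Python 3 itself (only builtins and the standard unicodedata module; the character is an arbitrary example):

    import unicodedata

    s = "й"                      # CYRILLIC SMALL LETTER SHORT I, code point U+0439

    print(ascii(s))              # '\u0439'  -- the escape is hex, matching U+0439
    print(hex(ord(s)))           # 0x439     -- the same value as a hex integer
    print("\u0439" == s)         # True      -- string literals spell it in hex too
    print(unicodedata.name(s))   # CYRILLIC SMALL LETTER SHORT I

The U+0439 spelling in the comment is exactly how the unicode standard itself names the code point, which is why Python's escapes and reprs follow it.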
- Mixing of two notations (hex and decimal) is a _very_ bad idea,
I hope there is no need to explain why.
Still not sure which "mixing" you refer to.
Still not sure? These two words in brackets. Mixing those two systems.
There is no mixing for unicode in Python; it displays as hexadecimal. Decimal is used in other places, though. So if by "mixing" you mean that Python should not use the standard notation of a subdomain when working in that domain, then I totally disagree. The language used in different disciplines is, and always has been, variable. Until that is no longer true, it's better to stick with convention than to add inconsistency, which will be much more confusing in the long term than learning the idiosyncrasies of a specific domain (in this case, the use of hexadecimal in the unicode world).
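As a quick illustration of that last point, here's a small sketch (plain Python 3, arbitrary example character) of where each notation shows up and of the fact that converting between them is a single builtin call, so nothing is really being "mixed":

    c = "é"                      # U+00E9

    print(ascii(c))              # '\xe9'    -- reprs and escapes use hex
    print(ord(c))                # 233       -- ord() returns a plain int, shown in decimal
    print(hex(ord(c)))           # 0xe9      -- back to hex when you want it
    print(chr(0xE9) == chr(233)) # True      -- both literals name the same character

Either way you end up at the same code point; the only question is which spelling a given tool prints by default.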