On 10/12/2016 05:33 PM, Mikhail V wrote:
Hello! New to this list so not sure if I can reply here... :)
> Now printing it we get:
By "printing it", do you mean "this is the string representation"? I would presume printing it would show characters nicely rendered. Does it not for you?
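To make the distinction concrete, here is a minimal sketch (my own example, not from the original post): `repr()` — what the interactive prompt shows — spells non-ASCII bytes with hex escapes, while printing the *decoded* text renders the characters themselves.

```python
# repr() shows the string representation with hex escapes;
# printing the decoded text shows nicely rendered characters.
data = 'Ж'.encode('utf-8')       # CYRILLIC CAPITAL LETTER ZHE, two bytes in UTF-8

print(repr(data))                # representation: b'\xd0\x96'
print(data.decode('utf-8'))      # rendered character: Ж
```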
> and similarly for other cases where raw bytes must be printed/input.
> So to summarize: make the decimal notation standard for all cases.
> I am not going to go deeper, such as how many digits (leading zeros)
> to use, since that is quite a secondary decision.
Since when was decimal notation "standard"? It seems to be quite the opposite. For Unicode representations, byte notation seems standard.
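A small check of that convention (my own illustration): Unicode code points are conventionally written in hex as U+XXXX, and Python's own escape syntax follows suit.

```python
ch = '€'                 # EURO SIGN, conventionally written U+20AC

print(ord(ch))           # decimal code point: 8364
print(hex(ord(ch)))      # hex, matching the U+20AC spelling: 0x20ac
print(ascii(ch))         # Python's escape also uses hex: '\u20ac'
```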
> - Hex notation is hardly readable. It was not designed with readability
> in mind, so for reading it is not an appropriate system, at least with
> the current character set, which is a mix of digits and letters
> (curious who was that wise person who invented such a set?).
This is an opinion. I should clarify that for many cases I personally find byte notation much simpler. In this case I view it as a toss-up, though for something like UTF-8-encoded text I would hate it if I saw decimal numbers and not bytes.
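For comparison's sake, both views of the same UTF-8 bytes are a one-liner apart (my own sketch, not from the thread):

```python
# The same three UTF-8 bytes of '€', in decimal and in hex.
data = '€'.encode('utf-8')

print(list(data))    # decimal view: [226, 130, 172]
print(data.hex())    # hex view: 'e282ac'
```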
> - Mixing of two notations (hex and decimal) is a _very_ bad idea,
> I hope there is no need to explain why.
Still not sure which "mixing" you refer to.
> So that's it, in short. Feel free to discuss and comment.