On 16 October 2016 at 02:58, Greg Ewing firstname.lastname@example.org wrote:
>> even if it is assembler or whatever, it can be made readable without much effort.
> You seem to be focused on a very narrow aspect of readability, i.e. fine details of individual character glyphs. That's not what we mean when we talk about readability of programs.
In this discussion, yes, but layout aspects can also be improved, and I suppose the special purpose of a language does not always dictate the layout of code; that is up to whoever defines it. And glyphs are not a narrow aspect; they are one of the fundamental aspects. Note also that good glyphs are much harder to develop than good layout.
>> That is because that person from the beginning (blindly) follows the convention.
> What you seem to be missing is that there are *reasons* for those conventions. They were not arbitrary choices.
Exactly, and in the case of hex notation I fail to see how my proposal of using letters instead of what we have now could have been overlooked at the time of the decision. There must be a *very* solid reason for digits+letters over my variant; I wonder what it is. I hope it is not the mono-width reason. And basic readability principles were already clear to people 2000 years ago.
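To make the general idea concrete, here is a minimal sketch of a letters-only base-16 alphabet. The particular letter set `bdfgjkmnpqrstvwz` is purely hypothetical, chosen here only to illustrate replacing the mixed digits+letters alphabet with sixteen visually distinct letters; it is not the actual set proposed in this thread.

```python
# Hypothetical all-letter base-16 alphabet (illustrative only, not the
# proposal from this thread): sixteen lowercase letters stand in for
# the conventional digits 0-9 and a-f.
GLYPHS = "bdfgjkmnpqrstvwz"

def to_letter_hex(n: int) -> str:
    """Render a non-negative integer using the letter alphabet."""
    if n == 0:
        return GLYPHS[0]
    out = []
    while n:
        n, r = divmod(n, 16)
        out.append(GLYPHS[r])
    return "".join(reversed(out))

def from_letter_hex(s: str) -> int:
    """Parse a string of letter digits back into an integer."""
    n = 0
    for ch in s:
        n = n * 16 + GLYPHS.index(ch)
    return n
```

Any such alphabet is a drop-in replacement for the conventional one as far as the arithmetic goes; the whole disagreement is only about which sixteen glyphs are easiest to tell apart.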
> So, if anything, *you're* the one who is "blindly following tradition" by wanting to use base 10.
Yes, because when I was a child I learned it everywhere, for everything, and so did everyone else. As I said, I don't defend the use of base 10, as you can already tell from my posts.
>> - A better option would be to choose letters and possibly other glyphs to build up a more readable set, e.g. drop the letter "c" and keep "e" due to their optical collision, and drop some other weak glyphs, like "l" and "h". That would of course raise many further questions, like why you drop this glyph and not that one, and so on, so it will surely end up in a quarrel.
> Well, that's the thing. If there were large, objective, easily measurable differences between different possible sets of glyphs, then there would be no room for such arguments.
Those things cannot be easily measured, if at all; it requires a lot of tests and a huge amount of time, and you cannot plug a measuring device into the brain to precisely gauge the load. In this case the only choice is to trust the most experienced people, who show the results that worked better for them, and to implement and compare them yourself. I am not saying you have any special reason to trust me personally.
> The fact that you anticipate such arguments suggests that any differences are likely to be small, hard to measure and/or subjective.
>> But I can bravely claim that it is better than *any* hex notation, it just follows from what I have here on paper on my table,
> I think "on paper" is the important thing here. I suspect you are looking at the published results from some study or other and greatly overestimating the size of the effects compared to other considerations.
If you try to google that particular topic you'll see that there is zero related published material: there are tons of papers on readability, but zero concrete proposals or attempts to develop something real. That is the thing. I would look at results if there were any. In my case I am looking at what I've achieved during years of my work on it, and indeed there are some interesting things there. It's not that I am overestimating its role, but it can really help in many cases, e.g. as in my example with bitstrings. Last but not least, I am not a "paper ass" in any case; I try to keep to experimental work where possible.
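The original bitstring example is in an earlier message and is not quoted here, so the following is only a hypothetical illustration of the general point: how bits are grouped affects how quickly a pattern can be scanned, independently of which glyphs are used. Python 3.6+ even allows underscores in numeric literals for exactly this purpose.

```python
# Illustrative only: not the example from the earlier message.
# The same 16-bit value, with and without grouping.
raw = 0b1100101011111110       # one unbroken run of 16 bits
grouped = 0b1100_1010_1111_1110  # same value, split into nibbles (Python 3.6+)
assert raw == grouped == 0xCAFE

def show_bits(n: int, width: int = 16, group: int = 4) -> str:
    """Format n as binary with a separator every `group` bits."""
    bits = format(n, f"0{width}b")
    return "_".join(bits[i:i + group] for i in range(0, len(bits), group))
```

Here the readability gain comes purely from layout (grouping), which supports the earlier point that layout and glyph choice are separate, independently improvable aspects.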