On Thu, Nov 7, 2019 at 12:35 PM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
> > > Another example of ASCII soup is regex.
> >
> > That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read.
>
> That's it! You got it! The difference is that a regex looks like )(*&)(*&)(*^)(*&^, which means nothing. Your brain already has a mapping for what these symbols mean. Ascribing new meaning to a new mish-mash of them breaks that mental mapping and model, which means it requires 50 or 100 times the cognitive load to process and comprehend. You are forced to keep a truly unnatural mental stack in your head as you parse these infernal combinations of seemingly random ASCII to figure out their meaning.
>
> Notation changes that, if anything, for one simple reason: it establishes new patterns, with punctuation and rhythm, and your brain can grok that. Don't forget that our brains have evolved amazing pattern-matching capabilities; symbols and notation take advantage of that, hence the deep and wide history of humanity using symbols to communicate. Symbols are everywhere, from the icons on your computer and phone to the dashboard of your car, road signs, math, music, etc.

The asterisk is commonly interpreted to mean multiplication:
>>> 3 * 5
15
>>> "abc" * 4
'abcabcabcabc'
In a regex, it has broadly the same meaning: it allows any number of repetitions of whatever came before it, which is closely analogous to multiplication. How is that somehow "not notation", yet you can define arbitrary symbols to have arbitrary meanings and call that "notation"? What's the distinction?

ChrisA
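A minimal sketch of that parallel in the interpreter, using the standard re module (the patterns and sample strings here are purely illustrative):

>>> import re
>>> re.findall(r"ab*", "a ab abb abbb")        # "b*" means zero or more repetitions of the preceding 'b'
['a', 'ab', 'abb', 'abbb']
>>> bool(re.fullmatch(r"(abc)*", "abc" * 4))   # a repeated group matches a repeated string
True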