On Sat, May 3, 2014 at 5:19 PM, Bruce Leban email@example.com wrote:
> I've actually written programs like that and honestly names like 'sigma' and 'beta' and 'v_t' worked just fine. Many of us have used (x1, y1) and (x2, y2) without confusing anyone because the digits weren't subscripted.
Yeah; like I said, it's not a big thing. I certainly wouldn't choose a language on the basis of subscript-digit-support-in-identifiers. But when I'm working with maths I'm not overly familiar with (stuff a lot more complicated than simple linear acceleration), and I'm trying to translate a not-quite-perfect set of handwritten scribbles into code, every little bit helps. That's why WYSIWYG music editing software is so much more popular with novices than GNU LilyPond is - if you're not *really* familiar with what you're working with, the difference between "dot on the page that looks like this" and "c'8." slows you down. Not insurmountable, but the mind glitches across the gap.
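For what it's worth, here's a minimal sketch of the kind of translation being discussed, using the simple-linear-acceleration case and the plain-ASCII naming convention from the previous message (the function names and the choice of Python here are my illustration, not anything from the thread):

```python
# Handwritten kinematics uses subscripted symbols like x_0 and v_0;
# in ASCII identifiers those flatten to x0 and v0, which is the
# "worked just fine" convention being described.

def velocity(v0, a, t):
    """v(t) = v_0 + a*t, for constant acceleration a."""
    return v0 + a * t

def position(x0, v0, a, t):
    """x(t) = x_0 + v_0*t + (1/2)*a*t**2, for constant acceleration a."""
    return x0 + v0 * t + 0.5 * a * t ** 2

# A ball thrown upward at 10 m/s, checked after one second:
print(velocity(10.0, -9.8, 1.0))
print(position(0.0, 10.0, -9.8, 1.0))
```

The flattening is lossy only typographically: v0 and x0 still read unambiguously once you know the convention, which is the point both messages are making.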