On Nov 10, 2019, at 20:50, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:

>> This has nothing to do with representation or input via text

> It does; it's an extension of the reality that, after so many decades, we are still typing words into a text editor.

And how else would you want to enter code?

APL is words and symbols in a text editor.

Python, C++, JS, Haskell, Mathematica, Julia, etc. are also words and symbols in a text editor. The only difference is that they use a couple dozen fewer symbols than APL, and I’m not sure why you think that makes a transformative difference.

Meanwhile, unlike APL, some of these languages have options like Jupyter notebooks and similar tools that let you organize that text into cells, paste images between the cells or generate them from your code, and even include inline live displays. But apparently that doesn’t impress you at all.

You’ve agreed that graphical programming languages, where you connect up components by drawing lines between them, are useless for general-purpose programming.

So what exactly are you suggesting we should have instead of text?

And in what way is experience with APL relevant to it?

> In other words, my comment isn't so much about the mechanics and editors that are available as about the fact that the way we communicate and define the computational solution of problems (be it to other humans or to the machine that will execute the instructions) is by typing text into some kind of text editor.

When I say "text" I mean "a through z, numbers and a few symbols that were found on mechanical typewriters in the 1960's".  My shorthand for that is ASCII, which isn't quite accurate in that the set symbols contained in the sets where the most significant bits are "000" and "001" (7 bit ASCII) are not used other than CR, LF and HT. 

Right, most programming languages make do with 80-odd characters, while APL uses about 100, most of the extras being variations on letters.

Although actually most languages (including Python, but not APL) let you use a few thousand other characters for your function names and other identifiers. But apparently that isn’t interesting to you; only the few dozen extra characters used as builtins matter.
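To be concrete, here’s a minimal sketch (assuming Python 3, where PEP 3131 allows Unicode letters in identifiers; the names are made up purely for illustration):

    # Python 3 accepts Unicode letters in identifiers (PEP 3131), so Greek
    # letters work as names today; this is just an illustration, not a
    # style recommendation.
    from math import exp, pi, sqrt

    μ = 0.0   # mean
    σ = 2.5   # standard deviation

    def φ(x):
        """Normal density with mean μ and standard deviation σ."""
        return exp(-((x - μ) ** 2) / (2 * σ ** 2)) / (σ * sqrt(2 * pi))

    print(φ(0.0))  # ≈ 0.1596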

So, why?

> So, for the most part, programming for the last 60 years or so (over half a century) has been limited to the characters and symbols found on a 60-year-old typewriter.

And adding another shift key to add one more bank of a couple dozen characters makes a difference how?

And if you want something that can input thousands of characters… well, what would that look like? Have you used CJK keyboards? They don’t have thousands of keys, because nobody could use that many with human fingers. Instead, either you have a bunch of extra shifts, or you effectively enter two letters and a number for each character. That’s not any more expressive; it’s slower and clumsier.

> As I have mentioned in another comment, having had this experience, I fully understand how people who have not had the benefit of communicating with computers not just symbolically but through a very different paradigm simply cannot see what I am describing. It's hard to find an analogy that can easily convey this without some common shared perspective. I found that music can be that tool. Of course, that requires classical training at a level sufficient to, for example, read and "see" the music when presented with a score.

You keep bringing up music as a comparison, but music notation has far fewer characters, and they’ve been unchanged for even longer than text punctuation. The advantage of music notation is that it’s 2D, not that it has more characters.

And the disadvantage of music notation is the same as the disadvantage of everything besides text: nobody’s come up with a way of entering it that’s even remotely smooth enough that it doesn’t force you to think about the editor instead of the music. When I want to generate notation, I don’t use a notation editor; I play something into a sequencer, edit it in the piano-roll interface, and then convert to notation and tweak a few last things, because nothing else is usable. And if I want to do it on my phone, I just can’t do it at all.

It’s just like math: the notation is easier to read because it’s 2D, but the best way to create that notation is to type code or AsciiMath or TeX and render that into an equation.
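To make that concrete: the quadratic formula gets typed as a linear string of TeX and rendered into the 2D layout everyone reads (a trivial example, but that’s the whole workflow):

    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}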

> I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like engaging with a religious person to discuss the lack of evidence for the existence of supernatural beings.

No, what you’re trying to do is engage with a Christian by telling him that his silly 2000-year-old trinity of father, son, and holy spirit can’t possibly encompass the wonders of the universe, which you know because you worship a 1500-year-old true quadrinity of father, son, second cousin, and holy spirit.

All of the wonders of APL (that Python, Haskell, Julia, etc., and especially J, can never approach) come down to a few dozen extra characters that you can type on a keyboard from the 1970s (but not much later) instead of one from the 1960s or today.

It’s not that we can’t comprehend what you’re talking about; it’s that we can’t believe you seriously think this is a big deal, that somehow those few dozen extra characters mean you’re doing a whole different kind of programming.

You’re talking about weaker versions of the exact same paradigms we already have. Array-based, iterator-based, and higher-order programming as found in Python, Haskell, and Julia are much more powerful than the limited versions in APL, and more extensible. Sure, I spell a range with 1..20 while in APL you spell it with a variant iota character that I don’t know how to type, but why is that better? The Haskell notation is more intuitive to read, more like mathematical notation, extends to different starts and steps and even infinite length, and is easier to type. What makes that not “true notation”?
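For what it’s worth, here’s a rough sketch of the same idea in Python (nothing APL- or Haskell-specific about it; Haskell’s [1..20], [2,4..20], and take 5 [1..] correspond roughly to these):

    # Explicit starts, steps, and even unbounded sequences, all in plain ASCII.
    from itertools import count, islice

    print(list(range(1, 21)))         # 1 through 20
    print(list(range(2, 21, 2)))      # 2, 4, ..., 20
    print(list(islice(count(1), 5)))  # first five terms of an infinite sequence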