On Thu, Nov 7, 2019 at 4:05 AM Martin Euredjian via Python-ideas firstname.lastname@example.org wrote:
I used APL professionally for about ten years. None of your objections ring true. A simple example comes from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol (what it means), he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
At this point, you've solidly established the need for notation. Yes, I think we all agree; in fact, programming *in general* is a matter of finding a notation to represent various concepts, and then using that notation to express more complex concepts.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
But this does not follow. English, as a language, is almost entirely representable within ASCII, and we don't hear people saying that they can't express their thoughts adequately without introducing "ő" and "火"; people just use more letters. There's no fundamental reason that Python is unable to express the concept of "assignment" without reaching for additional characters.
APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low-level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores, in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language. Watch the videos I link to below for a taste of these ideas.
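For readers without APL experience, a loose sense of that whole-array style can be sketched in plain Python. The APL expressions in the comments are standard textbook one-liners, not taken from the videos mentioned above; this is only an illustrative sketch, not a claim about equivalence of the two languages.

```python
# APL expresses whole-array transformations as single expressions.
# Rough Python equivalents of two classic APL one-liners:

# APL: +/i10   (sum over the first ten natural numbers)
total = sum(range(1, 11))

# APL: reverse of a vector, a single primitive in APL
v = [3, 1, 4, 1, 5]
reversed_v = v[::-1]

print(total)       # 55
print(reversed_v)  # [5, 1, 4, 1, 3]
```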
Please, explain to me how much better Python would be if we used "≤" instead of "<=". If I'm reading something like "if x <= y: ...", I read the two-character symbol "<=" as a single symbol.
Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
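For context, the operator under discussion is the assignment expression added in Python 3.8; a minimal sketch of what it does:

```python
# The walrus operator ":=" assigns and yields a value inside an
# expression, avoiding a separate assignment before the test.
data = [4, 8, 15, 16, 23, 42]

if (n := len(data)) > 5:
    print(f"List is long ({n} elements)")

# Equivalent without ":=", at the cost of one extra statement:
n = len(data)
if n > 5:
    print(f"List is long ({n} elements)")
```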
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
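As an aside on the regex point: Python's standard library does offer one mitigation for regex "soup", the re.VERBOSE flag, which lets a pattern carry whitespace and comments. A short sketch on a made-up date pattern:

```python
import re

# Dense form: correct, but reads as "ASCII soup".
dense = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

# Same pattern with re.VERBOSE: unescaped whitespace and comments
# are ignored, so the notation can be laid out for the reader.
verbose = re.compile(r"""
    (\d{4})   # year
    -
    (\d{2})   # month
    -
    (\d{2})   # day
""", re.VERBOSE)

assert dense.match("2019-11-07").groups() == verbose.match("2019-11-07").groups()
```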
The trouble with an analogy to music is that it would take a LOT more than 2-3 ASCII characters to represent a short section of musical score. A closer analogy would be mathematics, where the typical blackboard-friendly notation contrasts with the way that a programming language would represent it. The problem in mathematical notation is that there simply aren't enough small symbols available, so they have to keep getting reused (Greek letters in particular end up getting a lot of different meanings).
When your notation is built on an expectation of a two-dimensional sketching style, it makes a lot of sense to write a continued fraction with lots of long bars and then a diagonal "..." at the end, or to write an infinite sum with a big sigma at the beginning and some small numbers around it to show what you're summing from and to, etc, etc. When your notation is built on the expectation of a keyboard and lines of text, it makes just as much sense to write things in a way that works well on that keyboard.
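The blackboard-versus-keyboard contrast can be made concrete with a sum: the big-sigma expression "sum of k squared, for k from 1 to 10" maps directly onto a one-line generator expression. (A hedged illustration of the general point, not an example from the original message.)

```python
# Blackboard: a large sigma with "k=1" below and "10" above, summing k**2.
# Keyboard: the same structure, written linearly.
total = sum(k**2 for k in range(1, 11))
print(total)  # 385
```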
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
What if I use some other APL interpreter? What if I want to enter that symbol into my text editor? What if I'm typing code into an email? Can I use Alt-[ in all those contexts?
The other interesting thing about notation is that it transcends language. So far all conventional programming languages have been rooted in English. I would argue there is no need for this: a programming notation, just as mathematical and musical notations have demonstrated, can transcend spoken languages. Notation isn't just a tool for thought; it adds a universal element that is impossible to achieve in any other way.
Hmm, I'd say that as many programming notations are rooted in algebra as in English, but sure, a lot are rooted in English. But that still doesn't explain why your fancy arrow is better than ":=", since neither one is more rooted in a single language.
Anyhow, again, I am not going to change a thing. I am nobody in the Python world. Just thought it would be interesting to share this perspective because I truly think this was a missed opportunity. If elegance is of any importance, having two assignment operators when one can do the job, as well as evolve the language in the direction of an exciting and interesting new path, is, at the very least, inelegant. I can only ascribe this to very few of the people involved in this process, if any, having any real experience with APL. One has to use APL for real work, and for at least a year or two, before one's brain makes the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.
So what you're saying is that, instead of introducing a new operator ":=", the language should have introduced a new operator "←", because it's better to have two assignment operators than ... wait, I'm lost. What ARE you saying, exactly? You wish Python had pushed for non-ASCII operators because.... it would theoretically mean it could drop support for the ASCII operators? Because there's no way that's going to happen any time soon.
I used to program a lot in REXX. It supported boolean negation using the "¬" operator. In my entire career as a REXX programmer, I never once saw that outside of contrived examples in documentation; literally every single program ever written used the equally-valid "\" operator, because we can all type that one. The untypable operator might as well not even exist.