Python should take a lesson from APL: Walrus operator not needed
During a recent HN discussion about the walrus operator I came to realize yet another advantage of notation. I used APL professionally for about ten years, which made it an obvious source of inspiration for an example that, in my opinion, demonstrates why the Python team missed a very valuable opportunity: to take this wonderful language and start exploring the judicious introduction of notation as a valuable tool for thought (borrowing from Ken Iverson's APL paper with that title [0]).

To simplify, I'll define the desire for the walrus operator ":=" as "wanting to be able to make assignments within syntax where it was previously impossible":

    if x = 5    # was impossible
    # and now
    if x := 5   # makes it possible

A more elaborate example given in the PEP goes like this:

Current:

    reductor = dispatch_table.get(cls)
    if reductor:
        rv = reductor(x)
    else:
        reductor = getattr(x, "__reduce_ex__", None)
        if reductor:
            rv = reductor(4)
        else:
            reductor = getattr(x, "__reduce__", None)
            if reductor:
                rv = reductor()
            else:
                raise Error("un(deep)copyable object of type %s" % cls)

Improved:

    if reductor := dispatch_table.get(cls):
        rv = reductor(x)
    elif reductor := getattr(x, "__reduce_ex__", None):
        rv = reductor(4)
    elif reductor := getattr(x, "__reduce__", None):
        rv = reductor()
    else:
        raise Error("un(deep)copyable object of type %s" % cls)

At first I thought, well, just extend "=" and be done with it. The HN thread produced many comments against this idea. The one that made me think was this one [1]:

"These two are syntactically equal and in Python there's no way a linter can distinguish between these two:

    if reductor = dispatch_table.get(cls):
    if reductor == dispatch_table.get(cls):

A human being can only distinguish them through careful inspection. The walrus operator not only prevents that problem, but makes the intent unambiguous."

Which is a perfectly valid point. I get it.

Still, the idea of two assignment operators just didn't sit well with me. That's when I realized I had seen this kind of problem nearly thirty years ago, with the introduction of "J". I won't get into the details unless someone is interested; I'll just say that J turned APL into ASCII soup. It was and is ugly, and it completely misses the point of the very reason APL has specialized notation, the very thing Iverson highlighted in his paper [0].

Back to Python.

This entire mess could have been avoided by making one simple change that might have nudged the language towards a very interesting era, one where a specialized programming notation could be evolved over time for the benefit of all. That simple change would have been the introduction and adoption of APL's own assignment operator: "←"

In other words, these two things would have been equivalent in Python:

    a ← 23
    a = 23

What's neat about this is that both humans and automated tools (linters, etc.) would have no problem telling these apart:

    if reductor ← dispatch_table.get(cls):
    if reductor == dispatch_table.get(cls):

And the larger example would become this:

    if reductor ← dispatch_table.get(cls):
        rv ← reductor(x)
    elif reductor ← getattr(x, "__reduce_ex__", None):
        rv ← reductor(4)
    elif reductor ← getattr(x, "__reduce__", None):
        rv ← reductor()
    else:
        raise Error("un(deep)copyable object of type %s" % cls)

This assignment operator would work everywhere and, for a period of time, the "=" operator would be retained. The good news is that old code could be updated with a simple search-and-replace. In fact, code editors could even display "=" as "←" as an option.

The transition to only allowing "←" (and perhaps other symbols) could be planned for Python 4. Clean, simple and forward-looking. That, to me, is a good solution. Today we have "=" and ":=" which, from my opinionated perspective, does not represent progress at all.

[0] http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf
[1] https://news.ycombinator.com/item?id=21426338
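(As an aside, the PEP's other motivating pattern, binding inside a loop condition, shows the same "assignment where it was previously impossible" idea; a minimal sketch, in which process() and the file name are placeholders:)

    # Loop-and-a-half pattern: bind and test in one place.
    with open("data.bin", "rb") as f:
        while chunk := f.read(8192):
            process(chunk)   # process() stands in for whatever real work you do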
On Wed, Nov 6, 2019 at 3:28 PM martin_05--- via Python-ideas <python-ideas@python.org> wrote:
Back to Python.
This entire mess could have been avoided by making one simple change that would have possibly nudged the language towards a very interesting era, one where a specialized programming notation could be evolved over time for the benefit of all. That simple change would have been the introduction and adoption of APL's own assignment operator: "←"
How do people type that operator? ChrisA
martin_05--- via Python-ideas wrote:
The transition to only allowing "←" (and perhaps other symbols) could be planned for Python 4.
Requiring non-ASCII characters in the core language would be a very big change, especially for something as ubiquitous as assignment. Much more justification than just "it looks nicer than the walrus operator" would be required. -- Greg
The arrow ...which I will not copy and paste, to really hammer home the point that it's not on my fairly standard US keyboard... doesn't look like assignment; it looks like a comparison operator.
On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas < python-ideas@python.org> wrote:
In other words, these two things would have been equivalent in Python:
a ← 23
a = 23
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value. I can imagine "let 'a' point to the value of 23 now" and write it this way: "a --> 23", but "a <-- 23" gives the impression that 23 points to, or is somehow fed into, 'a'. This may give false expectations to those who are coming to Python from another language and might expect "l-value" behavior in Python.

Second point: I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

Richard
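A tiny sketch of Richard's point that names are labels (the list values here are arbitrary):

    # Two names bound to the same object: mutation is visible through both...
    a = [1, 2, 3]
    b = a
    b.append(4)
    print(a)        # [1, 2, 3, 4]
    print(a is b)   # True

    # ...but rebinding a name only moves the label; it does not touch the object.
    b = [0]
    print(a)        # [1, 2, 3, 4]
    print(a is b)   # False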
On Wed, Nov 6, 2019 at 6:57 PM Richard Musil <risa2000x@gmail.com> wrote:
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, because "x <-- y" is already legal syntax (it applies unary minus to y twice, then checks whether x is less than the result). ChrisA
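A quick sketch of that parse, with arbitrary values for x and y:

    x, y = 5, -3

    print(x <-- y)     # parsed as x < (-(-y)), i.e. 5 < -3 -> False
    print(x < -(-y))   # the explicitly parenthesized equivalent -> False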
On Nov 6, 2019, at 08:59, Chris Angelico <rosuav@gmail.com> wrote:
No, because "x <-- y" is already legal syntax (it applies unary minus to y twice, then checks whether x is less than the result).
You could handle that by making the grammar more complicated. I don't think it's a good idea at all, but I think it could be done. Without working through the details, I think all you'd need is a rule that the assignment statement production is tried before the expression production. This would mean that any statement made up of a result-ignoring less-than comparison between an expression simple enough to be a target and a double-negated expression could only be written by putting a space after the < symbol, but that probably affects not a single line of code ever written.

But (even assuming I'm right about that), it would mean a new and unique rule that you have to internalize to parse Python in your head, or you'll stumble every time you see <-. And I don't think people would internalize it. A big part of the reason Python is readable is that its grammar is simple (compared to anything but Lisp or Smalltalk), and usually obviously unambiguous, to humans, not just to parser programs.

Not that Python doesn't already have a few rules that people don't completely internalize. (I don't know exactly the rules for when you can leave parens off a genexpr, a yield, maybe even a tuple; I only know them well enough to write code without pausing, and to read 99.99% of the code anyone writes without pausing, and there probably are constructions that would be legal if anyone ever wrote them that would momentarily throw me for a loop.) But in each case, the advantage is so huge (imagine having to write `x, y = (y, x)` everywhere...) that it's clearly worth it.

In this case, the advantage would be tiny (instead of having to learn that assignment is spelled :=, as in many other languages and non-code contexts, I get to learn that assignment is spelled <--, as in many other languages and non-code contexts?). So it's definitely not worth it.
Andrew Barnert via Python-ideas wrote:
On Nov 6, 2019, at 08:59, Chris Angelico <rosuav@gmail.com> wrote:
No, because "x <-- y" is already legal syntax
You could handle that by making the grammar more complicated.
Or just have the tokeniser treat "<--" as a single token, the same way that it treats "<=" as a single token rather than "<" followed by "=". It would be a backwards-incompatible change (if you really wanted "less than minus minus something" you'd have to put a space in somewhere) but replacing the assignment operator is already a much bigger one. -- Greg
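A small sketch using the standard tokenize module shows the difference as things stand today (expected output in comments):

    import io
    import tokenize

    def op_tokens(src):
        # Collect just the operator tokens produced for a line of source.
        tokens = tokenize.generate_tokens(io.StringIO(src).readline)
        return [t.string for t in tokens if t.type == tokenize.OP]

    print(op_tokens("x <= y"))    # ['<='] -- one operator token
    print(op_tokens("x <-- y"))   # ['<', '-', '-'] -- three separate tokens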
On Thu, Nov 7, 2019 at 8:47 AM Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Andrew Barnert via Python-ideas wrote:
On Nov 6, 2019, at 08:59, Chris Angelico <rosuav@gmail.com> wrote:
No, because "x <-- y" is already legal syntax
You could handle that by making the grammar more complicated.
Or just have the tokeniser treat "<--" as a single token, the same way that it treats "<=" as a single token rather than "<" followed by "=". It would be a backwards-incompatible change (if you really wanted "less than minus minus something" you'd have to put a space in somewhere) but replacing the assignment operator is already a much bigger one.
To clarify: I wasn't saying that it's fundamentally impossible to have these kinds of parsing rules, but that it's backward incompatible. Notably, even though this syntax is fairly unlikely to come up, it means that anyone using "<--" as an assignment operator will have to worry about older Python versions misinterpreting it. If you create a brand new operator out of something that's currently invalid syntax, then it's easy - you get an instant compilation error on an older interpreter. With this, it might sometimes result in a runtime NameError or TypeError, and even worse, might just silently do the wrong thing. That's why Python 3.9 still won't let you write "except ValueError, IndexError:" - you *have* to parenthesize the tuple, because the comma syntax had a different meaning in Python 2 (the "except Exception as name:" syntax was backported to 2.6/2.7 but the older syntax is of course still valid). There is no way that you can accidentally run your code on the wrong Python and have it silently assign to IndexError instead of catching two types. ChrisA
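For reference, a minimal sketch of the Python 3 spelling ChrisA describes (the exception types here are just examples):

    try:
        values = []
        print(int(values[0]))   # raises IndexError; int("x") would raise ValueError
    except (ValueError, IndexError) as exc:
        # The tuple must be parenthesized. In Python 2, "except ValueError, exc:"
        # meant "catch ValueError and bind it to exc", which is the ambiguity
        # the mandatory parentheses remove.
        print("caught:", exc)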
Thanks for your feedback. A few comments:
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value

I used APL professionally for about ten years. None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.

The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.

The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.

APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low-level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores, in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language. Watch the videos I link to below for a taste of these ideas.

Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story: the mild introduction of notation and a path towards evolving a richer notation over time.

Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.

Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.

The other interesting thing about notation is that it transcends language. So far all conventional programming languages have been rooted in English. I would argue there is no need for this when programming notations, just as mathematical and musical notations have demonstrated, can transcend spoken languages. Notation isn't just a tool for thought; it adds a universal element that is impossible to achieve in any other way.

Anyhow, again, I am not going to change a thing. I am nobody in the Python world. I just thought it would be interesting to share this perspective because I truly think this was a missed opportunity. If elegance is of any importance, having two assignment operators when one could do the job, while also evolving the language in the direction of an exciting and interesting new path, is, at the very least, inelegant. I can only ascribe this to the fact that very few people involved in this process, if any, had any real experience with APL. One has to use APL for real work, and for at least a year or two, in order for your brain to make the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.

I encourage everyone to read this Turing Award presentation:

"Notation as a Tool of Thought" by Ken Iverson, creator of APL
http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf

Also, if you haven't seen them, these videos are very much worth watching:

Conway's Game of Life in APL: https://www.youtube.com/watch?v=a9xAKttWgP4
Sudoku solver in APL: https://www.youtube.com/watch?v=DmT80OseAGs

-Martin
It's too late for this one, but I'd be open to allowing Unicode operators. It is always poo-poo'd here, but there are numerous input solutions:

- I enabled AltGr and a Compose key on my US keyboard, so I can type symbols like ©, —, …, ø, ə, with two or three keystrokes, diacritics too: café. I do this often.
- Use an ASCII notation, such as >= for ≥, \eAExpression or \eSetUnion, \uXXXX, etc., that is rendered with a tool like "go fmt" or black.
- Word processors typically have a Symbols dialog for such occasions.
- There are simple websites such as http://unicode-search.net/ for finding obscure symbols. Python has unicodedata; it would be simple to wire it up (a sketch follows below).

-Mike

P.S. Ligatures are another solution from a different angle; perhaps your favorite editor could show ← for :=, though it might need a custom font.

On 2019-11-06 09:05, Martin Euredjian via Python-ideas wrote:
Thanks for your feedback. A few comments:
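A minimal sketch of the unicodedata idea from Mike's list (the symbol() helper is made up for illustration):

    import unicodedata

    def symbol(name):
        # Look up a character by its official Unicode name.
        return unicodedata.lookup(name)

    print(symbol("LEFTWARDS ARROW"))        # ←
    print(unicodedata.name("←"))            # LEFTWARDS ARROW
    print(symbol("GREEK SMALL LETTER PI"))  # π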
On Thu, Nov 7, 2019 at 4:05 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
I used APL professionally for about ten years. None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
At this point, you've solidly established the need for notation. Yes, I think we all agree; in fact, programming *in general* is a matter of finding a notation to represent various concepts, and then using that notation to express more complex concepts.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
But this does not follow. English, as a language, is almost entirely representable within ASCII, and we don't hear people saying that they can't express their thoughts adequately without introducing "ő" and "火"; people just use more letters. There's no fundamental reason that Python is unable to express the concept of "assignment" without reaching for additional characters.
APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low-level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores, in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language. Watch the videos I link to below for a taste of these ideas.
Please, explain to me how much better Python would be if we used "≤" instead of "<=". If I'm reading something like "if x <= y: ...", I read the two-character symbol "<=" as a single symbol.
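For context, current CPython accepts Unicode letters in identifiers but rejects Unicode operators; a quick sketch (the exact error wording varies by version):

    # Unicode identifiers are legal in Python 3:
    π = 3.14159
    print(π * 2)

    # Unicode operators are not; compiling "x ≤ y" fails in the tokenizer:
    try:
        compile("x ≤ y", "<test>", "eval")
    except SyntaxError as exc:
        print("rejected:", exc)   # e.g. "invalid character '≤' (U+2264)"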
Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
The trouble with an analogy to music is that it would take a LOT more than 2-3 ASCII characters to represent a short section of musical score. A closer analogy would be mathematics, where the typical blackboard-friendly notation contrasts with the way that a programming language would represent it. The problem in mathematical notation is that there simply aren't enough small symbols available, so they have to keep getting reused (Greek letters in particular end up getting a lot of different meanings). When your notation is built on an expectation of a two-dimensional sketching style, it makes a lot of sense to write a continued fraction with lots of long bars and then a diagonal "..." at the end, or to write an infinite sum with a big sigma at the beginning and some small numbers around it to show what you're summing from and to, etc, etc. When your notation is built on the expectation of a keyboard and lines of text, it makes just as much sense to write things in a way that works well on that keyboard.
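To make the contrast concrete, here is a blackboard-style sum written in keyboard-friendly, linear form (illustrative only):

    from fractions import Fraction

    # The blackboard "sum over k from 0 to 9 of 1/2^k", written linearly.
    partial = sum(Fraction(1, 2**k) for k in range(10))
    print(partial)   # 1023/512, creeping up on 2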
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
What if I use some other APL interpreter? What if I want to enter that symbol into my text editor? What if I'm typing code into an email? Can I use Alt-[ in all those contexts?
The other interesting thing about notation is that it transcends language. So far all conventional programming languages have been rooted in English. I would argue there is no need for this when programming notations, just as mathematical and musical notations have demonstrated, can transcend spoken languages. Notation isn't just a tool for thought; it adds a universal element that is impossible to achieve in any other way.
Hmm, I'd say that as many programming notations are rooted in algebra as in English, but sure, a lot are rooted in English. But that still doesn't explain why your fancy arrow is better than ":=", since neither one is more rooted in a single language.
Anyhow, again, I am not going to change a thing. I am nobody in the Python world. I just thought it would be interesting to share this perspective because I truly think this was a missed opportunity. If elegance is of any importance, having two assignment operators when one could do the job, while also evolving the language in the direction of an exciting and interesting new path, is, at the very least, inelegant. I can only ascribe this to the fact that very few people involved in this process, if any, had any real experience with APL. One has to use APL for real work, and for at least a year or two, in order for your brain to make the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.
So what you're saying is that, instead of introducing a new operator ":=", the language should have introduced a new operator "←", because it's better to have two assignment operators than ... wait, I'm lost. What ARE you saying, exactly? You wish Python had pushed for non-ASCII operators because.... it would theoretically mean it could drop support for the ASCII operators? Because there's no way that's going to happen any time soon. I used to program a lot in REXX. It supported boolean negation using the "¬" operator. In my entire career as a REXX programmer, I never once saw that outside of contrived examples in documentation; literally every single program ever written used the equally-valid "\" operator, because we can all type that one. The untypable operator might as well not even exist. ChrisA
I used APL professionally for about ten years.
Yes, you've stated that already.

None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.

I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python, variables are names assigned to objects, *not* buckets that objects are stored in. Using a notation that implies that objects are assigned to variables encourages a broken understanding of Python's mechanics.

A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.

The fact that people learn and then become comfortable with symbols doesn't imply that choosing which symbols to adopt into a language is trivial. You can follow the evolution of languages over time and find that they often eject characters that serve little use or cause confusion, like the English character "thorn" <https://en.wikipedia.org/wiki/Thorn_(letter)>.

The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.

You don't need to convince people of the power of abstraction or the utility of domain-specific languages. Such a general statement doesn't support the adoption of any specific change. You might as well be advocating for adding Egyptian hieroglyphics to musical notation. We don't need a lecture on the importance of abstract notation each time a new syntax is proposed.

The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.

I strongly disagree with this. First of all, mathematical notation, which programming borrows heavily from, highly favors compaction over clarity. It uses Greek and Latin symbols that mean different things depending on the field. It uses both left and right super- and subscripts, sometimes for naming conventions, sometimes to denote exponentiation. It uses dots and hats and expressions that sit below and/or above symbols (as in "limit" notation or summations) and all sorts of other orientations and symbol modifications that are almost impossible to look up, plus infix and prefix and postfix notation. It makes picking up any given mathematical paper a chore to comprehend because so much context is assumed and not readily accessible.

Why not use a more consistent notation like add(x, y) instead of x + y, when we know addition is a function and all other functions (usually) follow the f(x, y) notation? Because math is old. It predates the printing press and other tools that make more explicit and readable notation possible. It was much more important hundreds of years ago that your ideas be expressible in a super-concise form, to the detriment of readability. That's not the only reason, of course, but it is a pretty big reason. I submit that most mathematical papers would benefit from having their formulas re-written in something like a programming language with more explicit variable names and consistent notation.

As to the role of ML and AI in all of this: these are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programming in the future is like assuming that more opcodes will greatly enhance programming in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!

That's all I have time for, for now; I may pick this up later.
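(As an aside, Python does expose the functional spelling Abe mentions through the standard operator module:)

    import operator

    # Infix spelling and plain-function spelling of the same operations.
    print(3 + 4, operator.add(3, 4))    # 7 7
    print(3 <= 4, operator.le(3, 4))    # True True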
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python.

I understood this just fine. I happen to think your argument in this regard is neither sound nor valid.

Question: Where did APL's "←" operator come from? A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960s. In many ways it meant "this name is assigned to this object", to paraphrase your statement.

I mean, how does "a = 23", which is read "a is equal to 23", or "a = some_object", which is literally read "a is equal to some_object", say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?

This is no different from the concept of pointers. A pointer stores an address to some data structure somewhere. No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. It's a pointer. We are assigning a pointer. We are storing an address that points to where the data lives. In fact, one could very well make the argument that using the "=" character for this operation is misleading, because the left side is not set to be equal to the right side. Even worse, these pointers in Python are immutable. Someone coming from a whole range of languages sees the "=" sign to mean something very different. For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language. So, "=" in Python is not equal to "=" in many languages. Why are we using the same symbol and creating this confusion? If your response is something like "people learn the difference", well, you just made my point. People learn.

I've had this kind of conversation with many people in the 30+ years since I learned APL and the 20+ years since I stopped using it professionally. It has been my experience that people who have not had the experience rarely get it, and, sadly, more often than not, they become hostile to the implication that there might actually be a better way to translate ideas into computer-executable code. That's just reality and I am not going to change it.

Look, we don't have to agree, and, frankly, you seem to be getting rattled. I want no part of that. I didn't come here to change the Python universe. Like I said, I am nobody, so, yeah, forget it. Don't waste your time on me or my ridiculous ideas. I just wanted to share an opinion, worthless as it might be.

Thanks,

-Martin
Question: Where did APL's "←" operator come from?
Doesn't matter. If your notation can't stand on its own without a history lesson, then it's not great.
A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960's. In many ways it meant "this name is assigned to this object", to paraphrase your statement.
In "many ways"? Not exactly? How does this make it better? It still sounds counterintuitive. If it really means "this name references this object", why not a left arrow? I mean, how does "a = 23" which is read "a is equal to 23" or "a =
some_object" which is literally read "a is equal to some_object" say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?
I would say that it's not perfect notation. I read "a = 23" as "a is 23" in the same way that I would say "Abe is my name". It doesn't describe the relationship well, but it's an acceptable, pragmatic use of familiar notation and it doesn't run *counter to* the actual mechanics of the language. As far as symbols go, an arrow to the left would be closest to representing the mechanics of the language, "=" is a compromise, and "←" is backwards. The object doesn't refer to the name, the name refers to the object. No professional thinks that "a = some_object" results in a bucket being
filled with whatever the object might contain.
That's exactly how variables work in many common languages like C and is actually a common misconception even among people with years of experience with Python. Check out this Stack Overflow question <https://softwareengineering.stackexchange.com/questions/314808/why-variables-in-python-are-different-from-other-programming-languages> that asks what the difference is and the confused comments that follow saying there is no difference or even the fact that the question was closed because people didn't understand what it was asking. It's pretty bonkers to me that professional programers don't know the difference or understand a very basic question, but it's the world we live in. In fact, one could very well make the argument that using the "=" character
In fact, one could very well make the argument that using the "=" character for this operation is misleading because the left side is not set to be equal to the right side. Even worse, these pointers in Python are immutable.
The "=" doesn't imply immutability. That's actually what the walrus operator ":=" implies. "pi := circumference/diameter" means "pi is defined as the ratio of the circumference to the diameter".
For the most part, implying that the name equals the object it references is fine. Implying that you're somehow putting the object into the variable or attaching the object to the variable (instead of the other way around) is backwards.
Someone coming from a whole range of languages sees the "=" sign to mean something very different. For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language. So, "=" in Python is not equal to "=" in many languages. Why are we using the same symbol and creating this confusion?
For many many reasons.
If your response is something like "people learn the difference", well, you just made my point. People learn.
That's not sufficient justification. If it were, you could use that same logic to justify adding any symbol to any language. I find musical notation woefully lacking. There's no way to denote clapping or snapping fingers or the glottal stops that Regina Spektor is so fond of. Maybe I should add a fish symbol, a duck symbol, and the guy walking sideways from Egyptian hieroglyphics to the standard musical notation to represent those sounds. People will learn the difference, right?
I've had this kind of a conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally.
Oh, really? You programmed APL for 10 years?! Did you go to Yale, Mr. Kavanaugh? You can cut the arguments from authority. They're worth nothing.
Look, we don't have to agree, and, frankly, you seem to be getting rattled.
I'm genuinely curious what makes you think I'm "rattled"? I'm not.
On Wed, Nov 6, 2019 at 2:54 PM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python.
I understood this just fine. I happen to think your argument in this regard is neither sound nor valid.
Question: Where did APL's "←" operator come from?
A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960's. In many ways it meant "this name is assigned to this object", to paraphrase your statement.
I mean, how does "a = 23" which is read "a is equal to 23" or "a = some_object" which is literally read "a is equal to some_object" say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?
This is no different from the concept of pointers. A pointer stores an address to some data structure somewhere. No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. It's a pointer. We are assigning a pointer. We are storing an address that points to where the data lives.
In fact, one could very well make the argument that using the "=" character for this operation is misleading because the left side is not set to be equal to the right side. Even worse, these pointers in Python are immutable. Someone coming from a whole range of languages sees the "=" sign to mean something very different. For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language. So, "=" in Python is not equal to "=" in many languages. Why are we using the same symbol and creating this confusion?
If your response is something like "people learn the difference", well, you just made my point. People learn.
I've had this kind of a conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally. It has been my experience that people who have not had the experience rarely get it, and, sadly, more often than not, they become hostile to the implication that there might actually be a better way to translate ideas into computer executable code. That's just reality and I am not going to change it.
Look, we don't have to agree, and, frankly, you seem to be getting rattled. I want no part of that. I didn't come here to change the Python universe. Like I said, I am nobody, so, yeah, forget it. Don't waste your time on me or my ridiculous ideas. I just wanted to share an opinion, worthless as it might be.
Thanks,
-Martin
On Wednesday, November 6, 2019, 12:18:21 PM PST, Abe Dillon <abedillon@gmail.com> wrote:
I used APL professionally for about ten years.
Yes, you've stated that already.
None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python, variables are names assigned to objects, *not* buckets that objects are stored in. Using a notation that implies that objects are assigned to variables encourages a broken understanding of Python's mechanics.
A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The fact that people learn and then become comfortable with symbols doesn't imply that choosing which symbols to adopt into a language is trivial. You can follow the evolution of languages over time and find that they often eject characters that serve little use or cause confusion, like the English character "thorn" <https://en.wikipedia.org/wiki/Thorn_(letter)>.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
You don't need to convince people of the power of abstraction or the utility of domain-specific languages. Such a general statement doesn't support the adoption of any specific change. You might as well be advocating for adding Egyptian hieroglyphics to musical notation. We don't need a lecture on the importance of abstract notation each time a new syntax is proposed.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
I strongly disagree with this. First of all, mathematical notation, which programming borrows heavily from, highly favors compaction over clarity. It uses Greek and Latin symbols that mean different things depending on the field. It uses both left and right superscripts and subscripts, sometimes for naming conventions, sometimes to denote exponentiation. It uses dots and hats and expressions that sit below and/or above symbols (like in "limit" notation or summations), and all sorts of other orientations and symbol modifications that are almost impossible to look up, plus infix and prefix and postfix notation. It makes picking up any given mathematical paper a chore to comprehend because so much context is assumed and not readily accessible.
Why not use a more consistent notation like add(x, y) instead of x + y when we know addition is a function and all other functions (usually) follow the f(x, y) notation? Because math is old. It predates the printing press and other tools that make more explicit and readable notation possible. It was much more important hundreds of years ago that your ideas be expressible in a super-concise form, to the detriment of readability. That's not the only reason, of course, but it is a pretty big reason. I submit that most mathematical papers would benefit from having their formulas re-written in something like a programming language, with more explicit variable names and consistent notation.
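As an aside, the "consistent" functional spelling described above already exists in Python via the standard operator module; a tiny sketch:
import operator
print(2 + 3, operator.add(2, 3))    # 5 5: the infix form is just a more familiar spelling
print(4 * 5, operator.mul(4, 5))    # 20 20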
As to the role of ML and AI in all of this: These are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programming in the future is like assuming that more opcodes will greatly enhance programming in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!
That's all I have time for, for now, I may pick this up later.
No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. > That's exactly how variables work in many common languages like C Nope, not true. Bytes, words, integers and that's about it. Anything else is a pointer to a relevant data structure somewhere in memory. I think I can say this it true of the vast majority of languages. Exceptions are cases like Python and, say, Objective-C, or, in general, where the philosophy is that everything is an object. Nobody is filling buckets with anything every time there's an assignment, at least no language I have ever used. At the end of the day, it's a pointer to a chunk-o-memory with a header describing what's in there, how many, etc. In more complex cases it's a pointer to a linked list or a pointer to a chunk of memory filled with pointers to other chunks of memory. This has been the case from almost the beginning of time. Let's put it this way, I was doing this kind of thin when I was programming IMSAI's with toggle switches and keeping track of variable names, memory locations and contents on a notebook by hand, paper and pencil.
-Martin
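To make the "chunk of memory with a header" description above concrete for CPython in particular, here is a small illustrative sketch (exact sizes vary across Python versions and builds):
import sys
print(sys.getsizeof(0))     # far larger than a bare machine integer, because of the object header
print(sys.getsizeof([]))    # an empty list still carries object and bookkeeping overhead
x = 42
y = x                       # no new chunk of memory is filled; "y" is just another reference
print(x is y)               # True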
Nope, not true.
Yes indeed. It is true.
Bytes, words, integers and that's about it.
All the primary data types and structs and enums and typedefs. That leaves pointers, arrays, and... function pointers, if you don't count those as pointers. All of which misses the point completely. In C, you have to declare variables with a type (even if it's a pointer) and the program allocates memory for that variable (even if it's a pointer) to be stored in (like a bucket). If it's a basic type or struct, the memory allocated for it depends on the type. The program has to allocate memory *for the variable* based on what that variable is supposed to hold (even if it's a pointer). I really don't want to explain the mechanics of C here. I know how pointers work. You seem to be avoiding the point on purpose.
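For readers who want to see the two mental models side by side, here is a small Python sketch; the Point class is invented for the example, and copy.copy only loosely stands in for C's "fill this bucket with these bits" assignment:
import copy

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

a = Point(1, 2)
b = a                  # Python assignment: a second name for the same object
print(b is a)          # True
c = copy.copy(a)       # closer to a C struct assignment: a separate "bucket" with the same contents
print(c is a)          # False
c.x = 99
print(a.x)             # 1: the original object is untouched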
No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. That's exactly how variables work in many common languages like C
Nope, not true. Bytes, words, integers and that's about it. Anything else is a pointer to a relevant data structure somewhere in memory. I think I can say this it true of the vast majority of languages. Exceptions are cases like Python and, say, Objective-C, or, in general, where the philosophy is that everything is an object. Nobody is filling buckets with anything every time there's an assignment, at least no language I have ever used. At the end of the day, it's a pointer to a chunk-o-memory with a header describing what's in there, how many, etc. In more complex cases it's a pointer to a linked list or a pointer to a chunk of memory filled with pointers to other chunks of memory. This has been the case from almost the beginning of time. Let's put it this way, I was doing this kind of thin when I was programming IMSAI's with toggle switches and keeping track of variable names, memory locations and contents on a notebook by hand, paper and pencil.
-Martin
On Wednesday, November 6, 2019, 02:01:27 PM PST, Abe Dillon < abedillon@gmail.com> wrote:
Question: Where did APL's "←" operator come from?
Doesn't matter. If your notation can't stand on its own without a history lesson, then it's not great.
A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960's. In many ways it meant "this name is assigned to this object", to paraphrase your statement.
In "many ways"? Not exactly? How does this make it better? It still sounds counterintuitive. If it really means "this name references this object", why not a left arrow?
I mean, how does "a = 23" which is read "a is equal to 23" or "a = some_object" which is literally read "a is equal to some_object" say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?
I would say that it's not perfect notation. I read "a = 23" as "a is 23" in the same way that I would say "Abe is my name". It doesn't describe the relationship well, but it's an acceptable, pragmatic use of familiar notation and it doesn't run *counter to* the actual mechanics of the language. As far as symbols go, an arrow to the left would be closest to representing the mechanics of the language, "=" is a compromise, and "←" is backwards. The object doesn't refer to the name, the name refers to the object.
No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain.
That's exactly how variables work in many common languages like C and is actually a common misconception even among people with years of experience with Python.
Check out this Stack Overflow question <https://softwareengineering.stackexchange.com/questions/314808/why-variables-in-python-are-different-from-other-programming-languages> that asks what the difference is and the confused comments that follow saying there is no difference or even the fact that the question was closed because people didn't understand what it was asking. It's pretty bonkers to me that professional programers don't know the difference or understand a very basic question, but it's the world we live in.
In fact, one could very well make the argument that using the "=" character for this operation is misleading because the left side is not set to be equal to the right side. Even worse, these pointers in Python are inmutable.
The "=" doesn't imply immutability. That's actually what the walrus operator ":=" implies. "pi := circumference/diameter" means "pi is defined as the ratio of the circumference to the diameter".
For the most part, implying that the name equals the object it references is fine. Implying that you're somehow putting the object into the variable or attaching the object to the variable (instead of the other way around) is backwards.
Someone coming from a whole range of languages sees the "=" sign to mean something very different. For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language. So, "=" in Python is not equal to "=" in many languages. Why are we using the same symbol and creating this confusion?
For many many reasons.
If your response is something like "people learn the difference", well, you just made my point. People learn.
That's not sufficient justification. If it were, you could use that same logic to justify adding any symbol to any language. I find musical notation woefully lacking. There's no way to denote clapping or snapping fingers or the glottal stops that Regina Spector is so fond of. Maybe I should add a fish symbol, a duck symbol, and the guy walking sideways from Egyptian Hieroglyphics to the standard musical notation to represent those sounds. People will learn the difference, right?
I've had this kind of a conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally.
Oh, really? You programmed APL for 10 years?! Did you go to Yale Mr. Kavanaugh? You can cut the arguments from Authority. They're worth nothing.
Look, we don't have to agree, and, frankly, you seem to be getting rattled.
I'm genuinely curious what makes you think I'm "rattled"? I'm not.
On Wed, Nov 6, 2019 at 2:54 PM Martin Euredjian via Python-ideas < python-ideas@python.org> wrote:
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python.
I understood this just fine. I happen to think your argument in this regard is neither sound nor valid.
Question: Where did APL's "←" operator come from?
A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960's. In many ways it meant "this name is assigned to this object", to paraphrase your statement.
I mean, how does "a = 23" which is read "a is equal to 23" or "a = some_object" which is literally read "a is equal to some_object" say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?
This is no different from the concept of pointers. A pointer stores an address to some data structure somewhere. No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. It's a pointer. We are assigning a pointer. We are storing an address that points to where the data lives.
In fact, one could very well make the argument that using the "=" character for this operation is misleading because the left side is not set to be equal to the right side. Even worse, these pointers in Python are inmutable. Someone coming from a whole range of languages sees the "=" sign to mean something very different. For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language. So, "=" in Python is not equal to "=" in many languages. Why are we using the same symbol and creating this confusion?
If your response is something like "people learn the difference", well, you just made my point. People learn.
I've had this kind of a conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally. It has been my experience that people who have not had the experience rarely get it, and, sadly, more often than not, they become hostile to the implication that there might actually be a better way to translate ideas into computer executable code. That's just reality and I am not going to change it.
Look, we don't have to agree, and, frankly, you seem to be getting rattled. I want no part of that. I didn't come here to change the Python universe. Like I said, I am nobody, so, yeah, forget it. Don't waste your time on me or my ridiculous ideas. I just wanted to share an opinion, worthless as it might be.
Thanks,
-Martin
On Wednesday, November 6, 2019, 12:18:21 PM PST, Abe Dillon < abedillon@gmail.com> wrote:
I used APL professionally for about ten years.
Yes, you've stated that already.
None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is a simple as that, this is how our brains work. That's how you recognize the letter "A" as to correspond to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python: variables are names assigned to objects *not* buckets that objects are stored in. Using a notation that implies that objects are assigned to variables encourages a broken understanding of Python's mechanics.
A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is a simple as that, this is how our brains work. That's how you recognize the letter "A" as to correspond to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The fact that people learn and then become comfortable with symbols doesn't imply that choosing which symbols to adopt into a language is trivial. You can follow the evolution of languages over time and find that they often eject characters that serve little use or cause confusion like the english character "thorn" <https://en.wikipedia.org/wiki/Thorn_(letter)>.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
You don't need to convince people of the power of abstraction or the utility of domain-specific languages. Such a general statement doesn't support the adoption of any specific change. You might as well be advocating for adding Egyptian hieroglyphics to musical notation. We don't need a lecture on the importance of abstract notation each time a new syntax is proposed.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
I strongly disagree with this. First of all, mathematical notation which programming borrows heavily from, highly favors compaction over clarity. It uses greek and latin symbols that mean different things depending on the field. It uses both left and right super and sub-scripts sometimes for naming conventions, sometimes to denote exponentiation. It uses dots and hats and expressions that sit below and/or above symbols (like in "limit" notation or summations) and all sorts of other orientations and symbol modifications that are almost impossible to look up, infix and prefix and postfix notation. It makes picking up any given mathematical paper a chore to comprehend because so much context is assumed and not readily accessible.
Why not use a more consistent notation like add(x, y) instead of x + y when we know addition is a function and all other functions (usually) follow the f(x, y) notation? Because math is old. It predates the printing press and other tools that make more explicit and readable notation possible. It was much more important hundreds of years ago, that your ideas be expressible in a super-concise form to the detriment of readability. That's not the only reason, of course, but it is a pretty big reason. I submit that most mathematical papers would benefit from having their formulas re-written in something like a programming language with more explicit variable names and consistent notation.
As to the role of ML and AI in all of this: These are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programing in the future is like assuming that more opcodes will greatly enhance programing in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!
That's all I have time for, for now, I may pick this up later.
On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas < python-ideas@python.org> wrote:
Thanks for your feedback. A few comments:
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just label to the value
I used APL professionally for about ten years. None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is a simple as that, this is how our brains work. That's how you recognize the letter "A" as to correspond to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never rally think about when using any other language. Watch the videos I link to below for a taste of these ideas.
Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter " ←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
The other interesting thing about notation is that it transcends language. So far all conventional programming languages have been rooted in English. I would argue there is no need for this when a programming notation, just like mathematical and musical notations have demonstrated that they transcend spoken languages. Notation isn't just a tool for thought, it adds a universal element that is impossible to achieve in any other way.
Anyhow, again, I am not going to change a thing. I am nobody in the Python world. Just thought it would be interesting to share this perspective because I truly think this was a missed opportunity. If elegance is of any importance, having two assignment operators when one can do the job, as well as evolve the language in the direction of an exciting and interesting new path is, at the very least, inelegant. I can only ascribe this to very few people involved in this process, if any, any real experience with APL. One has to use APL for real work and for at least a year or two in order for your brain to make the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.
I encourage everyone to read this Turing Award presentation:
"Notation as a Tool of Thought" by Ken Iverson, creator of APL http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf
Also, if you haven't seen it, these videos is very much worth watching:
Conway's Game of Life in APL https://www.youtube.com/watch?v=a9xAKttWgP4
Suduku solver in APL https://www.youtube.com/watch?v=DmT80OseAGs
-Martin
On Tuesday, November 5, 2019, 11:54:45 PM PST, Richard Musil < risa2000x@gmail.com> wrote:
On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas < python-ideas@python.org> wrote:
In other words, these two things would have been equivalent in Python:
a ← 23
a = 23
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just label to the value, I can imagine "let 'a' point to the value of 23 now" and write it this way: "a --> 23", but "a <-- 23" does give an impression that 23 points to, or is somehow fed into, 'a'. This may give false expectations to those who are coming to Python from another language and might expect the "l-value" behavior in Python.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
Richard _______________________________________________ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-leave@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/VAGY7O... Code of Conduct: http://python.org/psf/codeofconduct/
_______________________________________________ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-leave@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/CHGSPX... Code of Conduct: http://python.org/psf/codeofconduct/
_______________________________________________ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-leave@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/BLMGMI... Code of Conduct: http://python.org/psf/codeofconduct/
On Nov 7, 2019, at 01:04, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain. That's exactly how variables work in many common languages like C
Nope, not true. Bytes, words, integers and that's about it. Anything else is a pointer to a relevant data structure somewhere in memory. I think I can say this is true of the vast majority of languages. Exceptions are cases like Python and, say, Objective-C, or, in general, languages where the philosophy is that everything is an object. Nobody is filling buckets with anything every time there's an assignment, at least in no language I have ever used.
Almost every part of this is seriously wrong. In C, filling a struct-shaped bucket with a struct-shaped value is exactly what happens when you initialize, assign to, or pass an argument to a struct-typed variable. Early C severely restricted when you could do such things, but that’s always how it worked when it was allowed, and in modern C it’s allowed almost everywhere you’d want it to be. If you want to pass around a pointer in C, you have to be explicit on both ends—use a pointer-to-struct-typed variable, and use the & operator on the value (and then you have to use the * operator to do anything with the pointer variable). And the right way to think about it is not reference semantics, but filling a pointer-shaped bucket with a pointer-shaped value. (Or, actually, not just “pointer” but specifically “pointer to some specific type”; you can’t portably stick a pointer-to-nullary-int-function value in a pointer-to-double value.)

And this is key to the whole notion of lvalue semantics, where variables (among other things) are typed buckets with identity for storing values (that are just bit patterns), as opposed to namespace semantics, where variables (among other things) are just names for typed values with identity that live wherever they want to live. After `a=b` in a namespace language, `a is b` is true, because a and b are two names for the one value in one location; in an lvalue language, there usually is no such operator, but if there were, `a is b` would still be false, because a and b are two distinct locations that contain distinct copies of the same value, because what `a=b` means is filling the a bucket with the value in the b bucket. (In fact, in C, even `a==b` might not be true if, say, `a` is a uint16 variable and `b` is a uint32 variable holding a value over 65535.)

And it’s not like lvalue semantics was a dead-end bad idea from C that other languages abandoned. C++ took lvalue semantics much further than C, and built on struct assignment with notions like assignment operators, that let your type T customize how to fill a T-shaped bucket with whatever kind of value you want. C# and Swift built even further on that by syntactically distinguishing value types (that act like C structs) and reference types (that act sort of like Python objects). Meanwhile, while most “everything is an object” languages like Python, Smalltalk, and Ruby (but not Objective-C, which is about as far from everything-is-an-object as possible) use namespace semantics, they’re hardly the only languages that do; so do, for example, plenty of impure functional languages that aren’t at all OO.

Meanwhile, “bytes, words, integers, and… anything else is an [implicit] pointer” is a kind of clunky manual optimization used in various 80s languages, and borrowed from there into Java, but much less common in newer languages. It’s hardly a universal across languages, much less some kind of a priori necessity.
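A small Python sketch of the namespace-semantics half of that contrast; the C behaviour described above can only be hinted at in the comments (the Point class is made up for illustration):

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

    b = Point(1, 2)
    a = b              # namespace semantics: two names, one object
    print(a is b)      # True
    a.x = 99
    print(b.x)         # 99 -- visible through both names

    # In an lvalue language such as C, assigning one struct variable to another
    # copies the contents into separate storage, and the two stay independent.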
On Nov 6, 2019, at 21:53, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
I've had this kind of a conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally. It has been my experience that people who have not had the experience rarely get it, and, sadly, more often than not, they become hostile to the implication that there might actually be a better way to translate ideas into computer executable code.
I haven’t used APL professionally, but I’ve heard Iverson talk about it, and read articles that are supposed to sell me on it, and I think I get the point of it. And the point is that it gets you about halfway to where Haskell does and then leaves you there.

In traditional languages you have to write loops and then turn your functions inside out. In Python/R/Mathematica/C++/etc. you can do things array-wise, but only the things the language/library designer thought of. (Well, you can use vectorize/frompyfunc/etc., which does have the right semantics, but it doesn’t look as friendly as the stuff NumPy comes with, and it’s slow.) GLSL has different limitations, and you often do have to think about the parallel “virtual loops” to do anything nontrivial. APL doesn’t have either of those problems; you can—and are actively encouraged to—think about how to combine operations independently from how to apply the combined operation. But Haskell encourages that too—and further abstraction beyond it. Of course like J, its operators are strings of ASCII symbols (or identifiers in backticks), but that doesn’t change what the abstractions are, just how they’re spelled.

For example, it’s cool that in APL I can lift + to sum just by writing +/, so I can sum an array of ints with +/x. But what if x is an array of bigints or fractions or decimal64s? As far as I know, there’s no way to implement such things in a way that + works on them. I know there’s a box operator that acts like an array by reading from stdin, but can I write something similar that reads from an open text file, parses it as CSV, parses the third column of each row as an int, and passes those ints to +/ too? What if I want to sum an array of maybe ints, or a maybe array of ints, and get back a maybe int? What if I want to pass around + or +/ or / as first-class objects? Can I even apply an operator to an operator, or do I only get second-order rather than arbitrarily-higher-order functionality? Can I write a new operator that unlifts and relifts any function in a way that turns this one from sum to running-sums? Is there something about all of those limitations that crystallizes your thinking differently than having a more abstract and less restricted version of the same abstractions? That’s not inconceivable, but it doesn’t seem likely a priori, and I’ve never heard an argument for it, and none of the examples APL fans have shown me have convinced me.

And here’s the thing: sometimes Haskell is the right tool for the job, and even when it isn’t, learning it made me a better programmer even when I’m using other languages—but that doesn’t mean I want Python to be more like Haskell. (Well, maybe occasionally in a few minor ways, but I don’t want Python to be static-type-driven, or to be pure immutable, or to curry all functions and encourage point-free style, or to be lazy and make me trust an optimizer to figure out when to reify values, or to let me define hundreds of arbitrary operators, etc.) So, what’s different about APL that you want Python to be more like APL?
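For what it's worth, the kind of lifting being described is easy to sketch in Python, where the same "+" reduction works regardless of element type or data source; "data.csv" below is a hypothetical file used only for illustration:

    import csv
    from decimal import Decimal
    from fractions import Fraction
    from functools import reduce
    from operator import add

    print(sum([1, 2, 3]))                         # ints
    print(sum([Fraction(1, 3), Fraction(2, 3)]))  # fractions -> 1
    print(sum([Decimal("0.1")] * 3))              # decimals -> 0.3

    def third_column(path):
        # Lazily yield the third column of a CSV file as ints.
        with open(path, newline="") as f:
            for row in csv.reader(f):
                yield int(row[2])

    # The same "+" reduction applies to that stream as well:
    # total = reduce(add, third_column("data.csv"), 0)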
On Thu, Nov 7, 2019 at 7:21 AM Abe Dillon <abedillon@gmail.com> wrote:
None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python, variables are names assigned to objects, *not* buckets that objects are stored in. Using a notation that implies that objects are assigned to variables encourages a broken understanding of Python's mechanics.
For the record, I actually don't think that this is much of a problem. We have notions of "assignment" and "equality" that don't always have the exact same meanings in all contexts, and our brains cope. The sense in which 2/3 is equal to 4/6 is not the same as the one in which x is equal to 7, and whether or not you accept that the (infinite) sum of all powers of two is "equal to" -1 depends on your interpretation of equality. You might think that after an assignment, the variable is equal to that value - but you can say "x = x + 1" and then x will not be equal to x + 1, because assignment is temporal in nature. People will figure this out regardless of the exact spelling of either equality or assignment.
As to the role of ML and AI in all of this: These are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programming in the future is like assuming that more opcodes will greatly enhance programming in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!
Agreed. I would suggest, though, that this isn't going to be anything new. It's just a progression that we're already seeing - that programming languages are becoming more abstract, more distant from the bare metal of execution. Imagine a future in which we dictate to a computer what we want it to do, and then it figures out (via AI) how to do it... and now imagine what a present day C compiler does to figure out what sequence of machine language instructions will achieve the programmer's stated intention. Here's a great talk discussing the nature of JavaScript in this way: https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript AI/ML might well be employed *already* to implement certain optimizations. I wouldn't even know; all I know is that, when I ask the computer to do something, it does it. That's really all that matters! ChrisA
Abe Dillon wrote:
Why not use a more consistent notation like add(x, y) instead of x + y when we know addition is a function and all other functions (usually) follow the f(x, y) notation? Because math is old.
No, it's because infix notation is *more readable* than function notation when formulas become complex. It has persisted because it works, not because mathematicians are stuck in their ways. Having said that, it's relatively rare that mathematicians make up entirely new symbols -- they're more likely to repurpose existing ones. E.g. "+" is used for addition-like operations on a very wide variety of things -- numbers, vectors, matrices, tensors, quantum states, etc. etc. Mathematics is quite Python-like in that way. -- Greg
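Python already behaves the way Greg describes mathematics behaving: "+" is reused for anything that defines __add__. A quick sketch (the Vector class is invented for the example):

    class Vector:
        def __init__(self, *xs):
            self.xs = xs
        def __add__(self, other):
            return Vector(*(a + b for a, b in zip(self.xs, other.xs)))
        def __repr__(self):
            return f"Vector{self.xs}"

    print(2 + 3)                        # numbers
    print("ab" + "cd")                  # strings
    print([1] + [2, 3])                 # lists
    print(Vector(1, 2) + Vector(3, 4))  # Vector(4, 6)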
Why not use a more consistent notation like add(x, y) instead of x + y when we know addition is a function and all other functions (usually) follow the f(x, y) notation? Because math is old.

No, it's because infix notation is *more readable* than function notation when formulas become complex.

I think addition was a bad example of inconsistent notation, but I did hedge that statement with: "That's not the only reason, of course, but it is a pretty big reason." I don't disagree that infix notation is more readable because humans have trouble balancing brackets visually. However, I maintain that readability doesn't seem to be the main concern of math notation. The main concern of math notation seems to be limiting ink or chalk use at the expense of nearly all else (especially readability). Why is exponentiation or log not infixed? Why so many different ways to represent division or differentiation?

It has persisted because it works, not because mathematicians are stuck in their ways.

Something persisting because it works does not imply any sort of optimality. A good way to test this is to find a paper with heavy use of esoteric math notation and translate that notation to code. I think you'll find the code more accessible. I think you'll find that even though it takes up significantly more characters, it reads much quicker than a dense array of symbols. I spent a good few weeks trying to make sense of the rather short book "Universal Artificial Intelligence" by Marcus Hutter because he relies so heavily on symbolic notation. Now that I grasp it, I could explain it much more clearly in much less time to someone with much less background than I had going into the book.
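As a toy example of that kind of translation (not taken from Hutter's book), the summation notation for the sum of the first n squares reads in Python as:

    def sum_of_squares(n):
        # The sum, for i from 1 to n, of i squared.
        return sum(i * i for i in range(1, n + 1))

    print(sum_of_squares(4))  # 1 + 4 + 9 + 16 = 30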
Abe Dillon wrote:
I don't disagree that infix notation is more readable because humans have trouble balancing brackets visually.
I don't think it's just about brackets, it's more about keeping related things together. An expression such as b**2 - 4*a*c can be written unambiguously without brackets in a variety of less-readable ways, e.g.

- ** b 2 * 4 * a c

That uses the same number of tokens, but I think most people would agree that it's a lot harder to read.
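The same comparison can be made in Python with the standard operator module: the infix form keeps related things together, while the fully prefix, function-call form scatters them:

    from operator import mul, pow, sub

    a, b, c = 1.0, 5.0, 6.0

    d_infix = b**2 - 4*a*c
    d_prefix = sub(pow(b, 2), mul(4, mul(a, c)))

    print(d_infix, d_prefix)  # 1.0 1.0 -- same value, very different readability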
The main concern of math notation seems to be limiting ink or chalk use at the expense of nearly all else (especially readability).
Citation needed, that's not obvious to me at all. Used judiciously, compactness aids readability because it allows you to see more at once. I say "judiciously" because it's possible to take it too far -- regexes are a glaring example of that. Somewhere there's a sweet spot where you present enough information in a well-organised way in a small enough space to be able to see it all at once, but not so much that it becomes overwhelming.
Why is exponentiation or log not infixed? Why so many different ways to represent division or differentiation?
Probably the answer is mostly "historical reasons", but I think there are good reasons for some of those things persisting. It's useful to have addition, multiplication and exponentiation all look different from each other. During my Lisp phase, what bothered me more than the parentheses (I didn't really mind those) was that everything looked so bland and uniform -- there were no visual landmarks for the eye to latch onto.

Log not infix -- it's pretty rare to use more than one base for logs in a given expression, so it makes sense for the base to be implied or relegated to a less intrusive position. I can only think of a couple of ways mathematicians represent division (÷ is only used in primary school in my experience) and one of them (negative powers) is kind of unavoidable once you generalise exponentiation beyond positive integers.

I'll grant that there are probably more ways to represent differentiation than we really need. But I think we would lose something if we were restricted to just one. Newton's notation is great for when you're thinking of functions as first-class objects. But Leibniz makes the chain rule blindingly obvious, and when you're solving differential equations by separating variables it's really useful to be able to move differentials around independently. Other notations have their own advantages in their own niches.
Something persisting because it works does not imply any sort of optimality.
True, but equally, something having been around for a long time doesn't automatically mean it's out of date and needs to be replaced. Things need to be judged on their merits.
A good way to test this is to find a paper with heavy use of esoteric math notation and translate that notation to code. I think you'll find the code more accessible. I think you'll find that even though it takes up significantly more characters, it reads much quicker than a dense array of symbols.
In my experience, mathematics texts are easiest to read when the equations are interspersed with a good amount of explanatory prose, using words rather than abbreviations. But I wouldn't want the equations themselves to be written more verbosely. Nor would I want the prose to be written in a programming language. Perhaps ironically, the informality of the prose makes it easier to take in.
I spent a good few weeks trying to make sense of the rather short book "Universal Artificial Intelligence" by Marcus Hutter because he relies so heavily on symbolic notation. Now that I grasp it, I could explain it much more clearly in much less time to someone with much less background than I had going in to the book.
But your explanation would be in English, not a formal language, right? -- Greg
On 2019-11-06 12:05 p.m., Martin Euredjian via Python-ideas wrote:
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
I'm not nearly qualified enough to speak to the rest of your message, but I should point out that I still have no idea how to type that arrow. The keystroke for [ already includes the Alt key and does nothing in NARS2000.
Alex
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup.
This distinction between notation and soup seems pretty subjective. What is the difference between soup and notation? In my mind it has a lot to do with familiarity. I watched that video about programming Conway's Game of Life in APL and it looks like an incomprehensible soup of symbols to me.

Another example of ASCII soup is regex.

That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read and I have to crack open a reference because I can never remember the notation. Luckily someone came up with a solution called verbal expressions <https://github.com/VerbalExpressions/PythonVerbalExpressions> which trades hard-to-remember symbols for easy-to-understand words! (though I think the Python implementation smacks of Java idioms) I'm sure there are people who work with regular expressions on such a regular basis that they've become fluent, but when you require such deep immersion in the language before the symbols make sense to you, it's a huge barrier to entry. You can't then act all confused about why your favorite language never caught on.

Without real notation one introduces a huge cognitive load.

What is "real notation"? This sounds like a no-true-Scotsman fallacy <https://en.wikipedia.org/wiki/No_true_Scotsman>. Has everyone on this message board been communicating with fake ASCII notation this entire time? Cognitive load can come from many different places, like:

1. Having to remember complex key combinations just to get your thoughts into code
2. Having to memorize what each of thousands of symbols does because there's no way to look them up in a search engine
3. Knowing no other notation system that even slightly resembles APL. I mean, I know some esoteric mathematics, but I've never seen anything that looks even remotely like: life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}

A big part of Python's philosophy is that you read code way more often than you write code, so we should optimize readability. As one Reddit commenter put it <https://www.reddit.com/r/programming/comments/1vlbhx/sudoku_solving_program_in_apl_that_blows_my_mind/>:

APL is such a powerful language. APL is also a powerfully write-only language.

And I don't even fully agree there, because it somehow manages to be almost as difficult to write.

Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again.

For some people. I, myself, have a learning disability and often need to look at my keyboard. The relationship between "←" and "[" doesn't seem obvious at all.

If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.

You know what's even simpler and requires even less cognitive load? Typing ASCII characters...

The other interesting thing about notation is that it transcends language.

The word "notation" refers to symbols, abbreviations, and short-hand that make up domain-specific languages. Nothing about notation "transcends" language; notation is a component of language. Math is the study of patterns. Mathematical notation is what we use to write the language of patterns, to describe different patterns and communicate ideas about patterns. There used to be different mathematical languages based on culture, just like spoken languages. There's no magical property that made Roman numerals or Arabic numerals just make sense to people from other cultures; they had to learn each other's notation just like any other language and eventually settled on Arabic numerals. Maybe things would have gone differently if the Mayans had a say.

It has been my experience that people who have not had the experience rarely get it

A pattern I've seen in my experience is that some person or group will put forth a pretty good idea, and others become dogmatic about that idea, lose sight of pragmatism, and try to push the idea beyond its practical applicability. I'm not saying this is you. I haven't yet read the Ken Iverson paper (I will). My suspicion at this point, after seeing the APL code demos, is that there's probably plenty of good ideas in there, but APL doesn't strike me as pragmatic in any sense.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Thanks for your feedback. A few comments:
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value
I used APL professionally for about ten years. None of your objections ring true. A simple example is had from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language. Watch the videos I link to below for a taste of these ideas.
Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
On Thu, Nov 7, 2019 at 12:35 PM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Another example of ASCII soup is regex. That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read
That's it! You got it! The difference is that regex looks like )(*&)(*&)(*^)(*&^ which means nothing. Your brain has a mapping for what these symbols mean already. Ascribing new meaning to a new mish-mash of them breaks that mental mapping and model, which means that it requires 50 or 100 times the cognitive load to process and comprehend. You are forced to keep a truly unnatural mental stack in your head as you parse these infernal combinations of seemingly random ASCII to figure out their meaning.
Notation changes that, if anything for one simple reason: It establishes new patterns, with punctuation and rhythm and your brain can grok that. Don't forget that our brains have evolved amazing pattern matching capabilities, symbols, notation, take advantage of that, hence the deep and wide history of humanity using symbols to communicate. Symbols are everywhere, from the icons on your computer and phone to the dashboard of your car, signs on the road, math, music, etc.
The asterisk is commonly interpreted to mean multiplication:
>>> 3 * 5
15
>>> "abc" * 4
'abcabcabcabc'
In a regex, it has broadly the same meaning. It allows any number of what came before it. That's broadly similar to multiplication. How is that somehow "not notation", yet you can define arbitrary symbols to have arbitrary meanings and it is "notation"? What's the distinction? ChrisA
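Chris's parallel, sketched with the standard re module:

    import re

    print("ab" * 3)                          # 'ababab' -- repetition via multiplication
    print(re.fullmatch(r"ab*", "abbbb"))     # matches: 'b' repeated any number of times
    print(re.fullmatch(r"(ab)*", "ababab"))  # matches: the group repeated any number of times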
In that context something like the Conway Game of Life in APL demo should inspire an interested party in exploring further. None of the tools he uses in the demo are difficult to comprehend, particularly if you have a background in basic Linear Algebra (another foundational element of APL).
My reaction was "burn it with fire!". It looks like a programming language designed around code golf <https://en.wikipedia.org/wiki/Code_golf>. I've seen enough code in that vein to hate it with a passion. I could tell that you thought the Game of Life video would win some favor; I can only speak for myself, but I suspect it's had the opposite of the intended effect.

The difference is that regex looks like )(*&)(*&)(*^)(*&^ which means nothing. Your brain has a mapping for what these symbols mean already. Ascribing new meaning to a new mish-mash of them breaks that mental mapping and model

I don't think my brain has much of a model for what * and ^ mean. I don't think it's an over-loading problem. Again, verbal expressions make much more sense to me even though they use letters that are used everywhere. Even though they require ~10x the number of characters, I can read verbal expressions 50x faster.
What is "real notation"?

Maybe the right term is "domain specific notation". I suspect you know very well what I mean and are simply enjoying giving me a hard time.
No. It was a 100% legit question. I think our perspectives are far out of sync if you think that was a facetious question.
APL is such a powerful language. APL is also a powerfully write-only language.

Easy answer: That Reddit commenter is simply ignorant. This is silly.
It's the same impression I got from watching the videos you linked and trying to learn a bit of APL. From looking around, it doesn't seem like an unpopular view. You can call everyone ignorant all you want. Python programmers could just as easily ignore all the people that talk about how slow Python is. It would be a detriment to the community to stick their heads in the sand and ignore detractors. If it makes sense to you and a tiny fraction of the world and looks like complete madness to everyone else, then maybe your appraisal of it being super easy to understand and pick up and read and write isn't universal. I think you mentioned having a physics background? I could see how a language that adopts a bunch of super-esoteric mathematical notation would appeal to you. I'm sure the Navier-Stokes equations map quite nicely to APL. When I see that upside-down capital "L" symbol, my brain NOPEs out.

Look, APL is, for all intents and purposes, dead for general usage today. Yet both IBM and Dyalog sell high end interpreters, with Dyalog getting $2,500 PER YEAR (I believe IBM is similarly priced). https://www.dyalog.com/prices-and-licences.htm#devlicprice https://www.ibm.com/us-en/marketplace/apl2 So, clearly this would not exist if the language was useless or if it was "write-only" as that genius on Reddit opined.
It absolutely would if some of your code base relies on some indecipherable APL that someone wrote back in the 70s and nobody wants to touch out of fear that a nuclear sub will sink somewhere. Otherwise, someone would have translated that bit of code decades ago and been done with it. On Wed, Nov 6, 2019 at 7:33 PM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
This distinction between notation and soup seems pretty subjective.
Yes and no. Math looks like hieroglyphics to most people. We are talking about professional programmers here. In that context something like the Conway Game of Life in APL demo should inspire an interested party in exploring further. None of the tools he uses in the demo are difficult to comprehend, particularly if you have a background in basic Linear Algebra (another foundational element of APL).
It's like learning a language that does not use Latin script, say, Hebrew or Greek. At first nothing makes sense, yet it doesn't take very long for someone to recognize the characters, attach them to sounds and then make words, badly at first and better with time. Note that I am not proposing a complete APL-ization of Python. My only observation was that the judicious introduction of a single new symbol for assignment would solve the problem that now requires "=" and ":=". This is far more elegant. You don't break old code --ever-- even when you phase out the use of "=" in future versions because replacing "=" with the new symbol is an elementary process.
Note that I am not proposing a complete APL-ization of Python. My only observation was that the judicious introduction of a single new symbol for assignment would solve the problem that now requires "=" and ":=". This is far more elegant. You don't break old code --ever-- even when you phase out the use of "=" in future versions because replacing "=" with the new symbol is an elementary process.
Another example of ASCII soup is regex. That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read
That's it! You got it! The difference is that regex looks like )(*&)(*&)(*^)(*&^ which means nothing. Your brain has a mapping for what these symbols mean already. Ascribing new meaning to a new mish-mash of them breaks that mental mapping and model, which means that it requires 50 or 100 times the cognitive load to process and comprehend. You are forced to keep a truly unnatural mental stack in your head as you parse these infernal combinations of seemingly random ASCII to figure out their meaning.
Notation changes that, if anything for one simple reason: It establishes new patterns, with punctuation and rhythm and your brain can grok that. Don't forget that our brains have evolved amazing pattern matching capabilities, symbols, notation, take advantage of that, hence the deep and wide history of humanity using symbols to communicate. Symbols are everywhere, from the icons on your computer and phone to the dashboard of your car, signs on the road, math, music, etc.
What is "real notation"
Maybe the right term is "domain specific notation". I suspect you know very well what I mean and are simply enjoying giving me a hard time. No problem. Thick skin on this side.
APL is such a powerful language. APL is also a powerfully write-only language.
Easy answer: That Reddit commenter is simply ignorant. This is silly.
APL doesn't strike me as pragmatic in any sense.
Look, APL is, for all intents and purposes, dead for general usage today. Yet both IBM and Dyalog sell high end interpreters, with Dyalog getting $2,500 PER YEAR (I believe IBM is similarly priced).
https://www.dyalog.com/prices-and-licences.htm#devlicprice
https://www.ibm.com/us-en/marketplace/apl2
So, clearly this would not exist if the language was useless or if it was "write-only" as that genius on Reddit opined.
That said, outside of certain application domains I would not recommend anyone consider using APL. The language, as I said before, was ahead of its time and the people behind it truly sucked at marketing and expanding its popularity, for more reasons than I care to recount here. I was very involved in this community in the 80's. I knew Ken Iverson and the other top fliers in the domain. I even published a paper back in '85, along with a presentation at an ACM/APL conference. And still, I would say, no, not something anyone should use today. Learn? Yes, absolutely, definitely. It's an eye opener but not much more than that.
Real APL development stopped a long time ago. Maybe one day someone with the right ideas will invent NextGenerationAPL or something like that and give it a place in computing.
That is not to say that some of the concepts in APL have no place in other languages or in computing. For example, list comprehensions in Python have a very close link to the way things are done in APL. They almost feel like APL constructs to someone with experience in the language.
Here's another interesting APL resource that serious practitioners have used for decades (with some memorizing useful idioms):
https://aplwiki.com/FinnAplIdiomLibrary
Yes, if you program APL professionally you can read this and it does not look like ASCII soup.
For example, this gives a descending (largest to smallest) ordering of a (strictly speaking, "⍒" is "grade down": it returns the indices that would arrange a from largest to smallest, and for a ← ⍳20 those indices happen to equal the sorted values themselves):
b←⍒a
      a ← ⍳20
      a
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
      b←⍒a
      b
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
I just showed you another symbol, "⍳", the index generator, which is loosely equivalent to range() in Python.
So "⍳20" and range(1,21) generate similar results. In APL parlance, it's a vector.
The difference is that I can then shape this vector into multidimensional arrays, for example, a 5x5 matrix:
      5 5 ⍴ a
 1  2  3  4  5
 6  7  8  9 10
11 12 13 14 15
16 17 18 19 20
 1  2  3  4  5
That's a new symbol, "⍴" or "rho", the "reshape" operator. In this case it takes the vector a and reshapes it into a 5x5 matrix (recycling the data to fill all 25 slots).
Of course, I can assign this matrix to a new variable if I want to:
c ← 5 5 ⍴ a
And now, if I want to sum the values across each row (horizontally) I simply do this:
      +/c
15 40 65 90 15
Or vertically:
      +⌿c
35 40 45 50 55
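Again, purely as a loose Python analogue (a sketch, not an exact equivalent), NumPy expresses the same reshape-and-sum idea with an axis argument; note that np.resize, unlike reshape, recycles the data the way ⍴ does:

    import numpy as np

    a = np.arange(1, 21)
    c = np.resize(a, (5, 5))   # like 5 5 ⍴ a : recycle the 20 values to fill 25 slots
    c.sum(axis=1)              # like +/c : row sums    -> 15 40 65 90 15
    c.sum(axis=0)              # like +⌿c : column sums -> 35 40 45 50 55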
You can just as easily reshape a into a three-dimensional array (two matrices; conceptually a lamination of matrices, a stack of cards with one matrix written on each card):
      2 3 4 ⍴ a
 1  2  3  4
 5  6  7  8
 9 10 11 12

13 14 15 16
17 18 19 20
 1  2  3  4
And, of course, you can sum the rows of this structure, which in APL is known as a "tensor":
      +/2 3 4 ⍴ a
10 26 42
58 74 10
This time I did it without assigning to an intermediate variable; APL executes from right to left unless there are parentheses.
So, I could do the entire thing in one shot. Generate a sequence of numbers from 1 to N, reshape them into a tensor, sum across each row and then take that result and sum across each column.
Here it is, with N = 200:
      +⌿ +/ 2 3 4 ⍴ ⍳ 200
68 100 132
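The same pipeline in rough NumPy form (again just a sketch for comparison; Python reads left to right, so the steps appear in the opposite order from the APL line):

    import numpy as np

    t = np.resize(np.arange(1, 201), (2, 3, 4))  # like 2 3 4 ⍴ ⍳ 200 : only the first 24 values are used
    t.sum(axis=2).sum(axis=0)                    # like +⌿ +/ ... -> 68 100 132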
Of course, there are other things you can do. For example, take a 5 x 5 reshape (a matrix) of the first 25 numbers and then transpose the matrix (swap rows and columns):
      5 5 ⍴ ⍳ 25
 1  2  3  4  5
 6  7  8  9 10
11 12 13 14 15
16 17 18 19 20
21 22 23 24 25
      ⍉ 5 5 ⍴ ⍳ 25
 1  6 11 16 21
 2  7 12 17 22
 3  8 13 18 23
 4  9 14 19 24
 5 10 15 20 25
That introduces a new symbol, "⍉", the transpose. I suspect that if you read through what I wrote above you already understood what the first statement did. It really is that simple. You can also compare an entire array against a single value in one step; here, a 1 marks every element that is less than 15:
      15 > ⍉ 5 5 ⍴ ⍳ 25
1 1 1 0 0
1 1 1 0 0
1 1 1 0 0
1 1 1 0 0
1 1 0 0 0
It returns a new matrix with a 1 anywhere the desired condition is met.
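For comparison, a minimal NumPy sketch of the transpose and the element-wise comparison (loose, not exact):

    import numpy as np

    m = np.arange(1, 26).reshape(5, 5)  # like 5 5 ⍴ ⍳ 25 (exactly 25 values, so plain reshape works)
    m.T                                 # like ⍉ : transpose
    15 > m.T                            # boolean matrix, True wherever the element is less than 15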
The utility of this is easier to see if I use a variable stuffed with random numbers:
      a ← 25?100
      a
90 6 10 99 62 15 52 32 98 13 7 100 64 58 16 29 67 56 53 28 96 27 59 30 18
a is a vector of 25 elements, each a randomly chosen value between 1 and 100. We can reshape this into anything we want, but for simplicity's sake I'll leave it as a vector.
I want to find and extract values less than, say, 25.
This first expression generates a new boolean vector with a 1 anywhere the condition is met:
      a < 25
0 1 1 0 0 1 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1
To extract the values I can use the boolean vector above to "compress" the original vector (keep only the elements where the mask is 1), like this:
      (a < 25)/a
6 10 15 13 7 16 18
And if I want, I can use a product reduction to multiply them all together:
      ×/ (a < 25)/a
23587200
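And, for the Python-minded, the same filter-and-multiply with NumPy boolean masks (a rough sketch; rng.choice without replacement is the closest thing I know to dyadic "?"):

    import numpy as np

    rng = np.random.default_rng()
    a = rng.choice(np.arange(1, 101), size=25, replace=False)  # like 25?100 : 25 distinct values from 1..100
    mask = a < 25        # like a < 25 : boolean vector
    a[mask]              # like (a < 25)/a : keep only the values where the mask is true
    a[mask].prod()       # like ×/ (a < 25)/a : multiply them together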
Anyhow, that's a microscopic taste of APL with a few very simple examples. I would venture a guess that, if you actually followed the examples, you got comfortable with the notation and nomenclature I introduced. There's a lot more, of course, and this is how one learns it, incrementally.
BTW, my last large APL project was the software used to run a high-speed DNA sequencing machine researchers used during the race to decode the human genome.
Thanks,
-Martin
On Wednesday, November 6, 2019, 04:12:19 PM PST, Abe Dillon <abedillon@gmail.com> wrote:
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup.
This distinction between notation and soup seems pretty subjective. What is the difference between soup and notation? In my mind it has a lot to do with familiarity. I watched that video about programming Conway's Game of Life in APL and it looks like an incomprehensible soup of symbols to me.
Another example of ASCII soup is regex.
That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read and I have to crack open a reference because I can never remember the notation. Luckily someone came up with a solution called verbal expressions <https://github.com/VerbalExpressions/PythonVerbalExpressions> which trades hard-to-remember symbols for easy-to-understand words! (though I think the Python implementation smacks of Java idioms)
I'm sure there are people who work with regular expressions on such a regular basis that they've become fluent, but when you require such deep immersion in the language before the symbols make sense to you, it's a huge barrier to entry. You can't then act all confused about why your favorite language never caught on.
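For what it's worth, the standard library already offers a middle ground between raw regex soup and a whole wrapper library: re.VERBOSE lets you spread a pattern out with whitespace and comments. A small sketch of my own (not taken from the verbal expressions project):

    import re

    # Match an ISO-style date such as "2019-11-06", written so it can be read later.
    date_pattern = re.compile(r"""
        (?P<year>\d{4})    # four-digit year
        -
        (?P<month>\d{2})   # two-digit month
        -
        (?P<day>\d{2})     # two-digit day
    """, re.VERBOSE)

    match = date_pattern.search("Sent on 2019-11-06")
    if match:
        print(match.group("year"), match.group("month"), match.group("day"))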
Without real notation one introduces a huge cognitive load.
What is "real notation". This sounds like a no-true-Scotsman fallacy <https://en.wikipedia.org/wiki/No_true_Scotsman>. Has everyone on this message board been communicating with fake ASCII notation this entire time? Cognative load can come from many different places like:
1. Having to remember complex key combinations just to get your thoughts into code
2. Having to memorize what each of thousands of symbols does because there's no way to look them up in a search engine
3. Knowing no other notation system that even slightly resembles APL. I mean, I know some esoteric mathematics, but I've never seen anything that looks even remotely like: life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}
A big part of Python's philosophy is that you read code way more often than you write code, so we should optimize readability. As one Reddit commenter put it <https://www.reddit.com/r/programming/comments/1vlbhx/sudoku_solving_program_in_apl_that_blows_my_mind/>:
APL is such a powerful language. APL is also a powerfully write-only language.
And I don't even fully agree there because it somehow manages to be almost as difficult to write.
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again.
For some people. I, myself, have a learning disability and often need to look at my keyboard. The relationship between "←" and "[" doesn't seem obvious at all.
If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
You know what's even simpler and requires even less cognitive load? Typing ASCII characters...
The other interesting thing about notation is that it transcends language.
The word "notation" refers to symbols, abbreviations, and short-hand that make up domain-specific languages. Nothing about notation "transcends" language, notation is a component of language. Math is the study of patterns. Mathematical notation is what we use to write the language of patterns, to describe different patterns and communicate ideas about patterns. There used to be different mathematical languages based on culture, just like spoken languages. There's no magical property that made Roman numerals or Arabic numerals just make sense to people from other cultures, they had to learn each others notation just like any other language and eventually settled on Arabic numerals. Maybe things would have gone differently if the Mayans had a say.
It has been my experience that people who have not had the experience rarely get it
A pattern I've seen in my experience is that some person or group will put forth a pretty good idea, and others become dogmatic about that idea, lose sight of pragmatism, and try to push the idea beyond its practical applicability. I'm not saying this is you. I haven't yet read the Ken Iverson paper (I will). My suspicion at this point and after seeing the APL code demos is that there's probably plenty of good ideas in there, but APL doesn't strike me as pragmatic in any sense.
On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Thanks for your feedback. A few comments:
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label to the value
I used APL professionally for about ten years. None of your objections ring true. A simple example comes from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols, notation, are connected to meaning. It is powerful and extremely effective.
The use of notation as a tool for thought is a powerful concept that transcends programming. Mathematics is a simple example. So is music. Musical notation allows the expression of ideas and massively complex works as well as their creation. In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words. If we want to be able to express and think about programming at a higher level we need to develop a notation. As AI and ML evolve this might become more and more critical.
APL, sadly, was too early. Machines of the day were literally inadequate in almost every respect. It is amazing that the language went as far as it did. Over 30+ years I have worked with over a dozen languages, ranging from low-level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc. Programming with APL is a very different experience. Your mind works differently. I can only equate it to writing orchestral scores in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time. You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language. Watch the videos I link to below for a taste of these ideas.
Anyhow, obviously the walrus operator is here to stay. I am not going to change anything. I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
Typing these symbols isn't a problem at all. For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [". It takes seconds to internalize this and never think about it again. If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it. You will also know exactly what it does. It's that simple.
The other interesting thing about notation is that it transcends language. So far all conventional programming languages have been rooted in English. I would argue there is no need for this: a programming notation, just as mathematical and musical notation have demonstrated, can transcend spoken languages. Notation isn't just a tool for thought; it adds a universal element that is impossible to achieve in any other way.
Anyhow, again, I am not going to change a thing. I am nobody in the Python world. Just thought it would be interesting to share this perspective because I truly think this was a missed opportunity. If elegance is of any importance, having two assignment operators when one can do the job, as well as evolve the language in the direction of an exciting and interesting new path, is, at the very least, inelegant. I can only ascribe this to very few of the people involved in this process, if any, having real experience with APL. One has to use APL for real work and for at least a year or two in order for your brain to make the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.
I encourage everyone to read this Turing Award presentation:
"Notation as a Tool of Thought" by Ken Iverson, creator of APL http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf
Also, if you haven't seen them, these videos are very much worth watching:
Conway's Game of Life in APL https://www.youtube.com/watch?v=a9xAKttWgP4
Sudoku solver in APL https://www.youtube.com/watch?v=DmT80OseAGs
-Martin
On Tuesday, November 5, 2019, 11:54:45 PM PST, Richard Musil <risa2000x@gmail.com> wrote:
On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas <python-ideas@python.org> wrote:
In other words, these two things would have been equivalent in Python:
a ← 23
a = 23
I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label to the value; I can imagine "let 'a' point to the value of 23 now" and write it this way: "a --> 23", but "a <-- 23" does give an impression that 23 points to, or is somehow fed into, 'a'. This may give false expectations to those who are coming to Python from another language and might expect the "l-value" behavior in Python.
Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
Richard
On 06/11/2019 17:05:21, Martin Euredjian via Python-ideas wrote:
One has to use APL for real work and for at least a year or two in order for your brain to make the mental switch necessary to understand it. Just messing with it casually isn't good enough. Lots of inquisitive people have messed with it, but they don't really understand it.
No offence, but my honest off-the-cuff reaction: The above could be interpreted as "APL is a difficult language to learn, it takes at least a year or two of real work with it in order for your brain to make the mental switch necessary to understand it. As opposed to more intuitive languages, such as ... I don't know ... I'm sure there's one beginning with P."

In today's fast-moving world we can't afford that "year or two" before we are really productive. Nor to write code that other people need "a year or two" before they can read it fluently.

Disclosure: I have no experience with APL. I did click on your "Notation as a Tool of Thought" link, but I didn't get very far with it before my eyes glazed over looking at all the unfamiliar symbols. (This although I call myself a mathematician of sorts - not someone who throws a fit at the sight of equations or Greek letters.)

Yes, it would have been nice if <- had been included in the ASCII character set when it was developed in the 1960s; then all programming languages could use <- for assignment and = for equality. (Where is the time machine when you need it?) But regrettably, we are where we are. And having to type Alt-[ for every assignment - virtually necessitating the use of both hands - would IMO be a significant inconvenience.

Best wishes
Rob Cliffe
On Nov 6, 2019, at 01:46, martin_05--- via Python-ideas <python-ideas@python.org> wrote:
Still, the idea of two assignment operators just didn't sit well with me. That's when I realized I had seen this kind of a problem nearly thirty years ago, with the introduction of "J". I won't get into the details unless someone is interested, I'll just say that J turned APL into ASCII soup. It was and is ugly and it completely misses the point of the very reason APL has specialized notation; the very thing Iverson highlighted in his paper [0].
J didn’t invent having multiple related operators, J was trying to fix the problems that were created by APL having multiple related operators. You may not like its solution, but then you have to come up with a different solution that at least tries to solve them.

Chris already raised the typeability problem. And let’s pretend for the sake of argument that the display problem has been solved. The remaining problem is that APL had way too many different operators. Too many operators to fit into ASCII almost inherently means too many to fit into a programmer’s head (and especially a programmer who also works in other languages and comes back to APL every so often).

J attempted to solve this by making much heavier and more systematic use of operator modifiers. I don’t think it was all that successful in making the language easy to keep in your head, but it was enough to inspire other languages. We have the elementwise prefix in math languages, Haskell’s banks of operators organized as if they had modifiers even though they don’t, and, best of all, the discovery that thanks to types, in many cases you don’t actually need more operators. In Python, and in C++, I can just add two arrays with plain old +, and this is almost never confusing in practice. As it turns out, you never miss having three or four complete sets of operators; at most you miss matrix multiplication (and maybe exponentiation) and a distinction between cross and dot for vectors, so we only needed to add 0 to 2 operators rather than tripling the number of operators or adding a way to modify or combine operators.

And I don’t think it’s a coincidence that most array-heavy programming these days is not done in either APL or J, or even modern languages like Julia that try to incorporate their best features, but in Python and C++ (and shader languages based on C++) that completely avoided the problem rather than creating it and then trying to solve it.

And that’s why := is not a crisis for Python. Python doesn’t have way too many operators, and isn’t in danger of getting anywhere near there. Python adds one or two new operators per decade, and that only if you cheat by including things like if-else and := as operators when they actually aren’t while ignoring the removal of `. And most of the operators are words rather than symbols (if-else rather than ?:, yield and await which I’m sure APL would have found symbolic ways to spell, etc.). If we keep going at the current pace, by the end of the century, we’ll have used up either $ or ` and added one more digraph and three more keywords… which is fine.

While we’re at it, when you replace both = and := with an arrow, what do you do with += and the other augmented assignments? I can’t think of a single-character symbol that visually represents that meaning. If you leave it as + followed by an arrow, or try to come up with some new digraph, now we have the worst of both worlds, Unicode soup: operators that are digraphs and not visually meaningful while also not being typeable.
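Coming back to the "plain old +" point above for a moment, a quick NumPy sketch as illustration (NumPy is just one example; the point is that the types of the operands, not a larger operator set, carry the meaning, and @ from PEP 465 is the one operator Python did add for this space):

    import numpy as np

    a = np.array([[1, 2], [3, 4]])
    b = np.array([[10, 20], [30, 40]])

    a + b   # elementwise addition, no extra operator needed
    a * b   # elementwise multiplication
    a @ b   # matrix multiplication, the one new operator (PEP 465)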
On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:
While we’re at it, when you replace both = and := with an arrow, what do you do with += and the other augmented assignments? I can’t think of a single-character symbol that visually represents that meaning. If you leave it as + followed by an arrow, or try to come up with some new digraph, now we have the worst of both worlds, Unicode soup: operators that are digraphs and not visually meaningful while also not being typeable.
There is:
U+2B32 ⬲ LEFT ARROW WITH CIRCLED PLUS
But there would need to be more. I didn't find an obvious one for: -=
-Mike
Unfortunately, my device does not display LEFT ARROW WITH CIRCLED PLUS. Nor, obviously, do I have any way to enter it easily.
On Wed, Nov 6, 2019, 2:05 PM Mike Miller <python-ideas@mgmiller.net> wrote:
On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:
While we’re at it, when you replace both = and := with an arrow, what do you do with += and the other augmented assignments? I can’t think of a single-character symbol that visually represents that meaning. If you leave it as + followed by an arrow, or try to come up with some new digraph, now we have the worst of both worlds, Unicode soup: operators that are digraphs and not visually meaningful while also not being typeable.
There is:
U+2B32 ⬲ LEFT ARROW WITH CIRCLED PLUS
But there would need to be more. I didn't find an obvious one for: -=
-Mike
On Nov 6, 2019, at 19:52, Mike Miller <python-ideas@mgmiller.net> wrote:
On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:
While we’re at it, when you replace both = and := with an arrow, what do you do with += and the other augmented assignments? I can’t think of a single-character symbol that visually represents that meaning. If you leave it as + followed by an arrow, or try to come up with some new digraph, now we have the worst of both worlds, Unicode soup: operators that are digraphs and not visually meaningful while also not being typeable.
There is:
U+2B32 ⬲ LEFT ARROW WITH CIRCLED PLUS
Once we go to Unicode and lots of operators, I doubt it’ll be long before we use circled plus for something. At which point this is pretty misleading as a meaning for “plus then assign back” rather than “circle-plus then assign back”.
But there would need to be more. I didn't find any obvious for: -=
So going to Unicode and lots of operators doesn’t actually save us from symbol soup at all (unless we want to give up functionality the language already has); it just adds new problems on top of the soup problem.
Mike Miller wrote:
There is:
U+2B32 ⬲ LEFT ARROW WITH CIRCLED PLUS
But there would need to be more.
At this point you're creating a mini-language by combining symbols, which the OP seems to be against, since he describes things like ":=" and "<-" condescendingly as "ascii soup". -- Greg
participants (12)
- Abe Dillon
- Alex Walters
- Alexandre Brault
- Andrew Barnert
- Chris Angelico
- David Mertz
- Greg Ewing
- Martin Euredjian
- martin_05@rocketmail.com
- Mike Miller
- Richard Musil
- Rob Cliffe