> I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python.

I understood this just fine.  I happen to think your argument in this regard is neither sound nor valid.

Question:  Where did APL's "←" operator come from?

A number of APL's elements came from a notation developed to describe the operation of IBM processors back in the 1960s.  In many ways it meant "this name is assigned to this object", to paraphrase your statement.

I mean, how does "a = 23", which is read "a is equal to 23", or "a = some_object", which is literally read "a is equal to some_object", say "a is a label that is attached to 23" or "a is a label that is attached to some_object"?

This is no different from the concept of pointers.  A pointer stores an address to some data structure somewhere.  No professional thinks that "a = some_object" results in a bucket being filled with whatever the object might contain.  It's a pointer.  We are assigning a pointer.  We are storing an address that points to where the data lives.
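To make that concrete, here is a minimal Python sketch (the names are mine, purely illustrative): assignment binds a second name to the same object; nothing is copied.

    class Thing:
        pass

    some_object = Thing()
    a = some_object                  # "a" is bound to the very same object

    # Both names refer to a single object; no bucket was filled.
    print(a is some_object)          # True
    print(id(a) == id(some_object))  # True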

In fact, one could very well make the argument that using the "=" character for this operation is misleading because the left side is not set to be equal to the right side.  Even worse, these pointers in Python are immutable.  Someone coming from a whole range of languages sees the "=" sign to mean something very different.  For example, there are a bunch of languages where incrementing or performing math on the pointer's address is normal and fundamental to the language.  So, "=" in Python is not equal to "=" in many languages.  Why are we using the same symbol and creating this confusion?
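Again a small sketch of what I mean, assuming nothing beyond stock Python: even "+=" does not modify anything through the reference; it silently rebinds the name to a different object, which is about as far from C-style pointer arithmetic as you can get.

    x = 23
    print(id(x))   # identity of the object currently bound to x

    x += 1         # not an in-place update: x is rebound to a new int object
    print(id(x))   # a different identity; the original 23 is untouched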

If your response is something like "people learn the difference", well, you just made my point.  People learn.

I've had this kind of conversation with many people in the 30+ years since I learned APL and 20+ years since I stopped using it professionally.  It has been my experience that people who have not had the experience rarely get it, and, sadly, more often than not, they become hostile to the implication that there might actually be a better way to translate ideas into computer executable code.  That's just reality and I am not going to change it.

Look, we don't have to agree, and, frankly, you seem to be getting rattled.  I want no part of that.  I didn't come here to change the Python universe.  Like I said, I am nobody, so, yeah, forget it.  Don't waste your time on me or my ridiculous ideas.  I just wanted to share an opinion, worthless as it might be.


Thanks,

-Martin



On Wednesday, November 6, 2019, 12:18:21 PM PST, Abe Dillon <abedillon@gmail.com> wrote:


> I used APL professionally for about ten years.
Yes, you've stated that already.

> None of your objections ring true.  A simple example is had from mathematics.  The integral symbol conveys and represents a concept.  Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it.  It really is as simple as that; this is how our brains work.  That's how you recognize the letter "A" as corresponding to a sound and as part of words.  This is how, in languages such as Chinese, symbols, notation, are connected to meaning.  It is powerful and extremely effective.
I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python, variables are names assigned to objects, *not* buckets that objects are stored in. Using a notation that implies that objects are assigned to variables encourages a broken understanding of Python's mechanics.
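A minimal sketch of the distinction (names as labels, not buckets):

    a = [1, 2, 3]
    b = a           # "b" is a second label on the *same* list object

    b.append(4)
    print(a)        # [1, 2, 3, 4] -- the change shows through both labels

    a = [9]         # rebinding "a" moves the label; the list is untouched
    print(b)        # [1, 2, 3, 4]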

> A simple example is had from mathematics.  The integral symbol conveys and represents a concept.  Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it.  It really is as simple as that; this is how our brains work.  That's how you recognize the letter "A" as corresponding to a sound and as part of words.  This is how, in languages such as Chinese, symbols, notation, are connected to meaning.  It is powerful and extremely effective.
The fact that people learn and then become comfortable with symbols doesn't imply that choosing which symbols to adopt into a language is trivial. You can follow the evolution of languages over time and find that they often eject characters that serve little use or cause confusion, like the English character "thorn" (þ).

> The use of notation as a tool for thought is a powerful concept that transcends programming.  Mathematics is a simple example. So is music.  Musical notation allows the expression of ideas and massively complex works as well as their creation.  In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.
You don't need to convince people of the power of abstraction or the utility of domain-specific languages. Such a general statement doesn't support the adoption of any specific change. You might as well be advocating for adding Egyptian hieroglyphics to musical notation. We don't need a lecture on the importance of abstract notation each time a new syntax is proposed.

> The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words.  If we want to be able to express and think about programming at a higher level we need to develop a notation.  As AI and ML evolve this might become more and more critical.
I strongly disagree with this.
First of all, mathematical notation, which programming borrows heavily from, highly favors compaction over clarity. It uses Greek and Latin symbols that mean different things depending on the field. It uses both left and right superscripts and subscripts, sometimes for naming conventions, sometimes to denote exponentiation. It uses dots and hats and expressions that sit below and/or above symbols (as in "limit" notation or summations), and all sorts of other orientations and symbol modifications that are almost impossible to look up, plus infix, prefix, and postfix notation. It makes picking up any given mathematical paper a chore to comprehend, because so much context is assumed and not readily accessible.

Why not use a more consistent notation like add(x, y) instead of x + y when we know addition is a function and all other functions (usually) follow the f(x, y) notation?
Because math is old. It predates the printing press and other tools that make more explicit and readable notation possible. It was much more important hundreds of years ago that your ideas be expressible in a super-concise form, to the detriment of readability. That's not the only reason, of course, but it is a pretty big reason. I submit that most mathematical papers would benefit from having their formulas re-written in something like a programming language with more explicit variable names and consistent notation.
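Python even ships the prefix spelling already, for what it's worth; a trivial sketch:

    import operator

    # Infix notation, inherited from mathematics:
    print(3 + 4)               # 7

    # The same operation as an ordinary prefix function call:
    print(operator.add(3, 4))  # 7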

As to the role of ML and AI in all of this: These are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programming in the future is like assuming that more opcodes will greatly enhance programming in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!

That's all I have time for, for now, I may pick this up later.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Thanks for your feedback.  A few comments:

> I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value

I used APL professionally for about ten years.  None of your objections ring true.  A simple example is had from mathematics.  The integral symbol conveys and represents a concept.  Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it.  It really is as simple as that; this is how our brains work.  That's how you recognize the letter "A" as corresponding to a sound and as part of words.  This is how, in languages such as Chinese, symbols, notation, are connected to meaning.  It is powerful and extremely effective.

The use of notation as a tool for thought is a powerful concept that transcends programming.  Mathematics is a simple example. So is music.  Musical notation allows the expression of ideas and massively complex works as well as their creation.  In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.

The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words.  If we want to be able to express and think about programming at a higher level we need to develop a notation.  As AI and ML evolve this might become more and more critical.  

APL, sadly, was too early.  Machines of the day were literally inadequate in almost every respect.  It is amazing that the language went as far as it did.  Over 30+ years I have worked with over a dozen languages, ranging from low level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc.  Programming with APL is a very different experience.  Your mind works differently.  I can only equate it to writing orchestral scores in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time.  You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language.  Watch the videos I link to below for a taste of these ideas.

Anyhow, obviously the walrus operator is here to stay.  I am not going to change anything.  I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story: the mild introduction of notation and a path towards evolving a richer notation over time.
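For readers following along, a minimal sketch of the two assignment spellings Python now carries (this is all I mean by "two assignment operators when one can do the job"):

    values = iter([4, 7, 0, 3])

    # Ordinary assignment is a statement:
    first = next(values)
    print(first)                     # 4

    # The walrus operator binds a name *inside* an expression (Python 3.8+):
    while (v := next(values)) != 0:
        print(v)                     # 7, then the loop stops at 0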

> Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

No, using "<--" is going in the wrong direction.  We want notation, not ASCII soup.  One could argue even walrus is ASCII soup.  Another example of ASCII soup is regex.  Without real notation one introduces a huge cognitive load.  Notation makes a massive difference.  Any classically trained musician sees this instantly.  If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
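To illustrate what I mean by regex being ASCII soup, a small Python example (the pattern is mine, purely for illustration):

    import re

    # A dense run of punctuation just to say "a date like 2019-11-06":
    pattern = re.compile(r"^\d{4}-\d{2}-\d{2}$")
    print(bool(pattern.match("2019-11-06")))   # True
    print(bool(pattern.match("Nov 6, 2019")))  # False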

Typing these symbols isn't a problem at all.  For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [".  It takes seconds to internalize this and never think about it again.  If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it.  You will also know exactly what it does.  It's that simple.

The other interesting thing about notation is that it transcends language.  So far all conventional programming languages have been rooted in English.  I would argue there is no need for this when a programming notation, just like mathematical and musical notations have demonstrated that they transcend spoken languages.  Notation isn't just a tool for thought, it adds a universal element that is impossible to achieve in any other way.


Anyhow, again, I am not going to change a thing.  I am nobody in the Python world.  Just thought it would be interesting to share this perspective because I truly think this was a missed opportunity.  If elegance is of any importance, having two assignment operators when one could do the job --and could evolve the language in the direction of an exciting and interesting new path-- is, at the very least, inelegant.  I can only ascribe this to very few people involved in this process, if any, having any real experience with APL.  One has to use APL for real work and for at least a year or two in order for your brain to make the mental switch necessary to understand it.  Just messing with it casually isn't good enough.  Lots of inquisitive people have messed with it, but they don't really understand it.


I encourage everyone to read this Turing Award presentation:

"Notation as a Tool of Thought" by Ken Iverson, creator of APL
http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf


Also, if you haven't seen them, these videos are very much worth watching:

Conway's Game of Life in APL

Sudoku solver in APL


-Martin



On Tuesday, November 5, 2019, 11:54:45 PM PST, Richard Musil <risa2000x@gmail.com> wrote:


On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas <python-ideas@python.org> wrote:
> In other words, these two things would have been equivalent in Python:
>
>     a ← 23
>
>     a = 23

I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value; I can imagine "let 'a' point to the value of 23 now" and write it this way: "a --> 23", but "a <-- 23" does give the impression that 23 points to, or is somehow fed into, 'a'. This may give false expectations to those who are coming to Python from another language and might expect "l-value" behavior in Python.

Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

Richard
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-leave@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/VAGY7OV7UJCREV5WG2OFMGPTUPGYTNB7/
Code of Conduct: http://python.org/psf/codeofconduct/