> This distinction between notation and soup seems pretty subjective.

Yes and no.  Math looks like hieroglyphics to most people, but we are talking about professional programmers here.  In that context, something like the Conway's Game of Life in APL demo should inspire an interested party to explore further.  None of the tools he uses in the demo are difficult to comprehend, particularly if you have a background in basic Linear Algebra (another foundational element of APL).

It's like learning a language that does not use Latin script, say, Hebrew or Greek.  At first nothing makes sense, yet it doesn't take very long for someone to recognize the characters, attach them to sounds and then make words, badly at first and better with time.

Note that I am not proposing a complete APL-ization of Python.  My only observation was that the judicious introduction of a single new symbol for assignment would solve the problem that now requires "=" and ":=".  This is far more elegant.  You don't break old code --ever-- even when you phase out the use of "=" in future versions because replacing "=" with the new symbol is an elementary process.


>> Another example of ASCII soup is regex.
> That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read

That's it!  You got it!  The difference is that regex looks like )(*&)(*&)(*^)(*&^ which means nothing.  Your brain has a mapping for what these symbols mean already.  Ascribing new meaning to a new mish-mash of them breaks that mental mapping and model, which means that it requires 50 or 100 times the cognitive load to process and comprehend.  You are forced to keep a truly unnatural mental stack in your head as you parse these infernal combinations of seemingly random ASCII to figure out their meaning.

Notation changes that, for one simple reason: it establishes new patterns, with punctuation and rhythm, and your brain can grok that.  Don't forget that our brains have evolved amazing pattern-matching capabilities; symbols and notation take advantage of that, hence the deep and wide history of humanity using symbols to communicate.  Symbols are everywhere, from the icons on your computer and phone to the dashboard of your car, signs on the road, math, music, etc.

What is "real notation" 

Maybe the right term is "domain specific notation".  I suspect you know very well what I mean and are simply enjoying giving me a hard time.  No problem.  Thick skin on this side.

> APL is such a powerful language. APL is also a powerfully write-only language.

Easy answer:  That Reddit commenter is simply ignorant.  This is silly.

> APL doesn't strike me as pragmatic in any sense.

Look, APL is, for all intents and purposes, a dead language for general usage today.  Yet both IBM and Dyalog sell high-end interpreters, with Dyalog charging $2,500 PER YEAR (I believe IBM is similarly priced).

https://www.dyalog.com/prices-and-licences.htm#devlicprice

https://www.ibm.com/us-en/marketplace/apl2

So, clearly this would not exist if the language was useless or if it was "write-only" as that genius on Reddit opined.

That said, outside of certain application domains I would not recommend anyone consider using APL.  The language, as I said before, was ahead of its time, and the people behind it truly sucked at marketing and expanding its popularity, for more reasons than I care to recount here.  I was very involved in this community in the '80s.  I knew Ken Iverson and the other top fliers in the domain.  I even published a paper back in '85, along with a presentation at an ACM/APL conference.  And still, I would say, no, not something anyone should use today.  Learn?  Yes, absolutely, definitely.  It's an eye-opener, but not much more than that.

Real APL development stopped a long time ago.  Maybe one day someone with the right ideas will invent NextGenerationAPL or something like that and give it a place in computing.

That is not to say that some of the concepts in APL have no place in other languages or computing.  For example, list comprehensions in Python have a very close link to the way things are done in APL.  They almost feel like APL constructs to someone with experience in the language.
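
To make that concrete, here is a tiny sketch in plain Python (standard library only; the example is mine, not from APL documentation).  The comprehensions express whole-vector transformations with no explicit loop or index bookkeeping, which is very much the APL habit of mind:

    # Square every element of a vector, then keep only the squares under 50.
    squares = [x * x for x in range(1, 11)]
    small = [s for s in squares if s < 50]
    print(small)   # [1, 4, 9, 16, 25, 36, 49]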

Here's another interesting APL resource that serious practitioners have used for decades (with some memorizing useful idioms):

https://aplwiki.com/FinnAplIdiomLibrary

Yes, if you program APL professionally you can read this and it does not look like ASCII soup.

For example, this is a downward (largest to smallest) sort of a (strictly speaking, "⍒" is "grade down": it returns the indices that would arrange a in descending order):

b←⍒a

      a ← ⍳20
      a
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 
      b←⍒a
      b
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1 
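
For the Python-minded reader, a rough analogue (using NumPy purely as my stand-in; nothing above depends on it): np.argsort grades ascending, so negating the data grades downward, and the +1 accounts for APL's default 1-origin indexing.

    import numpy as np

    a = np.arange(1, 21)      # like a ← ⍳20
    b = np.argsort(-a) + 1    # like b ← ⍒a: descending-order indices, 1-origin
    print(b)                  # [20 19 18 ... 3 2 1]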

In the session above I also introduced another symbol, "⍳", the index generator, which is loosely equivalent to range() in Python.

So "⍳20" and range(1,21) generate similar results.  In APL parlance, it's a vector.

The difference is that I can then shape this vector into multidimensional arrays, for example, a 5x5 matrix:

      5 5 ⍴ a
 1  2  3  4  5
 6  7  8  9 10
11 12 13 14 15
16 17 18 19 20
 1  2  3  4  5
      
That's a new operator, "⍴" or "rho", the reshape operator.  In this case it takes the vector a and reshapes it into a 5x5 matrix (replicating data where needed).

Of course, I can assign this matrix to a new variable if I want to:

      c ← 5 5 ⍴ a
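
If you want to follow along in Python, NumPy's resize is a reasonable stand-in for "⍴", since it also replicates data cyclically when the target shape is larger than the data (an analogue of mine, not a claim of identical semantics):

    import numpy as np

    a = np.arange(1, 21)        # ⍳20
    c = np.resize(a, (5, 5))    # 5 5 ⍴ a -- wraps past 20, repeating 1..5
    print(c)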

And now, if I want to sum the values across each row (horizontally) I simply do this:

      +/c
15 40 65 90 15 

Or vertically:

      +⌿c
35 40 45 50 55 
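
Continuing that NumPy sketch, the row/column distinction APL encodes as "/" versus "⌿" shows up as the axis argument:

    print(c.sum(axis=1))   # +/c -- across each row:  [15 40 65 90 15]
    print(c.sum(axis=0))   # +⌿c -- down each column: [35 40 45 50 55]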


You can just as easily reshape a into a three-dimensional array (two matrices; conceptually a lamination of matrices, like a stack of cards with one matrix written on each card):

      2 3 4 ⍴ a
 1  2  3  4
 5  6  7  8
 9 10 11 12

13 14 15 16
17 18 19 20
 1  2  3  4

And, of course, you can sum the rows of this structure, which in APL is known as a "tensor":

    +/2 3 4 ⍴ a
10 26 42
58 74 10

This time there was no assignment to an intermediate variable; APL executes from right to left unless there are parentheses.

So I could do the entire thing in one shot: generate a sequence of numbers from 1 to N, reshape it into a tensor, sum across each row, and then take that result and sum down each column.

Here it is, with N = 200:

    +⌿ +/ 2 3 4 ⍴ ⍳ 200
68 100 132 
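
The same one-shot pipeline in the NumPy sketch; the APL reads right to left, the Python left to right:

    import numpy as np

    print(np.resize(np.arange(1, 201), (2, 3, 4))   # 2 3 4 ⍴ ⍳200
          .sum(axis=2)                              # +/ -- across each row
          .sum(axis=0))                             # +⌿ -- down the stack
    # [ 68 100 132]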

Of course, there are other things you can do.  For example, take a 5 x 5 reshape (a matrix) of the first 25 numbers and then transpose the matrix (swap rows and columns):

      5 5 ⍴ ⍳ 25
 1  2  3  4  5
 6  7  8  9 10
11 12 13 14 15
16 17 18 19 20
21 22 23 24 25

      ⍉ 5 5 ⍴ ⍳ 25
1  6 11 16 21
2  7 12 17 22
3  8 13 18 23
4  9 14 19 24
5 10 15 20 25

That introduces a new symbol, "⍉", the transpose.  I suspect that if you read through what I wrote above you already understand what the first statement did.  It really is that simple.
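
In the NumPy sketch, transposition is just .T:

    import numpy as np

    m = np.resize(np.arange(1, 26), (5, 5))   # 5 5 ⍴ ⍳25
    print(m.T)                                # ⍉ -- swap rows and columns

Now watch what happens when we compare the transposed matrix against a scalar: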

      15 > ⍉ 5 5 ⍴ ⍳ 25
1 1 1 0 0
1 1 1 0 0
1 1 1 0 0
1 1 1 0 0
1 1 0 0 0

It returns a new matrix with a 1 anywhere the desired condition is met, in this case wherever the element is less than 15.
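
NumPy broadcasts a scalar comparison the same way; it yields True/False rather than APL's 1/0, so I cast to int to match the display (continuing from m above):

    print((15 > m.T).astype(int))   # 1 wherever the element is less than 15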

The utility of this is easier to see if I use a variable stuffed with random numbers:

      a ← 25?100
      a
90 6 10 99 62 15 52 32 98 13 7 100 64 58 16 29 67 56 53 28 96 27 59 30 18 

a is a vector of 25 elements, each a value between 1 and 100 chosen at random (dyadic "?" is "deal"; it picks without repetition).  We can reshape this into anything we want, but for simplicity's sake I'll leave it as a vector.

I want to find and extract values less than, say, 25.

This first expression generates a new boolean vector with a 1 anywhere the condition is met:

      a < 25
0 1 1 0 0 1 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 

To extract the values I can just use the above vector to "compress" the original vector, like this:

      (a < 25)/a
6 10 15 13 7 16 18 
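
In Python, NumPy's boolean indexing plays the role of compression, and so does a plain list comprehension (both analogues of mine, using the same random values shown above):

    import numpy as np

    v = np.array([90, 6, 10, 99, 62, 15, 52, 32, 98, 13, 7, 100, 64,
                  58, 16, 29, 67, 56, 53, 28, 96, 27, 59, 30, 18])
    print(v[v < 25])                  # (a < 25)/a -> [ 6 10 15 13  7 16 18]
    print([x for x in v if x < 25])   # the comprehension version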

And if I want, I can use a product reduction to multiply them all:

      ×/ (a < 25)/a
23587200
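
And the product reduction in Python (math.prod requires Python 3.8+; v continues from the snippet above):

    import math

    print(v[v < 25].prod())                    # ×/ via NumPy: 23587200
    print(math.prod(x for x in v if x < 25))   # standard-library version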


Anyhow, that's a microscopic taste of APL with a few very simple examples.  I would venture a guess that, if you actually followed the examples, you got comfortable with the notation and nomenclature I introduced.  There's a lot more, of course, and this is how one learns it, incrementally.

BTW, my last large APL project was the software used to run a high-speed DNA sequencing machine researchers used during the race to decode the human genome.


Thanks,

-Martin




On Wednesday, November 6, 2019, 04:12:19 PM PST, Abe Dillon <abedillon@gmail.com> wrote:


No, using "<--" is going in the wrong direction.  We want notation, not ASCII soup.
This distinction between notation and soup seems pretty subjective. What is the difference between soup and notation? In my mind it has a lot to do with familiarity. I watched that video about programming Conway's Game of Life in APL and it looks like an incomprehensible soup of symbols to me.

> Another example of ASCII soup is regex.
That's interesting, I feel the same way. I can read most code pretty quickly, but as soon as I hit a regex it takes me 50x as long to read and I have to crack open a reference because I can never remember the notation. Luckily someone came up with a solution called verbal expressions, which trades hard-to-remember symbols for easy-to-understand words! (though I think the Python implementation smacks of Java idioms)

I'm sure there are people who work with regular expressions on such a regular basis that they've become fluent, but when you require such deep immersion in the language before the symbols make sense to you, it's a huge barrier to entry. You can't then act all confused about why your favorite language never caught on.

> Without real notation one introduces a huge cognitive load.
What is "real notation". This sounds like a no-true-Scotsman fallacy. Has everyone on this message board been communicating with fake ASCII notation this entire time?
Cognative load can come from many different places like:
  1. Having to remember complex key combinations just to get your thoughts into code
  2. Having to memorize what each of thousands of symbols do because there's no way to look them up in a search engine
  3. Knowing no other notation system that even slightly resembles APL.
    I mean, I know some esoteric mathematics, but I've never seen anything that looks even remotely like:
    life{1 .3 4=+/,¯1 0 1∘.¯1 0 1∘.⌽⊂}
A big part of Python's philosophy is that you read code way more often than you write code, so we should optimize readability. As one Reddit commenter put it:

> APL is such a powerful language. APL is also a powerfully write-only language.

And I don't even fully agree there because it somehow manages to be almost as difficult to write. 

> Typing these symbols isn't a problem at all.  For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [".  It takes seconds to internalize this and never think about it again.
 
For some people. I, myself, have a learning disability and often need to look at my keyboard. The relationship between "←" and "[" doesn't seem obvious at all.

> If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it.  You will also know exactly what it does.  It's that simple.

You know what's even simpler and requires even less cognitive load?  Typing ASCII characters...

> The other interesting thing about notation is that it transcends language.

The word "notation" refers to symbols, abbreviations, and short-hand that make up domain-specific languages. Nothing about notation "transcends" language, notation is a component of language. Math is the study of patterns. Mathematical notation is what we use to write the language of patterns, to describe different patterns and communicate ideas about patterns. There used to be different mathematical languages based on culture, just like spoken languages. There's no magical property that made Roman numerals or Arabic numerals just make sense to people from other cultures, they had to learn each others notation just like any other language and eventually settled on Arabic numerals. Maybe things would have gone differently if the Mayans had a say.

> It has been my experience that people who have not had the experience rarely get it

A pattern I've seen in my experience is that some person or group will put forth a pretty good idea, and others become dogmatic about that idea, lose sight of pragmatism, and try to push the idea beyond its practical applicability. I'm not saying this is you. I haven't yet read the Ken Iverson paper (I will). My suspicion at this point, and after seeing the APL code demos, is that there's probably plenty of good ideas in there, but APL doesn't strike me as pragmatic in any sense.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Thanks for your feedback.  A few comments:

> I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value

I used APL professionally for about ten years.  None of your objections ring true.  A simple example comes from mathematics.  The integral symbol conveys and represents a concept.  Once the practitioner is introduced to the definition of that symbol, what it means, he or she uses it.  It really is as simple as that; this is how our brains work.  That's how you recognize the letter "A" as corresponding to a sound and as part of words.  This is how, in languages such as Chinese, symbols, notation, are connected to meaning.  It is powerful and extremely effective.

The use of notation as a tool for thought is a powerful concept that transcends programming.  Mathematics is a simple example. So is music.  Musical notation allows the expression of ideas and massively complex works as well as their creation.  In electronics we have circuit diagrams, which are not literal depictions of circuits but rather a notation to represent them, to think about them, to invent them.

The future of computing, in my opinion, must move away --perhaps not entirely-- from ASCII-based typing of words.  If we want to be able to express and think about programming at a higher level we need to develop a notation.  As AI and ML evolve this might become more and more critical.  

APL, sadly, was too early.  Machines of the day were literally inadequate in almost every respect.  It is amazing that the language went as far as it did.  Over 30+ years I have worked with over a dozen languages, ranging from low-level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the "modern" languages such as Python, JS, PHP, etc.  Programming with APL is a very different experience.  Your mind works differently.  I can only equate it to writing orchestral scores in the sense that the symbols represent very complex textures and structures that your mind learns to imagine and manipulate in real time.  You think about spinning, crunching, slicing and manipulating data structures in ways you never really think about when using any other language.  Watch the videos I link to below for a taste of these ideas.

Anyhow, obviously the walrus operator is here to stay.  I am not going to change anything.  I personally think this is sad and a wasted opportunity to open a potentially interesting chapter in the Python story; the mild introduction of notation and a path towards evolving a richer notation over time.

> Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

No, using "<--" is going in the wrong direction.  We want notation, not ASCII soup.  One could argue even walrus is ASCII soup.  Another example of ASCII soup is regex.  Without real notation one introduces a huge cognitive load.  Notation makes a massive difference.  Any classically trained musician sees this instantly.  If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.

Typing these symbols isn't a problem at all.  For example, in NARS2000, a free APL interpreter I use, the assignment operator "←" is entered simply with "Alt + [".  It takes seconds to internalize this and never think about it again.  If you download NARS2000 right now you will know how to enter "←" immediately because I just told you how to do it.  You will also know exactly what it does.  It's that simple.

The other interesting thing about notation is that it transcends language.  So far all conventional programming languages have been rooted in English.  I would argue there is no need for this when a programming notation, just like mathematical and musical notations have demonstrated that they transcend spoken languages.  Notation isn't just a tool for thought, it adds a universal element that is impossible to achieve in any other way.


Anyhow, again, I am not going to change a thing.  I am nobody in the Python world.  Just thought it would be interesting to share this perspective because I truly think this was a missed opportunity.  If elegance is of any importance, having two assignment operators when one can do the job, as well as evolve the language in the direction of an exciting and interesting new path, is, at the very least, inelegant.  I can only ascribe this to very few people involved in this process, if any, having any real experience with APL.  One has to use APL for real work and for at least a year or two in order for your brain to make the mental switch necessary to understand it.  Just messing with it casually isn't good enough.  Lots of inquisitive people have messed with it, but they don't really understand it.


I encourage everyone to read this Turing Award presentation:

"Notation as a Tool of Thought" by Ken Iverson, creator of APL
http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf


Also, if you haven't seen them, these videos are very much worth watching:

Conway's Game of Life in APL

Sudoku solver in APL


-Martin



On Tuesday, November 5, 2019, 11:54:45 PM PST, Richard Musil <risa2000x@gmail.com> wrote:


On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas <python-ideas@python.org> wrote:
In other words, these two things would have been equivalent in Python:

    a ← 23

    a = 23

I do not consider these two things conceptually equivalent. In Python the identifier ('a' in this case) is just a label for the value. I can imagine "let 'a' point to the value of 23 now" and write it this way: "a --> 23", but "a <-- 23" does give the impression that 23 points to, or is somehow fed into, 'a'. This may give false expectations to those who are coming to Python from another language and might expect "l-value" behavior in Python.

Second point, I can write := in two keystrokes, but I do not have a dedicated key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

Richard
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-leave@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/VAGY7OV7UJCREV5WG2OFMGPTUPGYTNB7/
Code of Conduct: http://python.org/psf/codeofconduct/