On Thu, Nov 7, 2019 at 7:21 AM Abe Dillon <abedillon@gmail.com> wrote:
>> None of your objections ring true. A simple example comes from mathematics. The integral symbol conveys and represents a concept. Once the practitioner is introduced to the definition of that symbol and what it means, he or she uses it. It really is as simple as that; this is how our brains work. That's how you recognize the letter "A" as corresponding to a sound and as part of words. This is how, in languages such as Chinese, symbols (notation) are connected to meaning. It is powerful and extremely effective.
> I don't think you understood the point about APL's arrow assignment operator being counterintuitive in Python. In Python, variables are names bound to objects, *not* buckets that objects are stored in. Using a notation that implies that objects are stored in variables encourages a broken understanding of Python's mechanics.
For the record, I actually don't think that this is much of a problem. We have notions of "assignment" and "equality" that don't always have the exact same meanings in all contexts, and our brains cope. The sense in which 2/3 is equal to 4/6 is not the same as the one in which x is equal to 7, and whether or not you accept that the (infinite) sum of all powers of two, 1 + 2 + 4 + 8 + ..., is "equal to" -1 depends on your interpretation of equality. You might think that after an assignment, the variable is equal to that value - but you can say "x = x + 1" and then x will not be equal to x + 1, because assignment is temporal in nature. People will figure this out regardless of the exact spelling of either equality or assignment.
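To make that concrete, here's a quick sketch at the interactive prompt (nothing exotic, just ordinary name binding and rebinding):

>>> x = 7
>>> y = x          # two names bound to the same object: names, not buckets
>>> x = x + 1      # rebinds the name x to a new object; y is untouched
>>> x, y
(8, 7)
>>> x == x + 1     # asked *after* the rebinding, so naturally False
False
>>> a = [1, 2]
>>> b = a          # one list object, two names
>>> b.append(3)
>>> a              # a sees the change, because a and b name the same object
[1, 2, 3]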
> As to the role of ML and AI in all of this: These are tools that will allow greater abstraction. Assuming more symbols will greatly enhance programming in the future is like assuming that more opcodes will greatly enhance programming in the future. AI and ML, if anything, will allow us to define the problems we want to solve in something much closer to natural language and let the computers figure out how that translates to code. What kind of code? Python? C++? APL? x86? RISC-V? Who cares?!
Agreed. I would suggest, though, that this isn't going to be anything new. It's just a progression that we're already seeing - that programming languages are becoming more abstract, more distant from the bare metal of execution. Imagine a future in which we dictate to a computer what we want it to do, and then it figures out (via AI) how to do it... and now imagine what a present-day C compiler does to figure out what sequence of machine language instructions will achieve the programmer's stated intention. Here's a great talk discussing the nature of JavaScript in this way: https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript

AI/ML might well be employed *already* to implement certain optimizations. I wouldn't even know; all I know is that, when I ask the computer to do something, it does it. That's really all that matters!

ChrisA