> Was your use of APL on a machine with a dedicated APL keyboard?
I've done both. In the early '80s it was not uncommon to find terminals with APL keyboards. IBM, DEC, Tektronix and others made them. Once the IBM PC era took hold, most APL work was done with either a card you'd place in front of your keyboard or stickers you'd add to the front of the then-thick keycaps.
Here's the reality: it isn't that difficult at all to mentally map a bunch of symbols to a standard keyboard. It's a bit clunky at first, but you learn very, very quickly. I would venture to guess that one could reach for the most common APL symbols with ease within a day.
How do we learn to touch-type? By typing. At first you look at the keyboard all the time; eventually you never do. I am typing this while looking at the screen and haven't looked at the keyboard even once. You can do the same with APL just fine; it's easy.
When I was actively using the language every day I touch-typed APL without even thinking about it. Which is another powerful thing: once you get to that point, expressing ideas computationally is not unlike playing music on a piano; it just flows.
I still use APL today, though more as a powerful calculator than anything else. Among other things, I work in robotics, where doing quick linear algebra calculations comes in handy. Other than that, APL --for good reasons-- is pretty much a dead language. That's not to say there aren't concepts in there that warrant consideration. The power of notation is not appreciated by most programmers because there really isn't anything like APL out there. I know people won't accept this, because it is human nature to resist change or new ideas, but the truth is that the way we express our ideas in computational terms is rather primitive.
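To give a flavor of the kind of quick calculation I mean -- sketched here in Python with NumPy rather than APL, since that's the closest array-oriented analogue most readers will have at hand (the rotation example is my own, not from the discussion):

```python
import numpy as np

# Rotate a 3D point 90 degrees about the z-axis: the sort of
# one-line linear algebra an array language makes trivial.
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
p = np.array([1.0, 0.0, 0.0])

print(Rz @ p)  # approximately [0, 1, 0]
```

In APL the whole thing collapses further still, because the matrix product is a single primitive rather than a library call.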
It is my opinion that this is so because we are still typing words into text editors. I do not, by any means, imply that programming graphically is the solution. I do a lot of FPGA work, mostly designing complex high-speed real-time image processing hardware. I have tried graphical tools for FPGA work and they have never really worked well at all. In that domain my go-to tool ends up being Verilog or even lower register-level hardware description. I can't tell you what form this "next generation" approach to programming should take, other than holding the belief, based on my experience with APL, that the introduction of symbols could be of great value.
I look at ideas such as designing and defining state machines. I've done a ton of that work in both hardware (FPGAs) and software (ranging from embedded systems in Forth, C and C++ to desktop and web applications in various languages). I've had to develop custom tools to make designing, coding and maintaining such state machines easier than manually typing walls of text consisting of nested switch() statements or whatever the language allows.
A simple example of this might be a state machine driving the menu system of an embedded device with a simple multi-line LCD display and a few buttons and knobs for a control panel. I've done control panels with two dozen displays, a couple hundred buttons and two dozen encoders/knobs. Once you start looking at what it takes to design something like that, code it and then support it through iterations, feature changes and general maintenance, it becomes VERY obvious that typing words into a text editor is absolutely the worst way to do it. And yet we insist on being stuck inside an ASCII text editor for our work. From my perspective, in 2019, it's just crazy.
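To make the state-machine point concrete, here is a minimal sketch in Python of the table-driven style such tools tend to generate instead of nested switch() statements. The state and event names are hypothetical, invented purely for illustration:

```python
# Transition table for a tiny two-level menu: (state, event) -> next state.
# Hypothetical states and events, not from any real control panel.
TRANSITIONS = {
    ("top_menu",   "knob_turn"): "top_menu",    # scroll through items
    ("top_menu",   "select"):    "settings",    # enter the submenu
    ("settings",   "select"):    "edit_value",  # start editing a field
    ("edit_value", "select"):    "settings",    # commit, back to submenu
    ("settings",   "back"):      "top_menu",    # leave the submenu
}

def step(state, event):
    """Return the next state, staying put on events with no transition."""
    return TRANSITIONS.get((state, event), state)

state = "top_menu"
for event in ["select", "select", "select", "back"]:
    state = step(state, event)
print(state)  # -> top_menu
```

The table is the design artifact; with nested switch statements that same information is smeared across a wall of text, which is exactly the maintenance problem described above.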
Another interesting example comes from some of my work with real-time embedded systems. There are plenty of cases where you are doing things very tightly tied to, for example, signals coming into the processor through a pin; by this I mean real-time operations where every clock cycle counts. One of the most critical aspects of this, for me, is documenting what one is doing and thinking. And yet, because we insist on programming in ASCII, we are limited to text-based comments that don't always work well at all. In my own work I would often insert a comment directing the reader to a PDF file placed in the same directory, typically containing an annotated image of a waveform with notes. It would be amazing if we could move away from text-only programming toward a rich environment where such documentation could live with the code and move with it.
Anyhow, I'm not suggesting, by any stretch of the imagination, that these things are a necessity for Python. You asked an important and interesting question and I wanted to give you an answer that also exposed some of my perspective beyond this insignificant question of an assignment operator.
> I'd like to get some information on how much of that productivity was demonstrated on a system with a conventional
To address this directly: the case for something like APL has nothing to do with keyboard productivity. We are not talking about vim vs. a conventional text editor; that's the wrong level of abstraction. As I said above, if you use APL professionally you will touch-type it in short order, no question about that whatsoever. The productivity gain comes from operating at a higher cognitive level while translating your thoughts into code for the machine to execute. The famous one-liner solutions are not interesting because they are one-liners; they are interesting because they become idioms -- words with meaning. Your brain sees one and knows what that line is doing. Again, this isn't what happens for a newbie, of course.
The closest Python example I can provide would be the list comprehensions you reach for all the time. After internalizing them you don't really see the individual pieces of the list comprehension, but rather its meaning. Simple example:
a = [1, 2, 3, 4, 5, 6]
[x**2 for x in a]  # -> [1, 4, 9, 16, 25, 36]
If you do this often enough you don't have to parse the list comprehension; it becomes a word with a meaning. APL is like that, with the difference being that its words are far more powerful, easily representing tens to hundreds of lines of code in conventional languages.
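To stretch the analogy one more step (plain Python, with made-up sensor data for illustration): compare a loop you must read line by line against the equivalent comprehension idiom, which reads as a single "word" once internalized:

```python
readings = [3, -1, 4, -1, 5, 9, -2, 6]  # hypothetical sensor samples

# Loop version: the intent only emerges after reading every line.
total = 0
for r in readings:
    if r >= 0:
        total += r * r

# Idiom: "sum of squares of the non-negative readings" in one glance.
result = sum(r * r for r in readings if r >= 0)

print(total, result)  # both 167
```

An APL expression plays the same role, only the vocabulary is richer: a handful of primitive symbols composes into idioms the way `sum(... for ... if ...)` does here, but covering far more ground per glyph.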