Re: Python should take a lesson from APL: Walrus operator not needed
Really? Because I’ve been using ABC notation for music for decades

Try writing an orchestral suite in ASCII and see how well it goes. C'mon. I know people use tablature and similar ideas. Sure. So what?

The fact that I can enter Python code as plain text is even more useful than music and math.

Sure, because you speak English. Go talk to a kid who has to learn this in, I don't know, Egypt or China.

Sorry, notation is far more powerful. As I said in one of my other notes, people who have not had the unique experience of using something like APL for non-trivial development work simply don't get it. This is very much a conservative reaction (I do not mean this in the political sense at all) in that people are always more comfortable with things they know staying the same.

I suspect that, had I not been pulled from a FORTRAN class by my APL-evangelist physics professor in college, I would likely react similarly if someone told me it might be better to consider notation as a way to improve the computational expression of ideas. I get it. I am not oblivious to the absolute fact that context is crucially important here, and very few people without the appropriate context are open-minded enough to consider new ideas without becoming passionately involved in a negative way.

Anyhow, I'll repeat: I am nobody and I am certainly not going to change the Python world. Sleep well knowing this is just another random moron saying something he probably should not have said.

I now see this is an intellectually welcoming community, which completely explains the walrus operator and other issues.

Thanks anyway,

-Martin

On Wednesday, November 6, 2019, 02:22:00 PM PST, Andrew Barnert <abarnert@yahoo.com> wrote:

On Nov 6, 2019, at 18:05, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
No, using "<--" is going in the wrong direction. We want notation, not ASCII soup. One could argue even walrus is ASCII soup. Another example of ASCII soup is regex. Without real notation one introduces a huge cognitive load. Notation makes a massive difference. Any classically trained musician sees this instantly. If we replaced musical notation with sequences of two or three ASCII characters it would become an incomprehensible mess.
Really? Because I’ve been using ABC notation for music for decades, and it’s never felt like an incomprehensible mess to me. While it doesn’t look as nice as normal musical notation, it’s even easier to learn, and it doesn’t take that long to be able to read it. And the fact that I can enter it a lot faster—and, more importantly, that I can enter it in exactly the same way in any email client, text editor, etc.—makes it hugely useful. When I’m out somewhere and think of a riff, I can just pull out my phone, fire up the Notes app, and type in the ABC. That’s visual enough for me to get it out of my consciousness and then read it off and hear it in my head so I can come up with a bassline to go with it. Or to read back a few days later and play it (after a couple seconds of processing, which admittedly isn’t as good as musical notation—but a lot better than nothing, which is what I’d have if I insisted on musical notation or nothing).

For some reason, ABC is something that only musicologists learn, not musicians, unless you happen to get lucky and find it by accident. But many other musicians come up with their own idiosyncratic systems for the same purpose.

You also mentioned math. The last visual editor for math I didn’t absolutely hate was my HP42. And I’d still rather use ASCIIMathML over that, much less something like Microsoft Word’s equation editor. It’s great to have something like MathJax on-the-fly rendering, but when I don’t have that, I can read x_0, or (-b +- sqrt(b^2 - 4ac))/(2a). Some things are too complicated for ASCIIMathML, but for them, I’ll use TeX; there’s no way I’d figure it out in a symbolic editor without a whole lot of painful trial and error.

The fact that I can enter Python code as plain text is even more useful than music and math.

Sure, in your IDE, alt-[ types an arrow. But what happens when you type alt-[ in your iPhone mail app, in emacs or vi over an ssh session to a deployed server, in web text boxes like the answer box on StackOverflow, in your favorite editor’s markdown mode, in a WYSIWYG word processor, or even just in someone else’s IDE? And I do all those things a lot more often with source code than I do with music or equations.

How are you entering the arrows in these emails? Do you fire up your IDE, type an arrow, and then copy and paste it into your webmail or something like that? If you were on your phone instead of your laptop, would you go search the web for the Unicode arrow and copy and paste from there?

Does any of that really seem as easy to you as typing := the exact same way in every editor in the world?

I wouldn’t mind an editor that had the same kind of “presentation mode” as markdown and mathjax editors, and if it rendered := as an arrow, that would be fine. But I’d still want to be able to type it as :=, because as long as I have to type it that way on a mailing list, I’d rather have just := in my muscle memory than have := and alt-[ and have to remember which one to use in which context.

(PS, I’ve never noticed before this that the two main text notations for music have the same name as Python’s predecessor and Python’s creator. Probably not significant…)
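For comparison with the plain-ASCII spelling above, the same quadratic formula written in TeX (shown only to illustrate the two notations being contrasted):

    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}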
The fact that I can enter Python code as plain text is even more useful than music and math.

Sure, because you speak English. Go talk to a kid who has to learn this in, I don't know, Egypt or China.
The same process that made mathematical notation standard across the world is happening for English right now. I'm sure there are better languages that could have become the global standard, just as there may have been notation systems better than the mathematical notation we ended up with (the use of "i" and the name "imaginary numbers" is quite unfortunate in my opinion, but at least we're not using Roman numerals...). Why would you praise the ubiquity of mathematical notation and scorn English developing the same property?
On Thu, Nov 7, 2019 at 11:15 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Anyhow, I'll repeat, I am nobody and I am certainly not going to change the Python world. Sleep well knowing this is just another random moron saying something he probably should not have said.
You keep saying that, yet you posted to Python-Ideas about a "missed opportunity". Clearly you have an opinion on what the language *should* have done. If you truly believed your opinion to be meaningless, you would have posted this rant onto Facebook or something instead :) Respect yourself as much as you respect everyone else. We can discuss and debate even though this *exact* ship has sailed; there will be future questions of language design. :) ChrisA
On Thu, 7 Nov 2019 at 00:16, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Sorry, notation is far more powerful. As I said in one of my other notes, people who have not had the unique experience of using something like APL for non-trivial development work simply don't get it.
Was your use of APL on a machine with a dedicated APL keyboard? I know the old IBM terminals had dedicated APL symbol keys. The point has been made here a few times that typing extended characters reliably is hard, but you haven't as yet responded to that (as far as I can see). While I haven't used APL much, I do a reasonable amount of mathematical writing, and I find that it's frustratingly difficult to express mathematical ideas on a computer, because the need to find ways of writing the notation (whether it's by looking up Unicode symbols, or remembering notation like LaTeX) breaks the flow of ideas. So while I won't dispute that writing APL may have been highly productive for you, I'd like to get some information on how much of that productivity was demonstrated on a system with a conventional keyboard. Without good evidence that the productivity gains you're suggesting can be achieved on the input devices that real-world Python users have available, one of your main arguments in favour of this change is significantly weakened. Paul
Was your use of APL on a machine with a dedicated APL keyboard?

I've done both. In the early '80s it was not uncommon to find terminals with APL keyboards. IBM, DEC, Tektronix and others made them. Once the IBM PC era took hold, most APL was done with either a card you'd place in front of your keyboard or stickers you'd add to the front of the then-thick keycaps.

Here's reality: it isn't that difficult at all to mentally map a bunch of symbols to a standard keyboard. It's a bit clunky at first, but you learn very, very quickly; I would venture to guess that one could reach for the most common APL symbols with ease within a day. How do we learn to touch-type? By typing. At first you look at the keyboard all the time. I never do any more. I am typing this by looking at the screen; I haven't looked at the keyboard even once this entire time. You can do that with APL just fine, it's easy.
When I was actively using the language every day I touch-typed APL, didn't even think about it. Which is also another powerful thing: once you get to that point, expressing ideas computationally is not unlike playing music on a piano, it just flows.

I still use APL today, but mostly as a powerful calculator more than anything else. Among other things, I work in robotics, where doing quick linear algebra calculations comes in handy. Other than that, APL --for good reasons-- is pretty much a dead language. That's not to say there aren't concepts in there that warrant consideration. The power of notation is not appreciated by most programmers because there really isn't anything like APL out there.

I know people won't accept this because it is human nature to resist change or new ideas, but the truth is the way we express our ideas in computational terms is rather primitive. It is my opinion that this is so because we are still typing words into text editors. I do not, by any means, imply that programming graphically is the solution. I do a lot of FPGA work, mostly designing complex high speed real time image processing hardware. I have tried graphical tools for FPGA work and they have never really worked well at all. In this case my go-to tool ends up being Verilog or even lower register-level hardware description. I can't tell you what form this "next generation" approach to programming should take other than the belief, due to my experience with APL, that the introduction of symbols would be of potentially great value.

I look at ideas such as designing and defining state machines. I've done a ton of that work in both hardware (FPGAs) and software (ranging from embedded systems in Forth, C and C++ to desktop and web applications in various languages). I've had to develop custom tools to make the task of designing, coding and maintaining such state machines easier than manually typing walls of text consisting of nested switch() statements or whatever the language allows. A simple example of this might be a state machine driving the menu system of an embedded system with a simple multi-line LCD display and a few buttons and knobs for a control panel. I've done control panels with two dozen displays, a couple hundred buttons and two dozen encoders/knobs. Once you start looking at what it takes to design something like that, code it, and then support it through iterations, feature changes and general code maintenance, it becomes VERY obvious that typing words on a text editor is absolutely the worst way to do it. And yet we insist on being stuck inside an ASCII text editor for our work. From my perspective, in 2019, it's just crazy.

Another interesting example comes from some of my work with real time embedded systems. There are plenty of cases where you are doing things that are very tightly related to, for example, signals coming into the processor through a pin; by this I mean real-time operations where every clock cycle counts. One of the most critical aspects of this for me is the documentation of what one is doing and thinking. And yet, because we insist on programming in ASCII, we are limited to text-based comments that don't always work well at all. In my work I would insert a comment directing the reader to access a PDF file I placed in the same directory, often containing an annotated image of a waveform with notes. It would be amazing if we could move away from text-only programming and integrate a rich environment where such documentation could exist and move with the code.
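To make the state machine example above concrete, here is a minimal, hypothetical sketch of the table-driven style such a menu controller can take in Python. All state, event and action names are invented for illustration; they are not taken from any real panel:

    # Table-driven menu state machine (illustrative names only).
    # Each (state, event) pair maps to (next_state, optional action).
    MENU_TABLE = {
        ("home", "knob_right"):        ("settings", None),
        ("settings", "button_ok"):     ("brightness", None),
        ("brightness", "knob_right"):  ("brightness", "raise_brightness"),
        ("brightness", "button_back"): ("settings", None),
    }

    def step(state, event, table=MENU_TABLE):
        """Look up the transition; unknown events leave the state unchanged."""
        next_state, action = table.get((state, event), (state, None))
        if action is not None:
            print("action:", action)   # stand-in for real hardware I/O
        return next_state

    state = "home"
    for event in ("knob_right", "button_ok", "knob_right"):
        state = step(state, event)
    print(state)                       # -> brightness

The point is only that the transitions live in data rather than in nested branches; whether text, symbols or diagrams express that best is exactly what is being argued in this thread.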
Anyhow, not suggesting, by any stretch of the imagination, that these things are a necessity for Python. You asked an important and interesting question and I wanted to give you an answer that also exposed some of my perspective beyond this insignificant question of an assignment operator.
I'd like to get some information on how much of that productivity was demonstrated on a system with a conventional keyboard.

To address this directly, the case for something like APL has nothing to do with keyboard productivity. We are not talking about vim vs. a conventional text editor. That's the wrong level of abstraction. As I said above, if you are using APL professionally you will touch-type it in short order. No question about that whatsoever.

The productivity gain comes from operating at a higher cognitive level while translating your thoughts into code for the machine to execute. The famous one-liner solutions are not neat because they are one-liners, they are interesting because they become idioms, words, with a meaning. Your brain sees that and knows what that line is doing. Again, this isn't what happens to a newbie, of course.

The closest Python example of this I can provide would be the list comprehensions you reach for all the time. After internalizing them you don't really see the individual pieces of the list comprehension but rather the meaning. Simple example:
>>> a = 1, 2, 3, 4, 5, 6
>>> [x**2 for x in a]
[1, 4, 9, 16, 25, 36]

If you do this often enough you don't have to parse the list comprehension; it becomes a word with a meaning. APL is like that, with the difference being that these words are far more powerful and easily represent tens to hundreds of lines of code in conventional languages.

Thanks,

-Martin
On Fri, Nov 8, 2019 at 5:40 AM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
Was your use of APL on a machine with a dedicated APL keyboard?
I've done both. In the early '80's it was not uncommon to find terminals with APL keyboards. IBM, DEC, Tektronix and others made them. Once the IBM PC era took hold most of APL was done with either a card you'd place in front of your keyboard or stickers you'd add to the front of the then thick keycaps.
Here's reality: It isn't that difficult at all to mentally map a bunch of symbols to a standard keyboard. It's a bit clunky at first but you learn very, very quickly, I would venture to guess that one could reach for the most common APL symbols with ease within a day.
Here's another very VERY important reality: once you've done something for multiple years, you are usually *terrible* at estimating how difficult it truly is. Your brain understands what you're doing and has no difficulty with it, so you think that it's an easy thing to do. Is it? Maybe; maybe not. But unless you have watched a brand new APL programmer, you can't see how hard it actually is. Or in this case, perhaps not a brand-new APL programmer, but someone who has (say) six months of experience.
How do we learn to touch-type? By typing. At first you look at the keyboard all the time. I never do any more. I am typing this by looking at the screen, I haven't looked at the keyboard even once this entire time. You can do that with APL just fine, it's easy.
When I was actively using the language every day I touch typed APL, didn't even think about it. Which is also another powerful thing. Once you get to that point expressing ideas computationally is not unlike playing music on a piano, it just flows.
And I do the same with the operators that you disparagingly call "ASCII soup". I touch type them. What's the difference, other than that I can transfer my knowledge of typing English?
It is my opinion that this is so because we are still typing words into text editors. I do not, by any means, imply that programming graphically is the solution. I do a lot of FPGA work, mostly designing complex high speed real time image processing hardware. I have tried graphical tools for FPGA work and they have never really worked well at all. In this case my go-to tool ends up being Verilog or even lower register-level hardware description. I can't tell you what form this "next generation" approach to programming should take other than the belief, due to my experience with APL, that the introduction of symbols would be of potentially great value.
Oh you're absolutely right that graphical programming is not the solution. We're still typing words *because typing words is still the best way to do things*. There have been many MANY point-and-click programming tools developed (the first one I ever met was back in the 90s, a codegen tool in VX-REXX), and while they are spectacular tools for bringing a smidgen of programming to a non-programmer (say, giving an artist the ability to drag and drop instruction blocks around to create a complex animation sequence), they are *not* a replacement for text-based coding.
A simple example of this might be a state machine driving the menu system of an embedded system with a simple multi-line LCD display and a few buttons and knobs for a control panel. I've done control panels with two dozen displays, a couple hundred buttons and two dozen encoders/knobs. Once you start looking at what it takes to design something like that, code it and then support it through iterations, feature changes and general code maintenance it becomes VERY obvious that typing words on a text editor is absolutely the worst way to do it. And yet we insist on being stuck inside an ASCII text editor for our work. From my perspective, in 2019, it's just crazy.
No actually, it's not so obvious to me. Convince me. Show me that typing words is "absolutely the worst".
Another interesting example comes from some of my work with real time embedded systems. There are plenty of cases where you are doing things that are very tightly related to, for example, signals coming into the processor through a pin; by this I mean real-time operations where every clock cycle counts. One of the most critical aspects of this for me is the documentation of what one is doing and thinking. And yet, because we insist on programming in ASCII we are limited to text-based comments that don't always work well at all. In my work I would insert a comment directing the reader to access a PDF file I placed in the same directory, often containing an annotated image of a waveform with notes. It would be amazing if we could move away from text-only programming and integrate a rich environment where such documentation could exist and move with the code.
Ehh, if you want to use Markdown in your comments, then sure. You could even have an editor that shows them nicely. But the code itself isn't a PDF.
The productivity gain comes from operating at a higher cognitive level while translating your thoughts into code for the machine to execute. The famous one-liner solutions are not neat because they are one-liners, they are interesting because they become idioms, words, with a meaning. Your brain sees that and knows what that line is doing. Again, this isn't what happens to a newbie, of course.
So with APL, you expect that multi-token one-liners become "words" of a sort, yet you decry ":=" in Python as "ASCII soup". Please, what *is* the difference? Define what is and isn't a token. Define what does and doesn't sit within your brain. I am unable to distinguish them. In both cases, you start with primitives and combine them, and your brain learns the combinations as new primitives. The only difference is that Python's notation can be typed on (pretty much) any keyboard, whereas APL's notation needs to be mapped specifically to each keyboard, and you need a dedicated editor that understands things. (See earlier in the thread where it was pointed out that some non-English keyboards use the Alt key to type "[", making the key combination Alt-[ impractical or impossible.) ChrisA
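To make the ":=" side of that comparison concrete, a trivial plain-text sketch of the operator in use (purely illustrative, not code from the thread):

    import random

    # ":=" binds a value inside an expression; it is typed as two plain
    # ASCII characters in any editor, terminal or mail client.
    while (roll := random.randint(1, 6)) != 6:
        print("rolled", roll, "- trying again")
    print("rolled a 6")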
On Thu, 7 Nov 2019 at 18:59, Chris Angelico <rosuav@gmail.com> wrote:
Here's another very VERY important reality: once you've done something for multiple years, you are usually *terrible* at estimating how difficult it truly is. Your brain understands what you're doing and has no difficulty with it, so you think that it's an easy thing to do. Is it? Maybe; maybe not. But unless you have watched a brand new APL programmer, you can't see how hard it actually is. Or in this case, perhaps not a brand-new APL programmer, but someone who has (say) six months of experience.
And another very important reality here is that Python is used by a lot of people who would not class themselves as professional programmers. It's used by schoolchildren to learn about computers. It's used by graphic designers as an embedded language. It's used by gamers writing mods for games. The list goes on. Many of those people have NO INTEREST in learning to program Python efficiently. An awful lot won't learn any Python, they'll just copy some code off the web and fiddle with it to get the results they want. They just want to get a job done, and for them, even a single non-standard character is probably a major barrier. They certainly aren't going to put stickers on their keys, or use a reference card to know how to type operators.

To be blunt, there's good reasons APL never took off with the general programming community. If we are to learn any lessons about the good features in APL, we need to understand those reasons and accept their validity first. And I'm pretty certain that "weird character set" would turn out to be one of them...

Paul
On 2019-11-07 20:30, Paul Moore wrote:
And another very important reality here is that Python is used by a lot of people who would not class themselves as professional programmers. It's used by schoolchildren to learn about computers. It's used by graphic designers as an embedded language. It's used by gamers writing mods for games. The list goes on. Many of those people have NO INTEREST in learning to program Python efficiently. An awful lot won't learn any Python, they'll just copy some code off the web and fiddle with it to get the results they want. They just want to get a job done, and for them, even a single non-standard character is probably a major barrier. They certainly aren't going to put stickers on their keys, or use a reference card to know how to type operators.
To be blunt, there's good reasons APL never took off with the general programming community. If we are to learn any lessons about the good features in APL, we need to understand those reasons and accept their validity first. And I'm pretty certain that "weird character set" would turn out to be one of them...
There was a version of APL on the Sinclair QL, which, IIRC, replaced the symbols with keywords. I don't know how well it did.
On Nov 7, 2019, at 22:35, MRAB <python@mrabarnett.plus.com> wrote:
There was a version of APL on the Sinclair QL, which, IIRC, replaced the symbols with keywords. I don't know how well it did.
The OP started the thread complaining about J, which is a much more systematic ASCIIfication of APL carefully designed by half the core APL team. If he hates that, I’m pretty sure he wouldn’t be happy with Sinclair APL.
On Nov 7, 2019, at 19:59, Chris Angelico <rosuav@gmail.com> wrote:
And I do the same with the operators that you disparagingly call "ASCII soup". I touch type them. What's the difference, other than that I can transfer my knowledge of typing English?
Well, there’s also the fact that you can touch type them into Mail.app and pine and a StackOverflow answer box and a blog comment and a general purpose text editor and get the exact same result you get in PyCharm or PyDev. That’s a pretty huge difference, which the OP is ignoring.
On Fri, Nov 8, 2019 at 11:16 AM Andrew Barnert via Python-ideas <python-ideas@python.org> wrote:
On Nov 7, 2019, at 19:59, Chris Angelico <rosuav@gmail.com> wrote:
And I do the same with the operators that you disparagingly call "ASCII soup". I touch type them. What's the difference, other than that I can transfer my knowledge of typing English?
Well, there’s also the fact that you can touch type them into Mail.app and pine and a StackOverflow answer box and a blog comment and a general purpose text editor and get the exact same result you get in PyCharm or PyDev. That’s a pretty huge difference, which the OP is ignoring.
Well - yes. However, if I were to have need of regularly typing certain tokens, I would set them up with my Compose key, which works with (nearly) every X11 app. But not everyone has a convenient way to type arbitrary characters across all apps. ChrisA
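For anyone curious, a minimal sketch of what such a Compose setup can look like in ~/.XCompose (the sequences chosen here are just illustrative, and whether they take effect depends on the input method honouring the file):

    # ~/.XCompose -- illustrative sequences only
    include "%L"                        # keep the locale's default sequences

    <Multi_key> <less> <minus>  : "←"   # Compose, <, - gives a left arrow
    <Multi_key> <colon> <equal> : "≔"   # Compose, :, = gives a colon-equals sign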
Andrew Barnert via Python-ideas writes:
On Nov 7, 2019, at 19:59, Chris Angelico <rosuav@gmail.com> wrote:
And I do the same with the operators that you disparagingly call "ASCII soup". I touch type them. What's the difference, other than that I can transfer my knowledge of typing English?
Well, there’s also the fact that you can touch type them into Mail.app and pine and a StackOverflow answer box and a blog comment and a general purpose text editor and get the exact same result you get in PyCharm or PyDev. That’s a pretty huge difference, which the OP is ignoring.
I don't think there would be a difference nowadays. I type Japanese encoded in Unicode into Mail.app and Emacs and Nano and web forms all the time; they're perfectly happy with Unicode. The question is what will the *audience* do with those unfamiliar symbols? Steve
On Nov 10, 2019, at 08:00, Stephen J. Turnbull <turnbull.stephen.fw@u.tsukuba.ac.jp> wrote:
I don't think there would be a difference nowadays. I type Japanese encoded in Unicode into Mail.app and Emacs and Nano and web forms all the time; they're perfectly happy with Unicode.
Sure, they’re all happy with Unicode; the question is how you type it. On a Mac, I can use a system input method to type Japanese characters, and it works the same way in every app (including the console), but iOS and Windows and X11 have different input methods, and if I sit down at someone else’s laptop it’s not likely to have the IM configured. (Or I can use emacs native IMs, which work across platforms, but only work within emacs.) But it’s worse than that. MacOS, iOS, Android, Windows, and most Linux distros come with a Japanese IM that you just have to know how to enable and configure, but I don’t think any of them come with an APL symbol IM. So if I want to type APL source code, I have to find a third-party IM, or maybe even write one. (Or I have to buy an APL IDE or configure emacs, and not be able to type code anywhere else, including on my phone.) And that wouldn’t change if Python used the APL arrow for assignment. I’m sure PyCharm would have a shortcut for it, and anyone who uses emacs could figure out how to set that up, and Pythonista would have an arrow key on its bar above the virtual keyboard, but none of that would enable me to type Python code into an email message or a StackOverflow answer or even a terminal REPL session.
The question is what will the *audience* do with those unfamiliar symbols?
If we’re talking about APL-like symbol density, that’s an issue. But just adding the arrow for assignment wouldn’t commit us to that. It’s possible that the happy medium of readability lies somewhere between Python and APL, and that’s what Python would approach over the next decade. (Python already approximates a hybrid between English pseudocode and math notation; it would just be moving along that spectrum.) And the audience would have no problem with that—novices would learn the arrow for assignment in the very first lesson, and they’d learn how to use iota instead of range (and the best practices for when to do so) a few lessons in, and so on. If it really is more readable, people would learn to read it. Of course it’s arguable that Python is already so close to the sweet spot that there’s no benefit to be gained there. But I don’t think that’s something that’s immediately self-evident. If it weren’t for the input problem, arrow might well be better than equals, a few extra operators might be helpful, etc. It’s the input that dooms that possibility, not the readability.
Martin Euredjian via Python-ideas writes:
Another interesting example comes from some of my work with real time embedded systems. There are plenty of cases where you are doing things that are very tightly related to, for example, signals coming into the processor through a pin; by this I mean real-time operations where every clock cycle counts. One of the most critical aspects of this for me is the documentation of what one is doing and thinking. And yet, because we insist on programming in ASCII we are limited to text-based comments that don't always work well at all. In my work I would insert a comment directing the reader to access a PDF file I placed in the same directory, often containing an annotated image of a waveform with notes.
This has nothing to do with representation or input via text, though. Emacs has displayed formatted output along with or instead of code since the mid-90s or so, invariably based on a plain-text protocol and file format. (The point is not that Emacs rulez, although it does. ;-) It's that a program available to anyone on commodity hardware could do it. I imagine the capability goes back to the Xerox Alto in the mid-70s.)

It would be amazing if we could move away from text-only programming and integrate a rich environment where such documentation could exist and move with the code.

We've been *able* to do so for decades (when was WYSIWYG introduced?), and even if you hate Emacs, there are enough people and enough variation among them that if this were really an important thing, it would be important in Emacs. It's not. Note that I do not deny that it is a *thing*. I just can't agree that it would be amazing -- I would be neither surprised nor particularly overwhelmingly enabled by it, and I don't know anybody else who would find it especially enabling. (Again, I don't deny that *you* would.)
The famous one-liner solutions are not neat because they are on-liners, they are interesting because they become idioms, words, with a meaning. Your brain sees that and knows what that line is doing.
This is absolutely true. As you point out earlier, it's also true of *fingers*, and it's really no harder to type the ligature "<-" than it is to type the chord "Alt-Meta-DoubleBucky-<" (unless you're a professional pianist). I agree with those who say that "←" is hard to read, but I think that's a font issue: we should fix the fonts. Keyboards are harder. Labels aren't enough: it really annoys me when I have to look at the keyboard for some rarely typed Japanese characters whose Roman input format I don't know.
Again, this isn't what happens to a newbie, of course.
Of course it does. That's exactly why I have always consistently flubbed certain phrases in old hymns translated to Japanese: the use of particles (suffixes equivalent to English prepositions) has changed over time. I don't know how natives handle it, but to me it's really difficult precisely because I have a limited repertoire of idioms, I can't even read the older versions easily, even though they use the modern glyphs. (Bach German lyrics in Fraktur are perversely easier *because* I don't have any German idioms!) Japanese members of my choirs are sympathetic, but they also don't understand why I don't "get better". *They* catch on fast enough, and my Japanese is good enough to occasionally fool people on the phone. :-)
The closest Python example of this I can provide would be list comprehensions you reach for all the time.
I think even closer is the walrus operator itself. But there are Python counterexamples, as well. Some of them are arguably compromises that satisfy nobody (lambdas that can wrap expressions but not suites). Others are debatable (I don't know how he feels now, but there was a time that Guido wrote he wished he could deprecate augmented assignments in favor of a compiler optimization), and Guido has never been able to accept increment operators, though for at least the first decade I participated here they were a frequent request.

I would offer the Lisp family as a higher-level counterexample. In Lisp programming, I at least tend to avoid creating idioms in favor of named functions and macros, or flet and labels (their local versions). (Curiously, I guess, I do make liberal use of lambdas as arguments to functionals rather than use a label.)

I'm beginning to regret using the words "idiom" and "counterexample" (even though they were introduced by others), but I don't have better ones. By "counterexample", I think that what I'm trying to argue here is that yes, human beings do have the ability, and a very useful ability, to create and use idioms. But these idioms can also be usefully compressed into names in some contexts. The choice of which to use is a matter of style, of course, but also a matter of audience. Use of names is more compact, and among "experts" (ie, those who already know and use those names) they can express great complexity in a form compact enough to hold in one's head.

The problem with the word "idiom" is that in natural languages idiomatic expressions need to be explained. They carry meaning beyond the literal words and grammar used. What's different about programming "idioms" is that *they do not need to be explained* -- they carry exactly the meaning the language gives them. They still have the advantage that they're read as "words" by experts, yet they're intelligible to those who have not yet acquired them as idioms. In that sense, computer idioms are more powerful than those of natural language!

Python tries to compromise by providing limited vocabulary and syntax with powerful naming facilities, ie, functions and classes. I'm not sure that allowing additional math operators from the Unicode repertoire would be such a bad thing, but such compromises are by nature delicate balances. I think it would be better to create a separate Python-like language with the feature. That's been quite successful in the Lisp family, I think. (FVO of "success" being "allows the community to evolve the language without destroying it", not "corners the market in cool kids who use the language". ;-)

Regards,
Steve
On Nov 10, 2019, at 08:23, Stephen J. Turnbull <turnbull.stephen.fw@u.tsukuba.ac.jp> wrote:
It would be amazing if we could move away from text-only programming and integrate a rich environment where such documentation could exist and move with the code.
We've been *able* to do so for decades (when was WYSIWYG introduced?), and even if you hate Emacs, there are enough people and enough variation among them that if this were really an important thing, it would be important in Emacs. It's not.
Well, we know that some of this really is important because thousands of people are already doing it with Jupyter/iPython notebooks, and with Mathematica/Wolfram, and so on. Not only can I paste images in as comments between the cells, I can have a cell that displays a graph or image inline in the notebook, or (with SymPy) an equation in nice math notation. And all of that gets saved together with the code in the cells. I wouldn’t use it for all my code (it’s not the best way to write a web service or an AI script embedded in a game), but I believe many people who mostly do scientific computing do. But that’s how we know that we don’t need to change our languages to enable such tools. Anyone who would find it amazing is already so used to doing it every day that they’ve ceased to be amazed.
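As a minimal sketch of that kind of inline rich output, assuming a Jupyter/IPython notebook cell ("waveform.png" is a stand-in filename, not a real file from anyone's project):

    # In a notebook, display() renders rich objects inline, right next to
    # the code that produced them.
    from IPython.display import display, Image, Math
    import sympy as sp

    sp.init_printing()
    x = sp.symbols("x")
    display(sp.integrate(sp.sin(x) ** 2, x))                  # typeset by SymPy
    display(Math(r"x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}"))  # raw LaTeX, rendered
    display(Image("waveform.png"))      # e.g. an annotated scope capture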
I would offer the Lisp family as a higher-level counterexample. In Lisp programming, I at least tend to avoid creating idioms in favor of named functions and macros, or flet and labels (their local versions).
Haskell might be an even more relevant example. In Haskell, you can define custom operators as arbitrary strings of the symbol characters. You can also use any function as an infix operator by putting it in backticks, and pass any operator around as a function by putting it in parens, so deciding what to call it is purely a question of whether you want your readers to see map `insert` value or map |< value. (Well, almost; you can’t control the precedence of backticked functions.)

And that often involves a conscious tradeoff between how many times you need to read the code in the near future vs. how likely you are to come back to it briefly after 6 months away from it, or between how many people are going to work on the code daily vs. how many will have to read it occasionally. The |< is more readable once you internalize it, at least if you’re doing a whole lot of inserting and chaining it up with other operations. But once it’s gone out of your head, it’s one more thing to have to relearn before you can understand the code.

Python effectively makes that choice for me, erring on the side of future readability. I think that’s very close to the right choice 80% of the time—and not having to make that choice makes coding easier and more fluid, even if I sacrifice a bit the other 20% of the time. This is similar to other tradeoffs that Haskell lets me make but Python makes for me (e.g., point-free or lambda—Python only has the lambda style plus explicit partial).

Of course Haskell uses strings of ASCII symbols for operators. If you used strings of Unicode symbols, that would probably change the balance. I could imagine that there would be cases where no string of ASCII symbols will make sense to me in 6 months but a Unicode symbol would. (But only if we could ignore the input problem—which we can’t.) I don’t think the set of APL symbols is particularly intuitive in that sense. (If I were using it every day, a special rho variation character meaning reshape would make sense, but if I hadn’t seen it in 6 months it would be no more meaningful than any Haskell operator line noise.) But maybe there are symbols that would be more intuitively meaningful, or at least that would be closer, enough so to make the tradeoff worth considering.
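For the Python side of that tradeoff, a minimal sketch of the two spellings being contrasted, explicit partial application versus a lambda (illustrative only):

    from functools import partial
    import operator

    nums = [3, 1, 4, 1, 5]

    # Explicit partial application: about as "point-free" as Python gets.
    add10 = partial(operator.add, 10)

    # The lambda spelling of the same thing.
    add10_lambda = lambda n: n + 10

    assert list(map(add10, nums)) == list(map(add10_lambda, nums)) == [13, 11, 14, 11, 15]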
This has nothing to do with representation or input via text
It does, it's an extension of the reality that, after so many decades, we are still typing words on a text editor. In other words, my comment isn't so much about the mechanics and editors that are available as about the fact that the way we communicate and define the computational solution of problems (be it to other humans or to the machine that will execute the instructions) is by typing text into some kind of a text editor. When I say "text" I mean "a through z, numbers and a few symbols that were found on mechanical typewriters in the 1960's". My shorthand for that is ASCII, which isn't quite accurate in that the symbols in the blocks whose most significant bits are "000" and "001" (in 7-bit ASCII) are not used other than CR, LF and HT.

So, for the most part, programming, for the last 60 years or so --over half a century-- has been limited to the characters and symbols found on a 60 year old typewriter. For some reason this evokes the lyrics from a Pink Floyd song, "Got thirteen channels of sh** on the T.V. to choose from". The advances in computation since the 1960's have been immense, and yet we pretend that it is OK to limit ourselves to a 60 year old keyboard in describing and programming the next generation of AI systems that will reach unimaginable levels of complexity, power and capabilities.

As I have mentioned in another comment, having had this experience, I fully understand how people who have not had the benefit of communicating with computers, not just symbolically, but through a very different paradigm as well, simply cannot see what I am describing. It's hard to find an analogy that can easily represent this without some common shared perspective. I found that music can be that tool. Of course, that requires classical training at a level sufficient to, for example, read and "see" the music when presented with a score.

Now, it's easy to say "I can do that" when presented with something like this and maybe have a rudimentary understanding of it:

https://www.youtube.com/watch?v=MeaQ595tzxQ

It is something quite different when presented with something like this, without a "play" button, even if annotated:

http://buxtonschool.org.uk/wp-content/uploads/2017/04/Annotated-Bach-Branden...

I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like trying to engage with a religious person while trying to discuss the lack of evidence for the existence of supernatural beings. They "know" what they "know" and it is a very rare case that someone actually gets out of that box and comprehends what you are saying.

BTW, there are some interesting efforts out there, like this:

https://www.youtube.com/watch?v=1iTPLgfmFdI

Once you dig into these truly interesting examples you end up discovering that notation still has a significant edge.

-Martin
Martin, I think one thing you need to realize is that just being a better idea doesn't make it easy to get implemented. There is a LOT of inertia in infrastructure, and overcoming it can be nearly impossible.

A great example of that is your typical keyboard: if you were to create a new keyboard now, this is absolutely not how you would want to organize it, because it really seems illogical and inefficient. In fact, it was DESIGNED to be inefficient (that was one of its design goals, to slow typesetters down to be slower than the machine they were working on). Why do we still use it? There have been numerous attempts to replace it, and the answer is the inertia of infrastructure. It would cost WAY too much to just scrap all the existing keyboards and replace them with new ones (both materially and in training), and the gains aren't good enough, and the costs are still too high to try to phase in a transition. (Not all systems will support the new input method, as there are costs to supporting it, so demand needs to be proved; but if you still need to keep the skill of using a classical keyboard, a better keyboard isn't going to be that much better, so it isn't worth the effort.)

One big issue in programming is that programmers have gotten used to being able to use a number of different tools for programming (a given programmer doesn't use many, but many are used by different programmers). This is part of the existing infrastructure. This infrastructure is largely based on simple 'ASCII' input. To make a language be based on lots of different characters not on the keyboard, it needs to either restrict the user to a small set of tools that support the language, or there needs to be a common input method that is available in all the common tools. APL got away with this because it started out that way, and started before people got used to having a variety of tools. It is also one reason it is relegated to being a niche language.

The infrastructure argument basically makes it very hard for an existing language to move from being 'ASCII' based to being symbolic based.
On 11/11/19, 12:41 PM, Richard Damon wrote:
it was DESIGNED to be inefficient (that was one of its design goals, to slow typesetters down to be slower than the machine they were working on).
This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY -- Greg
On Mon, Nov 11, 2019, at 03:22, Greg Ewing wrote:
This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY
This is a nice rhetorical trick: "Contrary to popular belief, the QWERTY layout was not designed to slow the typist down,[5] but rather to speed up typing by preventing jams." - well *of course* the goal was not to slow down actual production of text, but this does not imply the method by which "speeding up by preventing jams" was to be achieved was not by slowing down the physical process of pressing keys. (And the argument that having keys on alternating hands speeds things up is related to modern touch-typing techniques, and has little to do with the environment in which QWERTY was originally designed).
On 11/11/19 10:10 AM, Random832 wrote:
Yes, Someone on the Internet is wrong! https://xkcd.com/386/

My memory of the full story is that, YES, separating some combinations rather than putting them together let the machines go faster and perhaps let touch typists go faster, but then they often went a bit too fast even then, so many of the common letters were moved off the home row or onto weak fingers to slow the typist down a bit to match the machine. This was the impetus for the development of alternate keyboards, like the Dvorak, which were designed to be faster for a trained typist to use.

This gets to the key point of my comment: even though the Dvorak keyboard has been shown to be superior to the standard QWERTY keyboard in a number of studies (like your claims that a symbolic notation is superior to 'ASCII Soup'), a major hindrance to its adoption is the existing infrastructure. If for some reason every keyboard in the world was destroyed, every driver disappeared, and everyone forgot all their training on keyboard use, the replacement keyboard might well be something like the Dvorak keyboard, but that isn't going to happen. In the same way, perhaps a graphically based language might be the choice if all programming languages and tools disappeared and had to be built up fresh (but in a system reboot like that, simplicity would be important). Due to the infrastructure situation, I don't see an existing language making the jump from being 'ASCII' based to graphics based suddenly.

One option is to develop an environment for programming that is 'graphical' in its entry and display, with the actual program file still in the classical ASCII language, so it still interfaces with the existing tools. As that tool demonstrates improvements in programmer productivity, it would gather more users, and perhaps create the demand for such an environment to be considered 'normal', and thus the language would be able to make moves based on that. The other option is to create a new language, perhaps based on an existing language, built around the new graphical idea. Being a fresh start, it won't be held back by existing infrastructure in its design, just in its availability. Such a language would need to build its following based on its merits, including the fact that its use makes much of the existing tools hard to use with it.

-- Richard Damon
On 11 Nov 2019, at 17:05, Richard Damon <Richard@damon-family.org> wrote:
On 11/11/19 10:10 AM, Random832 wrote:
On Mon, Nov 11, 2019, at 03:22, Greg Ewing wrote:
On 11/11/19, 12:41 PM, Richard Damon wrote:
it was DESIGNED to be inefficient (that was one of its design goals, to slow typesetters down to be slower than the machine they were working on).

This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY

This is a nice rhetorical trick: "Contrary to popular belief, the QWERTY layout was not designed to slow the typist down,[5] but rather to speed up typing by preventing jams." - well *of course* the goal was not to slow down actual production of text, but this does not imply the method by which "speeding up by preventing jams" was to be achieved was not by slowing down the physical process of pressing keys. (And the argument that having keys on alternating hands speeds things up is related to modern touch-typing techniques, and has little to do with the environment in which QWERTY was originally designed).
Yes, Someone on the Internet is wrong! https://xkcd.com/386/
My memory of the full story is that, yes, putting some combinations apart versus putting them together let the machines go faster, and perhaps let touch typists go faster, but typists then often went a bit too fast even so, so many of the common letters were moved off the home row or onto weak fingers to slow the typist down a bit to match the machine. This was the impetus for the development of alternate keyboards, like the Dvorak, which were designed to be faster for a trained typist to use.
This gets to the key point of my comment: even though the Dvorak keyboard has been shown to be superior to the standard QWERTY keyboard in a number of studies (like your claims that a symbolic notation is superior to 'ASCII soup'), a major hindrance to its adoption is the existing infrastructure.
And some studies have shown no or insignificant advantage and the original study was a fraud. I think we can safely let it go. It's way more important that there is a standard.
On 12/11/19 4:10 am, Random832 wrote:
well *of course* the goal was not to slow down actual production of text, but this does not imply the method by which "speeding up by preventing jams" was to be achieved was not by slowing down the physical process of pressing keys.
That wasn't the method, though -- the method was to ensure that frequent letter pairs were separated in the type basket, so that they were less likely to collide with each other when used in quick succession. At least that seems the most plausible explanation to me -- nobody really knows for sure. -- Greg
On Nov 10, 2019, at 20:50, Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
This has nothing to do with representation or input via text
It does, it's an extension of the reality that, after so many decades, we are still typing words on a text editor.
And how else would you want to enter code? APL is words and symbols in a text editor. Python, C++, JS, Haskell, Mathematica, Julia, etc. are also words and symbols in a text editor. The only difference is a couple dozen fewer symbols, and I’m not sure why you think that makes a transformative difference. Meanwhile, unlike APL, some of these languages have options like Jupyter notebooks and similar tools that allow you to organize that text into cells and paste images between the cells or generate them from your code or even include inline live displays, but apparently that doesn’t impress you at all. You’ve agreed graphical programming languages where you connect up components by drawing lines between them are useless for general purpose. So what exactly are you suggesting we should have instead of text? And in what way is experience with APL relevant to it?
In other words, my comment isn't so much about the mechanics and editors that are available as much as the fact that the way we communicate and define the computational solution of problems (be it to other humans or the machine that will execute the instructions) is through typing text into some kind of a text editor.
When I say "text" I mean "a through z, numbers and a few symbols that were found on mechanical typewriters in the 1960's". My shorthand for that is ASCII, which isn't quite accurate in that the set symbols contained in the sets where the most significant bits are "000" and "001" (7 bit ASCII) are not used other than CR, LF and HT.
Right, most programming languages make do with 80-odd characters, while APL uses about 100. Most of the extras being variations on letters. Although actually most languages—including Python, but not including APL—let you use a few thousand other characters for your function names and other identifiers. But apparently that isn’t interesting to you, it’s only those few dozen extra characters being used as builtins that matters. So, why?
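To make that concrete (this is stock Python 3 behaviour per PEP 3131, nothing APL-specific; the identifiers below are invented purely for illustration):

    # Non-ASCII letters are legal in Python 3 identifiers,
    # even though the operators themselves stay plain ASCII tokens.
    π = 3.141592653589793
    半径 = 2.0                  # "radius"
    площадь = π * 半径 ** 2     # "area"
    print(площадь)              # 12.566370614359172

So the character repertoire isn't the bottleneck; the question is only whether extra symbols used as builtins buy anything.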
So, for the most part, programming, for the last 60 years or so --over half a century-- has been limited to the characters and symbols found on a 60 year old typewriter.
And adding another shift key to add one more bank of a couple dozen makes a difference how? And if you want something that can input thousands of characters… well, what would that look like? Have you used CJK keyboards? They don’t have thousands of keys, because nobody could use that with human fingers. Instead, either you have a bunch of extra shifts, or you enter effectively two letters and a number for each character. That’s not any more expressive, it’s slower and clumsier.
As I have mentioned in another comment, having had this experience, I fully understand how people who do not have the benefit of having communicated with computers, not just symbolically, but through a very different paradigm as well, simply cannot see what I am describing. It's hard to find an analogy that can easily represent this without some common shared perspective. I found that music can be that tool. Of course, that requires classical training at a level sufficient to, for example, read and "see" the music when presented with a score.
You keep bringing up music as a comparison, but music notation has far fewer characters, and they’ve been unchanged for even longer than text punctuation. The advantage of music is a 2D notation, not more characters. And the disadvantage of music notation is the same disadvantage of everything besides text: nobody’s come up with a nice way of entering it that’s even remotely smooth enough that it doesn’t force you to think about the editor instead of the music. When I want to generate notation, I don’t use a notation editor, I play something into a sequencer, edit it in the piano roll interface, and then convert to notation and tweak a few last things, because nothing else is usable. And if I want to do it on my phone, I just can’t do it at all. Just like math, where it’s easier to read notation because it’s 2D, but the best way to create that notation is to type in code or AsciiMath or TeX and render that to an equation.
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like trying to engage with a religious person while trying to discuss the lack of evidence for the existence of supernatural beings.
No, what you're trying to do is engage with a Christian by telling him that his silly 2000-year-old trinity of father, son, and holy spirit can't possibly encompass the wonders of the universe, which you know because you worship a 1500-year-old true quadrinity of father, son, second cousin, and holy spirit. All of the wonders of APL (that Python, Haskell, Julia, etc., and especially J, can never approach) come down to a few dozen extra characters that you can type on a keyboard from the 1970s (but not much later) instead of one from the 1960s or today.

It's not that we can't comprehend what you're talking about, it's that we can't believe you seriously think this is a big deal, that somehow those few dozen extra characters mean you're doing a whole different kind of programming. You're talking about weaker versions of the exact same paradigms we already have -- array-based and iterator-based and higher-order programming as found in Python and Haskell and Julia are much more powerful than the limited versions in APL, and more extensible. Sure, I spell a range with 1..20 while in APL you spell it with a variant iota character that I don't know how to type, but why is that better? The Haskell notation is more intuitive to read, and more like mathematical notation, and extends to different starts and steps and even infinite length, and is easier to type. What makes that not "true notation"?
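To put that range point in Python terms rather than Haskell (just a sketch using the ordinary stdlib, nothing APL-specific):

    from itertools import count, islice

    print(list(range(1, 21)))           # 1 through 20
    print(list(range(1, 21, 3)))        # a different start and step
    print(list(islice(count(1), 20)))   # a lazy, effectively unbounded range

No special characters required, and it reads roughly the way you'd say it out loud.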
These thousands of words of repeating claims with weird non sequitur digressions seem to amount to "I wish Python used hard-to-enter Unicode characters instead of words on normal keyboards" as far as I can tell... because human brains, apparently, cannot make sense of the two-character symbol `:=` but could somehow easily comprehend the exact same operation using the back arrow that I have no idea how to enter here.

And yes, I know that Martin said there was some complex chord of keys in one particular IDE that would enter that arrow character. I don't remember what the combo is, but I'm sure it's possible to learn it.

FWIW, as I often point out in similar threads, I actually use vim's conceal plugin so that my Python code looks like it has funny characters inside it. But it doesn't really; the screen just shows those for certain character patterns, and the beautiful ASCII gets saved to disk as regular Python. Martin could easily use that plugin (or something similar for his own editor) to visually transform ':=' into <funny back arrow>. Just install something like that, and know that the keystrokes to enter funny-back-arrow are ':' followed by '='. Simpler than that chording in whatever APL IDE, even.

On Sun, Nov 10, 2019 at 3:53 PM Martin Euredjian via Python-ideas <python-ideas@python.org> wrote:
This has nothing to do with representation or input via text
It does, it's an extension of the reality that, after so many decades, we are still typing words on a text editor. In other words, my comment isn't so much about the mechanics and editors that are available as much as the fact that the way we communicate and define the computational solution of problems (be it to other humans or the machine that will execute the instructions) is through typing text into some kind of a text editor.
When I say "text" I mean "a through z, numbers and a few symbols that were found on mechanical typewriters in the 1960's". My shorthand for that is ASCII, which isn't quite accurate in that the set symbols contained in the sets where the most significant bits are "000" and "001" (7 bit ASCII) are not used other than CR, LF and HT.
So, for the most part, programming, for the last 60 years or so --over half a century-- has been limited to the characters and symbols found on a 60 year old typewriter.
For some reason this evokes the lyrics from a Pink Floyd song, "Got thirteen channels of sh** on the T.V. to choose from". The advances in computation since the 1960's have been immense, and yet we pretend that it is OK to limit ourselves to a 60 year old keyboard in describing and programming the next generation of AI systems that will reach unimaginable levels of complexity, power and capabilities.
As I have mentioned in another comment, having had this experience, I fully understand how people who do not have the benefit of having communicated with computers, not just symbolically, but through a very different paradigm as well, simply cannot see what I am describing. It's hard to find an analogy that can easily represent this without some common shared perspective. I found that music can be that tool. Of course, that requires classical training at a level sufficient to, for example, read and "see" the music when presented with a score.
Now, it's easy to say "I can do that" when presented with something like this and maybe have a rudimentary understanding of it:
https://www.youtube.com/watch?v=MeaQ595tzxQ
It is something quite different when presented with something like this, without a "play" button, even if annotated:
http://buxtonschool.org.uk/wp-content/uploads/2017/04/Annotated-Bach-Branden...
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like trying to engage with a religious person while trying to discuss the lack of evidence for the existence of supernatural beings. They "know" what they "know" and it is a very rare case that someone is actually going to get out of that box and comprehend what you are saying.
BTW, there are some interesting efforts out there, like this:
https://www.youtube.com/watch?v=1iTPLgfmFdI
Once you dig into these truly interesting examples you end up discovering that notation still has a significant edge.
-Martin
-- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th.
These thousands of words of repeating claims with weird non sequitur digressions seem to amount to <snip>
I am done with this thread. It has received nothing but close-minded hostility. Which is fine. I understand. That's the way the world works. I've seen this kind of thing happen in many domains, not just programming.

If I had the power to delete this entire thread, I would. I actually regret daring to suggest there might be a different way to do things, not to change the entire language, but rather to solve the problem cleanly and elegantly with the introduction of a single symbol rather than piling on stuff.

I love Python and will continue to use it, including the walrus operator. Life goes on.

Admin: If you have a way to just delete this entire thread, please do so. It was a waste of time for all involved.

Thank you,

-Martin
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like trying to engage with a religious person while trying to discuss the lack of evidence for the existence of supernatural beings. They "know" what they "know" and it is a very rare case that someone is actually going to get out of that box and comprehend what you are saying.
I am done with this thread. It has received nothing but close-minded
hostility. Which is fine. I understand. That's the way the world works. I've seen this kind of thing happen in many domains, not just programming.
I intend this response in the most friendly and kind way possible:

Approaching discussion in such a manner-- i.e., with the assumption that other people, who see things differently than your (in your view) grounded, logical, thought through, and deeply held standpoint, must "lack" things like experience, training, or comprehension (in the example of your interlocution with religious people) if they continue to differ with you-- could be one reason people have reacted with what you are interpreting as hostility.

People often naturally react in a hostile way when they detect the person they are talking with believes they are lacking in some way.

Furthermore, speaking as a religious person who is coming from a tradition that is deeply introspective, and has grappled for centuries with other points of view, I suggest that disparaging the ability of religious people to "comprehend" your views doesn't make the point you think it does. Might want to find a new example. My two cents.

--- Ricky.

"I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler
Hi folks, moderator here. I’d (strongly) suggest no further replies, unless there’s something Python specific to discuss. I’ll put the list into emergency moderation for a bit. thanks, —titus
On Nov 11, 2019, at 9:05 AM, Ricky Teachey <ricky@teachey.org> wrote:
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works. Frankly, it's almost like trying to engage with a religious person while trying to discuss the lack of evidence for the existence of supernatural beings. They "know" what they "know" and it is a very rare case that someone is actually going to get out of that box and comprehend what you are saying.
I am done with this thread. It has received nothing but close-minded hostility. Which is fine. I understand. That's the way the world works. I've seen this kind of thing happen in many domains, not just programming.
I intend this response in the most friendly and kind way possible:
Approaching discussion in such a manner-- i.e., with the assumption that other people, who see things differently than your (in your view) grounded, logical, thought through, and deeply held standpoint, must "lack" things like experience, training, or comprehension (in the example of your interlocution with religious people) if they continue to differ with you-- could be one reason people have reacted with what you are interpreting as hostility.
People often naturally react in a hostile way when they detect the person they are talking with believes they are lacking in some way.
Furthermore, speaking as a religious person who is coming from a tradition that is deeply introspective, and has grappled for centuries with other points of view, I suggest that disparaging the ability of religious people to "comprehend" your views doesn't make the point you think it does. Might want to find a new example. My two cents.
--- Ricky.
"I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler
On 11/11/2019 17:10:40, C. Titus Brown wrote:
Hi folks,
moderator here. I’d (strongly) suggest no further replies, unless there’s something Python specific to discuss. I’ll put the list into emergency moderation for a bit.
thanks, —titus

Agreed. The OP used APL for a number of years and fell in love with it. He has translated this love into a mystical esteem for "notation" (it's not clear what that means) and a belief that Python ought to use a left arrow instead of the walrus operator, and ultimately instead of the "=" (assignment) symbol. There have been many knowledgeable and insightful posts to this list (much more sophisticated than I could ever have written!) _seriously_ attempting to address his concerns.

Bottom line: modern programmers find the extra symbols used by APL obscure, and by the OP's own admission they take "a year or two" to become familiar. In a nutshell: APL is a dinosaur; the world has moved on. Python is always alive to adopting ideas from other programming languages, but in this case: "We've heard what you say. No thank you."

Rob Cliffe
On 10/11/2019 20:50, Martin Euredjian via Python-ideas wrote:
It does, it's an extension of the reality that, after so many decades, we are still typing words on a text editor. In other words, my comment isn't so much about the mechanics and editors that are available as much as the fact that the way we communicate and define the computational solution of problems (be it to other humans or the machine that will execute the instructions) is through typing text into some kind of a text editor.
You seem to be stuck on the idea that symbols (non-ASCII characters) are inherently more expressive than text, specifically that a single symbol is easier to comprehend and use than a composition of several symbols. This is a lovely theory. Unfortunately it's wrong.

We don't read character by character, it turns out. We read whole lexical units in one go. So '→', ':=' and 'assign' all take the same amount of effort to recognise. What we learn to recognise them as is another matter, and familiarity counts there.

I'm not a cognitive psychologist so I can't point you at any of the relevant papers for this, but I can assure you it's true. I've been through the experiments where words were flashed up on a screen for a fiftieth of a second (eye persistence time, basically), and we could all recognise them perfectly well no matter how long they were. (There probably are limits but we didn't hit them.)

A quirk of my brain is that unlike my classmates I couldn't do the same with numbers -- with a very few exceptions like powers of two, numbers are just collections of digits to me in a way that words *aren't* collections of letters. Yes, I was a mathematician. Why do you ask? :-)

-- Rhodri James *-* Kynesim Ltd
On 11 Nov 2019, at 15:26, Rhodri James <rhodri@kynesim.co.uk> wrote:
On 10/11/2019 20:50, Martin Euredjian via Python-ideas wrote:
It does, it's an extension of the reality that, after so many decades, we are still typing words on a text editor. In other words, my comment isn't so much about the mechanics and editors that are available as much as the fact that the way we communicate and define the computational solution of problems (be it to other humans or the machine that will execute the instructions) is through typing text into some kind of a text editor.
You seem to be stuck on the idea that symbols (non-ASCII characters) are inherently more expressive than text, specifically that a single symbol is easier to comprehend and use than a composition of several symbols. This is a lovely theory. Unfortunately it's wrong.
I'm gonna bet it's correct in some limited cases. Like APL's sort functions. Way easier to understand directly than "ascending" and "descending". Maybe it's just that I'm a non-native English speaker. But I feel the same way towards my native "stigande" and "fallande", so I don't think so.
On 2019-11-10 12:50, Martin Euredjian via Python-ideas wrote:
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works.
Reminds me of the "you can't tell people anything," post: http://habitatchronicles.com/2004/04/you-cant-tell-people-anything/ "What’s going on is that without some kind of direct experience to use as a touchstone, people don’t have the context that gives them a place in their minds to put the things you are telling them." I found the thread interesting despite the many "how to type it?" replies. Don't be too discouraged. -Mike
I implemented this discussed arrow operator in vim with the conceal plugin. This is an example given in PEP 572. It looks perfectly fine. It also does not require ANY change to Python-the-language.

It just means that I can type ':' followed by '=' to get that, rather than type 'Alt+Shift', '2', '1', '9', '0'. So fewer keystrokes. No chording. Easier to type. And what gets saved to disk is good old plain ASCII.

I don't hate how it looks, but I really, really don't get how it's supposed to "transform my thinking about coding" to have a slightly different glyph on screen. I mean, as shown in this example and a previous one I posted a screenshot of, I think it's cute and geeky to use a few math symbols in the same way in my editor. I've been doing that for a few years, and it never got beyond "slightly cute."

https://www.dropbox.com/s/vtwd1grlnml8sz2/Screenshot%202019-11-11%2019.02.58...

On Mon, Nov 11, 2019 at 5:23 PM Mike Miller <python-ideas@mgmiller.net> wrote:
On 2019-11-10 12:50, Martin Euredjian via Python-ideas wrote:
I have found that trying to explain the value of true notation to people who lack the experience and training is always a losing proposition. I'm already regretting having started this thread, simply because I know how this works.
Reminds me of the "you can't tell people anything," post:
http://habitatchronicles.com/2004/04/you-cant-tell-people-anything/
"What’s going on is that without some kind of direct experience to use as a touchstone, people don’t have the context that gives them a place in their minds to put the things you are telling them."
I found the thread interesting despite the many "how to type it?" replies. Don't be too discouraged.
-Mike
On Mon, Nov 11, 2019 at 4:16 PM David Mertz <mertz@gnosis.cx> wrote:
I really, really don't get how it's supposed to "transform my thinking about coding" to have a slightly different glyph on screen.
I agree here. This thread got really caught up in issues like "how do I type that?", but I don't think that was the OP's point. He was arguing for "notations" -- but I, at least, have no idea what that means.

He made two specific proposals in the OP:

1) Use a single "left arrow" symbol in place of the two-ASCII-char := for the assignment expression operator.

2) Eventually phase out the regular old assignment (=) altogether.

These are quite independent, really. But that was a lot of work, and a big old thread, if the point was simply to use one non-ASCII symbol for one operator -- that clearly won't be "transformative". So I *think* the real point of "notations" is really to have a lot more operators, each with its own meaning, *maybe* in place of some operator overloading. See the other recent thread: if you want an operator that means "merge these two dicts", use a new one rather than trying to reuse + -- which has some merit. After all, a number of the objections to the dict addition proposal were that dict merging is pretty different from numerical addition, and thus people might get confused. If you were to introduce a lot more operators, you would get more compact code, and you probably would want to use a wider range of symbols than ASCII provides, so as to avoid the "ASCII soup" referred to.

And I think there is some merit to the "more operators" idea -- that's exactly why @ was added: folks doing matrix calculations really wanted to be able to write A @ B (or A * B) rather than np.dot(A, B) (a minimal sketch of this follows after this message). It's also part of why we have all the other operators, rather than everything being a function call. Would we really want index(a_list, i) or slice(a_sequence, i, j, step)? Making the code look like the math on a blackboard has its advantages.

However, as pointed out in this thread, even in math notation the same (or similar) notation has different meanings in different contexts, making it very hard to take that approach in a general purpose programming language. So there is a limit -- making everything an operator would be worse than ASCII soup; it would be hieroglyphics to most of us, like complex math is to people outside the domain it's used in. I think we need a mixture of operators and named functions, and that Python has the balance about right as it is.

The other issue at hand is overloading vs new operators -- and particularly in a dynamic language, there's something to be said for more operators rather than overloading -- but I'm really not sure about that -- more than a handful more, and I think I'd get very confused, even if I could figure out how to type them.

-CHB

-- Christopher Barker, PhD
Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython
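Here is a minimal sketch of that @ point (a toy class, not NumPy; it only shows that the operator is sugar for a dunder method, so the gain is purely notational):

    class Mat:
        # Toy matrix, just enough to show how '@' dispatches.
        def __init__(self, rows):
            self.rows = rows

        def __matmul__(self, other):
            # naive row-by-column product
            cols = list(zip(*other.rows))
            return Mat([[sum(a * b for a, b in zip(row, col)) for col in cols]
                        for row in self.rows])

    def matmul(a, b):
        # the "named function" spelling of the very same operation
        return a.__matmul__(b)

    A = Mat([[1, 2], [3, 4]])
    B = Mat([[5, 6], [7, 8]])
    print((A @ B).rows)       # [[19, 22], [43, 50]]
    print(matmul(A, B).rows)  # identical result, wordier call site

The computation is identical either way; the operator only buys the blackboard-like spelling, which was the readability argument for adding it in the first place.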
Hi,
I mean, as shown in this example and a previous one I posted a screenshot of, I think it's cute and geeky to use a few math symbols in the same way in my editor. I've been doing that for a few years, and it never got beyond "slightly cute."
I would second this. I find it actually less readable if the font does not provide nice arrows. It reminds me of Scala and the "=>" symbol. The right implication arrow was barely readable in most common Ubuntu fonts.

Cheers,
Marko
On 2019-11-11 16:13, David Mertz wrote:
I implemented this discussed arrow operator in vim with conceal plugin. This is an example given in PEP 572. It looks perfectly fine. It also does not require ANY change to Python-the-language. It just means that I can type ':' followed by '=' to get that, rather than type 'Alt+Shift', '2', '1', '9', '0'. So fewer keystrokes. No chording. Easier to type. And what gets saved to disk is good old plain ASCII.
I like your solution and think it looks great, though perhaps you forgot the space behind it? I'm not a huge fan of how modern Python is putting colons everywhere so this helps a tiny bit.
I don't hate how it looks, but I really, really don't get how it's supposed to "transform my thinking about coding" to have a slightly different glyph on screen.
Probably would need several, as CB mentioned below. Still, debatable.
I mean, as shown in this example and a previous one I posted a screenshot of, I think it's cute and geeky to use a few math symbols in the same way in my editor. I've been doing that for a few years, and it never got beyond "slightly cute."
Guessing there were a few rare curmudgeons who didn't think we needed lowercase letters before ascii and still a few who don't want syntax highlighting either. I realize we're hitting the land of diminishing returns on text, but once features are gained I know I don't want to go back.

For example, I use many useful Unicode symbols in my text strings and console output. Billions of folks are using non-latin alphabets right now because Python3 makes it easy. All modern systems can handle them, why not? And input is not a significant issue, though it depends on the block.
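For instance (made-up data; it does assume a terminal and font that can display the glyphs, which is the "depends on the block" caveat):

    # Unicode in string data and console output needs nothing special in Python 3.
    temps = {"Αθήνα": 24.5, "東京": 18.0, "München": 12.0}
    for city, t in temps.items():
        print(f"{city}: {t} °C")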
Yeah. Maybe I should replace the regex ' *:=' rather than just ':='. That's easy enough with the plugin.

On Tue, Nov 12, 2019, 12:12 PM Mike Miller <python-ideas@mgmiller.net> wrote:
On 2019-11-11 16:13, David Mertz wrote:
I implemented this discussed arrow operator in vim with conceal plugin. This is an example given in PEP 572. It looks perfectly fine. It also does not require ANY change to Python-the-language. It just means that I can type ':' followed by '=' to get that, rather than type 'Alt+Shift', '2', '1', '9', '0'. So fewer keystrokes. No chording. Easier to type. And what gets saved to disk is good old plain ASCII.
I like your solution and think it looks great, though perhaps you forgot the space behind it? I'm not a huge fan of how modern Python is putting colons everywhere so this helps a tiny bit.
I don't hate how it looks, but I really, really don't get how it's supposed to "transform my thinking about coding" to have a slightly different glyph on screen.
Probably would need several, as CB mentioned below. Still, debatable.
I mean, as shown in this example and a previous one I posted a screenshot of, I think it's cute and geeky to use a few math symbols in the same way in my editor. I've been doing that for a few years, and it never got beyond "slightly cute."
Guessing there were a few rare curmudgeons who didn't think we needed lowercase letters before ascii and still a few who don't want syntax highlighting either. I realize we're hitting the land of diminishing returns on text, but once features are gained I know I don't want to go back.
For example, I use many useful Unicode symbols in my text strings and console output. Billions of folks are using non-latin alphabets right now because Python3 makes it easy. All modern systems can handle them, why not? And input is not a significant issue, though it depends on the block.
participants (19)
- Abe Dillon
- Anders Hovmöller
- Andrew Barnert
- C. Titus Brown
- Chris Angelico
- Christopher Barker
- David Mertz
- Greg Ewing
- Marko Ristin-Kaufmann
- Martin Euredjian
- Mike Miller
- MRAB
- Paul Moore
- Random832
- Rhodri James
- Richard Damon
- Ricky Teachey
- Rob Cliffe
- Stephen J. Turnbull