Steven D'Aprano writes:
And as for programmers... the popularity of one-liners, the obfuscated C competition, code golf, "clever coding tricks" etc. is rarely for the purposes of communication *about code*.
Sure, but *Python* is popular because it's easy to communicate *with* and (usually) *about* Python code, and it does pretty well on "terse" for many algorithmic idioms. (Yes, there are other reasons -- reasonable performance, batteries included, etc. That doesn't make the design of the language *not* a reason for its popularity.) You seem to be understanding my statements to be much more general than they are. I'm only suggesting that this applies to Python as we know and love it, and to Pythonic tradition.
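To give a concrete (and entirely throwaway) example of what I mean by "terse for many algorithmic idioms" -- this is my own illustration, not anything from Steven's message:

    words = ["spam", "Eggs", "ham", "Knights"]

    # The explicit-loop version: perfectly clear, but more tokens to scan.
    shouted = []
    for w in words:
        if w[0].isupper():
            shouted.append(w.upper())

    # The comprehension reads almost like the English description
    # "upper-case every word that starts with a capital":
    shouted_terse = [w.upper() for w in words if w[0].isupper()]

    assert shouted == shouted_terse

The point is not that the loop is bad, just that the comprehension is terse *and* still communicates.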
The major objection is that I think it's still too hard to expect the average programmer to be able to produce the λ symbol on demand. We don't all have a Greek keyboard :-)
So what? If you run Mac OS X, Windows, or X11, you do have a keyboard capable of producing Greek. And the same chords work in any Unicode-capable editor; it's just that the Greek letters aren't printed on the keycaps. Neither are emoticons, nor the CUA gestures (bucky-X[1], bucky-C, bucky-V, and the oh-so-useful bucky-Z), but those are everywhere. Any 10-year-old can find them somehow!

To the extent that Python would consider such changes (i.e., a half-dozen or so one-character replacements for multicharacter operators or keywords; the postscript below sketches the kind of thing I mean), learning to type them would be very nearly as easy as learning to read them. The problem (if it exists, of course -- obviously I believe it does, but YMMV) is all about overloading people's ability to perceive the meaning of code without reading it token by token.

Footnotes:
[1] Bucky = Control, Alt, Meta, Command, Option, Windows, etc. keys.
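P.S. For concreteness, here is the sort of one-character substitution I have in mind. The λ spelling is purely hypothetical -- it is *not* valid Python today -- but note that Python 3 already accepts Greek letters in identifiers (PEP 3131), so typing them is the only real hurdle:

    # What we write now, with the multicharacter keyword:
    adder = lambda x, y: x + y

    # The hypothetical one-character replacement (NOT valid Python):
    #   adder = λ x, y: x + y

    # Greek letters are already legal Python 3 identifiers, so the real
    # question is about typing them and reading them at a glance:
    λ = 0.5
    print(adder(1, 2), λ)   # -> 3 0.5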