
On Sun, Jul 22, 2018 at 11:35 PM, Giampaolo Rodola' <g.rodola@gmail.com> wrote:
> On Sun, Jul 22, 2018 at 2:10 PM Steven D'Aprano <steve@pearwood.info> wrote:
>> On Sun, Jul 22, 2018 at 12:13:04PM +0200, Giampaolo Rodola' wrote:
>>> On Sun, Jul 22, 2018 at 3:55 AM Steven D'Aprano <steve@pearwood.info> wrote: [...]
>>>> I don't think that "+" is harder to read than "standard_mathematics_operators_numeric_addition"
>>> Please let's drop the argument that + - * / = and ? are the same.
>> [...] But if we insist that every symbol we use is instantly recognisable and intuitively obvious to every programmer, we're putting the bar for acceptance impossibly high.
> I personally don't find "a ?? b" too bad (let's say I'm -0 about it) but idioms such as "a?.b", "a ??= b" and "a?[3] ?? 4" look too Perl-ish to me, non pythonic and overall not explicit, no matter what the chosen symbol is gonna be.

Please explain what is not explicit about it. "a?.b" is very simple and perfectly explicit: it means "None if a is None else a.b".

What does "not explicit" mean, other than "I don't like this code"?

ChrisA
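
For readers skimming the thread, here is a rough sketch of how the PEP 505 spellings under discussion would map onto today's Python. The names (cfg, fallback, ports) are invented for illustration; the ?? / ??= / ?. / ?[ spellings themselves are the PEP's proposals, not syntax you can run.

import types

cfg = None                                  # may or may not have been set
fallback = types.SimpleNamespace(host="localhost")

# cfg ?? fallback   ->  cfg if cfg is not None else fallback
effective = cfg if cfg is not None else fallback
print(effective.host)                       # localhost

# cfg ??= fallback  ->  assign fallback to cfg only when cfg is None
if cfg is None:
    cfg = fallback

# cfg?.host         ->  None if cfg is None else cfg.host
host = None if cfg is None else cfg.host
print(host)                                 # localhost (cfg was filled in above)

# ports?[3] ?? 8080 ->  ports[3] when ports and that item are not None,
#                       otherwise 8080
ports = None
item = ports[3] if ports is not None else None
port = item if item is not None else 8080
print(port)                                 # 8080

Note that these hand-written equivalents evaluate the left-hand expression twice, which the proposed short-circuiting operators would avoid; for a bare name like cfg the difference is invisible.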