lars van gemerden
lars at rational-it.com
Fri Jul 6 13:22:18 CEST 2012
On Sunday, July 1, 2012 5:48:40 PM UTC+2, Evan Driscoll wrote:
> On 7/1/2012 4:54, Alister wrote:
> > On Sat, 30 Jun 2012 23:45:25 -0500, Evan Driscoll wrote:
> >> If I had seen that in a program, I'd have assumed it was a bug.
> > You would?
> > I have only been using Python for 6-12 months, but in my past I
> > programmed microcontrollers in assembly.
> > As soon as I saw it, I understood it and thought: great, like a
> > light bulb going on.
> It's not a matter of "I wouldn't have understood what the author was
> trying to do" (with a small caveat); it's a matter of "I'd have assumed
> that the author didn't understand the language semantics."
> I'm pretty sure it goes contrary to *every other programming language
> I've ever used* (and a couple that I haven't).
> I understand why Python does it, and it certainly is nice in that it
> matches typical mathematical notation, but the surprise quotient is
> *very* high in the PL world IMO.
Wouldn't avoiding surprises mean we could never improve languages, only reshuffle existing features?
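
From the surrounding remarks (mathematical notation, high surprise quotient), the feature under discussion appears to be Python's comparison chaining; that reading is an assumption from context. A minimal sketch of the difference, with purely illustrative variable names:

    # Python chains comparisons: a < b < c means
    # (a < b) and (b < c), with b evaluated only once.
    a, b, c = 1, 5, 3
    print(a < b < c)    # False: 1 < 5 holds, but 5 < 3 does not

    # C-family languages instead group left to right:
    # (a < b) < c  ->  (1 < 5) < 3  ->  1 < 3  ->  true.
    # Forcing that grouping in Python reproduces the surprise:
    print((a < b) < c)  # True: (1 < 5) is True, and True < 3 is 1 < 3

The first print matches mathematical notation; the second shows the left-to-right grouping a reader coming from C or assembly would likely expect.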