"no variable or argument declarations are necessary."
chris.cavalaria at free.fr
Mon Oct 3 17:42:02 CEST 2005
Steven D'Aprano wrote:
> On Mon, 03 Oct 2005 06:59:04 +0000, Antoon Pardon wrote:
>>Well I'm a bit getting sick of those references to standard idioms.
>>There are moments those standard idioms don't work, while the
>>gist of the OP's remark still stands like:
>  egold = 0
>  while egold < 10:
>      if test():
>          ego1d = egold + 1
> for item in [x for x in xrange(10) if test()]:
> But it isn't about the idioms. It is about the trade-offs. Python allows
> you to do things that you can't do in other languages because you
> have much more flexibility than is possible with languages that
> require you to declare variables before using them. The cost is, some
> tiny subset of possible errors will not be caught by the compiler. But
> since the compiler can't catch all errors anyway, you need to test for
> errors and not rely on the compiler. No compiler will catch this error:
> x = 12.0 # feet
> # three pages of code
> y = 15.0 # metres
> # three more pages of code
> distance = x + y
> if distance < 27:
> And lo, one multi-billion dollar Mars lander starts braking either too
> early or too late. Result: a new crater on Mars, named after the NASA
> employee who thought the compiler would catch errors.
> Declared variables have considerable labour costs, and only marginal
> gains. Since the steps you take to protect against other errors will also
> protect against mistyping variables, declaring variables is of
> little practical benefit.
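To make the quoted example concrete: the loop assigns ego1d where egold was meant, so egold never reaches 10 and the loop never terminates. This is a minimal sketch of that bug; the test() stub and the iteration cap are assumptions added here purely so the sketch terminates and the failure is observable.

```python
def test():
    # hypothetical stand-in for the OP's test(); always fires
    return True

egold = 0
iterations = 0
while egold < 10:
    if test():
        ego1d = egold + 1   # the typo: should have been egold
    iterations += 1
    if iterations > 1000:   # safety cap, not in the original example
        break

print(egold)   # still 0: the mistyped name silently made an infinite loop
```

No compiler or declaration is needed to catch this in practice: any test asserting that the loop makes progress would fail immediately, which is Steven's point about testing rather than relying on the compiler.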
As a matter of fact, doing that one on an HP48 calculator with
unit-annotated values would have worked perfectly, except for the
distance < 27 check, which would have raised an error.
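The HP48 behaviour can be sketched in Python with a tiny unit-carrying class: adding feet to metres reconciles the units automatically, while comparing against a bare 27 fails because 27 carries no unit. The Quantity class and conversion table below are illustrative assumptions, not a real unit library.

```python
FACTORS = {"m": 1.0, "ft": 0.3048}  # metres per unit

class Quantity:
    def __init__(self, value, unit):
        self.value, self.unit = value, unit

    def to_metres(self):
        return self.value * FACTORS[self.unit]

    def __add__(self, other):
        # mixed units are converted; the result is expressed in metres
        return Quantity(self.to_metres() + other.to_metres(), "m")

    def __lt__(self, other):
        if not isinstance(other, Quantity):
            raise TypeError("cannot compare Quantity with a bare number")
        return self.to_metres() < other.to_metres()

x = Quantity(12.0, "ft")   # feet
y = Quantity(15.0, "m")    # metres
distance = x + y           # 12 ft = 3.6576 m, so 18.6576 m in total

try:
    distance < 27          # bare 27 has no unit, so this raises
except TypeError as e:
    print("error:", e)
```

So the addition itself is safe; only the unitless comparison is rejected, which matches what the HP48 would have done.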