Why isn't Python king of the hill?

Alex Martelli aleaxit at yahoo.com
Sat Jun 2 13:50:43 EDT 2001


"E. Mark Ping" <emarkp at CSUA.Berkeley.EDU> wrote in message
news:9fb5fl$26kf$1 at agate.berkeley.edu...
    ...
> Really, floating point arithmetic has well-defined semantics; guessing
> and treating them as if they have random components is the lazy and
> error-prone way to use them.

It's ONE lazy and error-prone way; another is failing to keep
the possible errors and approximations in mind at all times
(there are others besides, but those are the main two, I think).
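
For instance, a minimal interactive sketch (assuming the standard
CPython float, i.e. an IEEE 754 double) of the kind of approximation
that is easy to forget about:

>>> 0.1 + 0.2 == 0.3    # neither 0.1 nor 0.2 has an exact binary form
False
>>> 0.1 + 0.2           # the sum is only an approximation of 0.3
0.30000000000000004
>>> abs((0.1 + 0.2) - 0.3) < 1e-9   # compare with a tolerance instead
True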

I think both habits are widespread.  I've been guilty of both (and
of others besides) at various points in my life and career, and I
don't personally know anybody who has done significant numerical
programming and can claim, while looking me in the eye and keeping
a straight face, never to have been guilty of either (there must be
such people somewhere, I'm sure -- just not among my real-life
acquaintances).


Alex
