[Python-ideas] make decimal the default non-integer instead of float?

Guido van Rossum guido at python.org
Sat Sep 29 23:06:44 CEST 2012


On Sat, Sep 29, 2012 at 1:34 PM, Calvin Spealman <ironfroggy at gmail.com> wrote:
> I like the idea a lot, but I recognize it will get a lot of pushback. I
> think learning Integer -> Decimal -> Float is a lot more natural than
> learning Integer -> Float -> Decimal. The Float type represents a
> specific hardware acceleration with data-loss tradeoffs, and the use
> should be explicit. I think that as someone learns, the limitations of
> Decimals will make a lot more sense than those of Floats.

Hm. Remember decimals have data loss too: they can't represent 1/3 any
more accurately than floats can, and like floats they are limited to a
certain number of digits after which they begin dropping precision
even if the result *can* be represented exactly. It's just that they
can represent 1/5 exactly, which happens to be culturally important to
humans, and that the number of digits at which loss of precision
happens is configurable. (And the API to configure it may actually
make it more complex to learn.)
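
A minimal sketch of both points with the stdlib decimal module (the
precision of 6 below is just an arbitrary example value):

    from decimal import Decimal, getcontext

    # 1/3 is rounded to the context precision (28 significant digits
    # by default), just as it would be approximated by a binary float.
    print(Decimal(1) / Decimal(3))   # 0.3333333333333333333333333333

    # 1/5, on the other hand, is exact in decimal.
    print(Decimal(1) / Decimal(5))   # 0.2

    # The precision at which rounding kicks in is configurable.
    getcontext().prec = 6
    print(Decimal(1) / Decimal(3))   # 0.333333

    # Digits past the context precision are dropped even when the
    # exact result has a finite decimal representation.
    print(Decimal('1.23456') + Decimal('0.000001'))   # 1.23456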

-- 
--Guido van Rossum (python.org/~guido)
