[Python-ideas] Decimal literal?

Leif Walsh leif.walsh at gmail.com
Fri Dec 5 05:41:35 CET 2008


On Thu, Dec 4, 2008 at 5:09 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> We have one by many definitions of 'accurate'.  Being off by a few or even a
> hundred parts per quintillion is pretty good by some standards.

I agree.  That's why I don't think the decimal module should be the
"default implementation".

> I disagree.

Okay.  "Perfectly accurate" then.

> The notion that decimal is more 'accurate' than float needs a lot of
> qualification.  Yes, it is intended to give *exactly* the answer to various
> financial calculations that various jurisdictions mandate, but that is a
> rather specialized meaning of 'accurate'.

You've said what I mean better than I could.  The float implementation
is more than good enough for almost all applications, and it seems
ridiculous to me to slow them down for the precious few that need more
precision (and, at that, just don't want to type quite as much).
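
To make the tradeoff concrete, a rough sketch (not from the thread; the
exact timings depend on the interpreter and on whether a C-accelerated
decimal implementation is available):

    from decimal import Decimal
    import timeit

    print(0.1 + 0.2)                        # 0.30000000000000004 (binary rounding)
    print(Decimal('0.1') + Decimal('0.2'))  # 0.3 (exact decimal arithmetic)

    # Rough relative cost of a single addition in each representation.
    print(timeit.timeit('a + b', setup='a, b = 0.1, 0.2'))
    print(timeit.timeit('a + b',
                        setup="from decimal import Decimal; "
                              "a, b = Decimal('0.1'), Decimal('0.2')"))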

-- 
Cheers,
Leif
