[Python-ideas] Decimal literal?

Cesare Di Mauro cesare.dimauro at a-tono.com
Thu Dec 4 10:45:41 CET 2008


On 04 December 2008 at 10:37 AM, Adam Olsen <rhamph at gmail.com> wrote:

> Intuitively, you'd think it's more correct, but for non-trivial usage
> I see no reason for it to be.  The strongest arguments on [1] seem to
> be controllable precision and stricter standards.  Controllable
> precision works just as well in a library.  Stricter standards (ie
> very portable semantics) could be done with base-2 floats via software
> emulation on all platforms (and throwing performance out the window).
>
> Do you have some use cases that are (completely!) correct in decimal,
> and not in base-2 floating point?  Something non-trivial (emulating a
> schoolbook, writing a calculator, etc.).
>
> I see Decimal as a modest investment for a mild return.  Not worth the
> effort to switch.

But at least it would be useful to have a shorthand for declaring
decimals:

a = 1234.5678d

is simpler than:

import decimal
a = decimal.Decimal('1234.5678')

or:

from decimal import Decimal
a = Decimal('1234.5678')
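
And a literal would presumably behave like the string constructor,
which carries the exact decimal value rather than the nearest base-2
float. A minimal illustration (standard decimal module, Python 2.4+):

from decimal import Decimal

# Binary floats accumulate representation error:
0.1 + 0.1 + 0.1 == 0.3    # False

# Decimals built from strings keep the exact decimal value:
Decimal('0.1') + Decimal('0.1') + Decimal('0.1') == Decimal('0.3')
# True

So a literal like 0.1d would read naturally and still mean exactly 0.1.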

Cheers
Cesare


