[Python-ideas] Decimal literal?

Chris Rebert clp at rebertia.com
Thu Dec 4 11:10:33 CET 2008


On Thu, Dec 4, 2008 at 1:56 AM, Raymond Hettinger <python at rcn.com> wrote:
> From: "Cesare Di Mauro" <cesare.dimauro at a-tono.com>
>>
>> But at least it will be more usable to have a short-hand for decimal
>> declaration:
>>
>> a = 1234.5678d
>
> How often do you put non-integer constants in real programs?
> Don't you find that most real decimal apps start with external
> data sources instead of all the data values being hard-coded
> in your program?

In all fairness, by that same argument we shouldn't have float
literals either, yet we do. They're useful in scripts where things
are hardcoded. Later, the script grows and we end up reading the
numbers in from external sources, but that doesn't mean the initial
version wasn't useful. Literals help when writing proofs-of-concept
and rapid prototypes, areas where Python has historically done well.

Java's designers probably used similar arguments against hard-coding
when deciding not to include collection literals; Python, meanwhile,
does have such literals, and they are among its most cherished
features. The parallels to the decimal situation are striking.

Having decimal literals as well would at least keep things
consistent: sets are arguably used less often than decimals, yet they
now have literals, so why not decimals too?
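
To make the comparison concrete, here is a minimal sketch of what the
proposed literal would buy. The "1234.5678d" spelling is the
hypothetical syntax from the quoted message, not real Python;
everything else is the standard library's existing decimal module:

    from decimal import Decimal

    # A float literal is rounded to a binary double before we ever
    # see it, so decimal-looking arithmetic quietly goes wrong:
    print(0.1 + 0.1 + 0.1 == 0.3)          # False

    # Exact decimal arithmetic exists today, but only via the string
    # constructor, which is verbose for hardcoded constants:
    three_tenths = Decimal("0.1") + Decimal("0.1") + Decimal("0.1")
    print(three_tenths == Decimal("0.3"))  # True

    # The proposed literal would just be shorthand for the string
    # form, much as {1, 2, 3} is now shorthand for set([1, 2, 3]):
    # a = 1234.5678d    ==    a = Decimal("1234.5678")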

Cheers,
Chris

-- 
Follow the path of the Iguana...
http://rebertia.com
