[Python-ideas] Decimal literal?
python at rcn.com
Thu Dec 4 10:51:08 CET 2008
If decimals are to become built-in, there are a number of things that need to happen, and one of them is a C implementation,
not just for speed, but also to integrate with the parser and the rest of the language.
Last time I looked, the existing C implementations out there were not license compatible with Python. Also, there are other integration
issues to be solved, including that of contexts (which are an integral part of the spec). None of this is a trivial exercise or I
would have already done it. I do want to move decimal towards being a builtin but don't underestimate the difficulty of doing so.
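For readers unfamiliar with contexts, a short illustration of how they permeate the module (standard library `decimal`, using only documented API):

```python
from decimal import Decimal, getcontext, localcontext

# Every Decimal operation consults the current thread-local context,
# which controls precision, rounding, and signal traps.
getcontext().prec = 6
print(Decimal(1) / Decimal(7))     # six significant digits: 0.142857

# Temporarily change precision for just one block of code.
with localcontext() as ctx:
    ctx.prec = 2
    print(Decimal(1) / Decimal(7))  # two significant digits: 0.14
```

Because results depend on this ambient context, a built-in decimal type (and any literal syntax) has to decide how literals and the parser interact with it.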
Also, there are other API issues. As it stands, the decimal module is not friendly to newbies and presents challenges even for
expert users. And don't underestimate the significance of performance -- it is a top reason that people currently avoid the decimal
module and it is an issue for the language itself (lots of companies avoid Python because of its speed disadvantage).
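Two of the newbie traps alluded to above are easy to show (behavior as in modern Python, where constructing a Decimal from a float is permitted):

```python
from decimal import Decimal

# Trap 1: constructing from a float captures the float's exact binary
# value, not the decimal string the programmer had in mind.
print(Decimal(1.1))    # 1.100000000000000088817841970012523233890533447265625
print(Decimal('1.1'))  # 1.1

# Trap 2: Decimal refuses to mix with float in arithmetic.
try:
    Decimal('1.1') + 1.1
except TypeError:
    print("cannot mix Decimal and float")
```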
One other thought: decimal literals are likely not very helpful in real programs. Most apps that have specific numeric
requirements will have code that manipulates numbers read in from external sources and written back out -- the scripts themselves
typically contain very few constants (and those are typically integers), so you don't get much help from a decimal literal.
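To illustrate that last point, here is a hypothetical money-handling snippet: the values arrive as text from an external source, so they go through the Decimal constructor anyway, and a literal syntax would change almost nothing.

```python
import csv
import io
from decimal import Decimal

# Prices arrive as strings from an external source (file, DB, socket);
# the CSV text here stands in for such a source.
data = io.StringIO("item,price\nwidget,19.99\ngadget,0.10\n")

total = Decimal('0')
for row in csv.DictReader(data):
    total += Decimal(row['price'])   # constructed from text, not a literal
print(total)   # 20.09, exact
```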
----- Original Message -----
From: "Chris Rebert" <clp at rebertia.com>
To: "Python-Ideas" <python-ideas at python.org>
Sent: Wednesday, December 03, 2008 11:51 PM
Subject: [Python-ideas] Decimal literal?
> With Python 3.0 being released, and going over its many changes, I was
> reminded that decimal numbers (decimal.Decimal) are still relegated to
> a library and aren't built-in.
> Has there been any thought to adding decimal literals and making
> decimal a built-in type? I googled but was unable to locate any
> discussion of the exact issue. The closest I could find was a
> suggestion about making decimal the default instead of float.
> It seems that decimal arithmetic is more intuitively correct than
> plain floating point and floating point's main (only?) advantage is
> speed, but it seems like premature optimization to favor speed over
> correctness by default at the language level.
> Obviously, making decimal the default instead of float would be
> fraught with backward compatibility problems and thus is not presently
> feasible, but at the least for now Python could make it easier to use
> decimals and their associated nice arithmetic by having a literal
> syntax for them and making them built-in.
> So what do people think of:
> 1. making decimal.Decimal a built-in type, named "decimal" (or "dec"
> if that's too long?)
> 2. adding a literal syntax for decimals; I'd naively suggest a 'd'
> suffix to the float literal syntax (which was suggested in the brief
> aforementioned thread)
> 3. (in Python 4.0/Python 4000) making decimal the default instead of
> float, with floats instead requiring an 'f' suffix
> Obviously #1 & #2 would be shooting for Python 3.1 or later.
> P.S. Yay for the long-awaited release of Python 3.0! Better than can
> be said for Perl 6.
> Follow the path of the Iguana...
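The "intuitively correct" claim in the quoted message is easy to demonstrate with standard behavior:

```python
from decimal import Decimal

# Binary floats cannot represent 0.1 exactly, so sums drift:
print(0.1 + 0.1 + 0.1 == 0.3)                # False
# Decimal arithmetic matches pencil-and-paper expectations:
print(Decimal('0.1') * 3 == Decimal('0.3'))  # True
```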
> Python-ideas mailing list
> Python-ideas at python.org