[Python-ideas] Decimal literal?

Stephen J. Turnbull stephen at xemacs.org
Thu Dec 4 11:43:43 CET 2008


Chris Rebert writes:

 > We're talking about adding a feature, not taking speed away.

OK, that's reasonable.  But adding features is expensive.  BTW, don't
listen to me, I've never done it.  Listen to Raymond.

 > If anything, this would increase adoption of Python as people
 > writing programs that use decimals extensively would be able to use
 > decimals with greater ease.

Maybe.  I don't see a huge advantage of

1.1d    # some such hypothetical literal spelling
over

import decimal

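Spelled out, the status quo is one import plus a constructor call.  A
minimal sketch of what people write today (the 1.1d spelling above is
hypothetical):

    import decimal

    price = decimal.Decimal("1.1")   # build from a string so the value is exact
    total = price * 3                # arithmetic stays in Decimal: exactly 3.3
    print(total)                     # -> 3.3
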
I also think that most of the (easy) advantage of Decimal will accrue
to people who *never* have to deal with measurement error:
accountants.  But oops! they don't need Decimal per se; they're
perfectly happy with big integers (a ledger kept in whole cents never
rounds).  People who really *do* need Decimal are not going to be
deterred by 15 characters (counting the newline<wink>); they're
already into real pain.
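
For instance (my sketch, not anything proposed in this thread), a
ledger kept in integer cents:

    # Money as an integer count of cents: Python ints are arbitrary
    # precision, so sums never overflow and never round.
    subtotal = 1999 * 3                # three items at $19.99 each
    tax = (subtotal * 8 + 50) // 100   # 8% tax, rounded to the nearest cent
    total = subtotal + tax
    print("$%d.%02d" % (total // 100, total % 100))   # -> $64.77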

 > Additionally, your argument can be turned on its head ;-) Consider:
 > Does perfect accuracy matter quite *that* critically in most
 > everyday programs?  Of course not.  But that's the wrong question.
 > Python is a *general-purpose* programming language, not an
 > "everyday application where accuracy isn't critical programming
 > language".  There are plenty of applications that just cry
 > out<wink> for a Python implementation where it does matter.

I think you've misspelled "precision".<wink>  Decimal buys exact
*representation* of the digits you feed it; it says nothing about how
close those digits are to the true value.  Improved accuracy cannot
be achieved simply by adding a new number type.
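
A short sketch of the distinction, using only the stdlib decimal
module: the precision problem yields, the accuracy problem does not:

    from decimal import Decimal

    # Precision: Decimal removes the binary round-off that floats suffer.
    print(0.1 + 0.2 == 0.3)                                   # False (float)
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True  (exact)

    # Accuracy: a measured value is no truer for being stored exactly.
    reading = Decimal("0.305")   # a ruler reading good to +/- 1 mm
    # the stored digits are exact; the length they report is not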



