[Python-ideas] Python Numbers as Human Concept Decimal System
Oscar Benjamin
oscar.j.benjamin at gmail.com
Mon Mar 10 16:54:16 CET 2014
On 10 March 2014 15:21, Steven D'Aprano <steve at pearwood.info> wrote:
> On Mon, Mar 10, 2014 at 02:53:42PM +0100, Stefan Krah wrote:
>
>> That is why I think we should seriously consider moving to IEEE semantics
>> for a decimal literal. Among other things:
>
> Okay, I'm confused. I thought the decimal module had IEEE-754 semantics,
> and so I assumed that so would any decimal literal. What have I missed?
IEEE 754 (2008) defines the fixed-width decimal formats decimal64 and
decimal128. It also refers to "extended and extendable" precision,
with "extendable" meaning something like the current Decimal type.
I think that Stefan is proposing that decimal literals should be
parsed as if the result needed to be stored in a fixed-width decimal64
type. The standard defines all the "Details of conversion between
floating-point data and external character sequences", which would
regulate how this could be done.
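As a rough sketch of what that would mean in terms of the current
decimal module (the context parameters here are my reading of the
decimal64 format from the standard, not something the module provides
under that name):

    from decimal import Context, ROUND_HALF_EVEN

    # Approximate IEEE 754-2008 decimal64 parameters: 16 significant
    # digits, adjusted exponent in [-383, 384].
    D64 = Context(prec=16, Emax=384, Emin=-383, rounding=ROUND_HALF_EVEN)

    # Context.create_decimal() rounds on input, unlike the Decimal()
    # constructor, so this is roughly what parsing a literal "as
    # decimal64" would do:
    x = D64.create_decimal('1.23456789012345678901')
    print(x)  # 1.234567890123457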
Just parsing the number as decimal64 and using a decimal64-wide
context by default is not enough to prevent the (IMO confusing)
action-at-a-distance effect of Decimal contexts, though. Nor would it
solve the problem with negative literals if a context with less
precision or exponent range than decimal64 were in effect.
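To make the action-at-a-distance problem concrete, this is what the
decimal module does today: construction is exact, but unary minus is
an arithmetic operation and so rounds in whatever context some other
part of the program happens to have set:

    from decimal import Decimal, getcontext

    getcontext().prec = 3   # imagine a library set this somewhere else

    x = Decimal('1.23456789012345')   # construction is exact
    print(x)    # 1.23456789012345
    print(-x)   # -1.23  -- unary minus rounds in the current context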
If there were a separate type whose context always corresponded to
decimal64, then unary + and - on a literal would have the meaning
people expect, and it would be possible to determine exactly what the
code would do without needing to place caveats on the current decimal
context. I think this would be the best approach if we're aiming at
non-expert users.
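Roughly what I have in mind (the name Decimal64 and the details below
are just a hypothetical sketch, not a worked-out proposal):

    from decimal import Decimal, Context, ROUND_HALF_EVEN

    # Fixed decimal64-like context (16 digits, IEEE exponent range).
    _D64 = Context(prec=16, Emax=384, Emin=-383, rounding=ROUND_HALF_EVEN)

    class Decimal64(Decimal):
        """Decimal whose arithmetic ignores the thread-local context."""

        def __new__(cls, value='0'):
            # Round on construction, as a decimal64 literal would be
            # parsed.
            return super().__new__(cls, _D64.create_decimal(value))

        # Unary + and - always round to decimal64, whatever
        # getcontext() says.
        def __neg__(self):
            return Decimal64(_D64.minus(self))

        def __pos__(self):
            return Decimal64(_D64.plus(self))

        # A real type would need the full set of special methods;
        # two examples:
        def __add__(self, other):
            return Decimal64(_D64.add(self, other))

        def __mul__(self, other):
            return Decimal64(_D64.multiply(self, other))

    # With this, -Decimal64('1.23456789012345') is still
    # -1.23456789012345 even if the surrounding code has set
    # getcontext().prec = 3.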
Oscar