[Python-Dev] Decimal data type issues
t-meyer at ihug.co.nz
Tue Apr 13 18:50:19 EDT 2004
> c. ``to-number``: This operation converts a string to a
> number, as defined by its abstract representation.
> c. from_string
> Regarding method (c), the only difference from creating the
> decimal with Decimal(string) is that method (c) honors the
> context (if the literal contains more digits than the current
> precision, the number gets rounded, using the rounding method
> specified in the context, etc).
> For example, with a precision of 9 and with the name I proposed::
> >>> Decimal('112233445566')
> Decimal( (0, (1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6), 0) )
> >>> Decimal.from_string('112233445566')
> Decimal( (0, (1, 1, 2, 2, 3, 3, 4, 4, 6), 3L) )
As a relative newbie, I think it would be better if (c) had a name that
somehow indicated the difference; however, I can't think of one offhand that
does :) It seems to me that it would be easy to assume that the two cases
above are the same (and given that they produce the same result in some
situations, the difference might be missed by weak testing). Certainly a
quick look at the documentation would correct that, but if the name could
also reflect the difference, that would be better.
Even if from_string took a parameter "honour_context", defaulting to True,
that would, IMO, make it clearer that there is a difference between the two,
and what that difference is.
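[Editor's note: a sketch of the distinction using the decimal module as it
eventually shipped in the standard library, where Context.create_decimal
plays the context-honoring role that from_string has in the discussion
above. Names and output assume modern Python; they are not from the
original thread.]

```python
from decimal import Decimal, getcontext

# Set a working precision of 9 significant digits, as in the example above.
ctx = getcontext()
ctx.prec = 9

# The Decimal constructor ignores the context: all 12 digits are kept.
exact = Decimal('112233445566')
print(exact)                       # 112233445566

# Context.create_decimal honors the context: the coefficient is rounded
# to 9 digits (112233446) with exponent 3, i.e. 1.12233446E+11.
rounded = ctx.create_decimal('112233445566')
print(rounded)                     # 1.12233446E+11
```

With a short enough literal the two calls agree, which is exactly the
"same result in some situations" trap described above: weak tests that only
use literals within the current precision would never see the difference.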