
Raymond Hettinger wrote:
IMO, user input (or the full numeric strings in a text data file) is sacred and presumably provided for a reason -- the explicitly requested digits should not be thrown away without good reason.
I still don't understand what's so special about the input phase that it should be treated sacredly, while happily desecrating the result of any *other* operation.
The 'difference' here is that with unlimited-precision decimal representations there is no "input phase". A decimal number can represent the value, sign, and exponent in the character string the user provided _exactly_; indeed, it could be implemented with strings as the internal representation, in which case the 'construction' of a new number is simply a string copy. No operation takes place because no narrowing is necessary. This is quite unlike (for example) converting the ASCII string "1.01" to a binary floating-point double, which has a fixed precision and no base-5 component (ten is 2 x 5, so a fraction such as 1/100 cannot be represented exactly in binary).

mfc
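[To make the point concrete, a minimal sketch using Python's decimal module as it exists today; the Decimal constructor and as_tuple() are the standard library's, and the exact digits printed for the double will be whatever the nearest binary value happens to be.]

    from decimal import Decimal

    # Constructing a Decimal from the string keeps the sign, digits, and
    # exponent exactly -- nothing is rounded or narrowed.
    d = Decimal("1.01")
    print(d.as_tuple())   # DecimalTuple(sign=0, digits=(1, 0, 1), exponent=-2)

    # A binary double has no base-5 component, so 1.01 (= 101/100, and
    # 100 = 2*2*5*5) cannot be held exactly; converting the double back
    # to Decimal exposes the exact binary value actually stored.
    f = float("1.01")
    print(Decimal(f))     # a long value close to, but not equal to, 1.01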