On 10 March 2014 15:31, Steven D'Aprano wrote:
On Mon, Mar 10, 2014 at 02:05:22PM +0000, Oscar Benjamin wrote:
This is what I've been thinking about. Most non-expert users will be very happy with Decimal64 and a single fixed context. There could be a separate type called decimal64 in builtins that always used the same standard context. Literals could create decimal64 instances.
Where this would get complicated is for people who also use the Decimal type. They'd need to keep track of which objects were of which type, and so decimal literals might seem more annoying than useful.
Hmmm. I don't think this should necessarily be complicated, or at least no more complicated than dealing with any other mixed numeric types. If you don't want to mix them, don't mix them :-)
Exactly. It just means that you wouldn't want to use the literals in code that actually uses the decimal module. Consider:

    if some_condition:
        x = 1d
    else:
        x = Decimal(some_string)
    # ...
    y = x / 3

So now x / 3 rounds differently depending on whether x is a decimal64 or a Decimal. I probably don't want that. The solution: coerce to Decimal. But then why did I bother with the Decimal literal anyway? Decimal(1d) is hardly better than Decimal('1').

Oscar
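[Editor's note: the rounding difference Oscar describes can be sketched with today's decimal module. There is no 1d literal or decimal64 builtin; here a fixed 16-digit Context stands in for decimal64's hypothetical fixed context, while ordinary Decimal arithmetic follows the user's current thread-local context.]

```python
from decimal import Decimal, Context, getcontext

# Stand-in for decimal64's fixed context: IEEE 754 decimal64 carries
# 16 significant digits.
fixed = Context(prec=16)

# The user's current (thread-local) context, set to something else.
getcontext().prec = 6

x = Decimal(1)

# Ordinary Decimal division uses the current context:
print(x / 3)                         # -> 0.333333

# A fixed-context type would always round the same way:
print(fixed.divide(x, Decimal(3)))   # -> 0.3333333333333333
```

The same expression, x / 3, yields different results depending on which context governs the operand's type, which is exactly the mixed-type surprise being discussed.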