Re: [Python-Dev] Adventures with Decimal

Tim Peters writes:
Other important implementations of the standard didn't make this mistake; for example, Java's BigDecimal(java.lang.String) constructor follows the rules here: [...] Hmm -- or maybe it doesn't! The text says: [...]
Here is the actual behavior:

Jython 2.1 on java1.5.0_03 (JIT: null)
Type "copyright", "credits" or "license" for more information.
>>> import java.math.BigDecimal as BigDecimal
>>> import java.math.MathContext as MathContext
>>> BigDecimal("0.33333333333333333333333333333333333")
0.33333333333333333333333333333333333
>>> BigDecimal("0.33333333333333333333333333333333333", MathContext.DECIMAL32)
0.3333333
In other words, Java's behavior is much closer to the current behavior of Python, at least in terms of features that are user-visible. The default behavior in Java is to have infinite precision unless a context is supplied that says otherwise. So the constructor that takes a string converts it faithfully, while the constructor that takes a context obeys the context.

One could argue that they are "following the rules" appropriately and it just happens that their default context has infinite precision. But from the point of view of the typical, non-floating-point-aware user, Java's constructor gives "all the digits you told it to", and so does Python's current string constructor. Using "+decimal.Decimal(s)" today "rounds off the constant" (in the naive user's viewpoint).

Of course, the user is going to be surprised in the NEXT step, since the Python *operations* respect context while the Java ones use infinite precision for +, -, and * (and require you to specify the behavior for /).

(PS: No, I don't think we should design decimal.Decimal to match the behavior of Java... but I don't think that the Java example really helps make your point.)

Elsewhere, Tim writes:
Sorry, I can't make more time for this now.
I understand!
The short course is that a module purporting to implement an external standard should not deviate from that standard without very good reasons
Yes, but should we think of the constructor-from-string as an implementation-specific means of creating Decimal objects, which is separate from the string-to-Decimal converter that the standard requires (and which is provided by the Context objects)?

-- Michael Chermside
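[Editor's note: to make the distinction above concrete, here is a minimal sketch using today's decimal module; the precision of 6 is only an illustrative choice, not something taken from the thread. The string constructor keeps every digit it is given, and rounding first happens when a context-aware operation such as unary plus is applied.]

>>> import decimal
>>> decimal.getcontext().prec = 6        # illustrative precision, not from the thread
>>> s = "0.33333333333333333333333333333333333"
>>> print(decimal.Decimal(s))            # the constructor is exact: every digit kept
0.33333333333333333333333333333333333
>>> print(+decimal.Decimal(s))           # unary plus applies the context's precision
0.333333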

On 5/20/05, Michael Chermside <mcherm@mcherm.com> wrote:
In other words, Java's behavior is much closer to the current behavior of Python, at least in terms of features that are user-visible. The default behavior in Java is to have infinite precision unless a context is supplied that says otherwise. So the constructor that takes a string converts it faithfully, while the constructor that takes a context obeys the context.
Are we hitting that point where the most important players (Python and Java, ;) implement the standard almost fully compliantly, and then the standard revises *that* behaviour?

For the record, I'm -0 on changing the actual behaviour: I'd really like to implement the Spec exactly, but I think the practical reasons we have for not doing so are more important.

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

It looks like if you pass in a context, the Decimal constructor still ignores that context:
>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.Decimal("1.2345678901234567890123456789012345678901234567890000", d.getcontext())
Decimal("1.2345678901234567890123456789012345678901234567890000")
I think this is contrary to what some here have claimed (that you could pass an explicit context to cause it to round according to the context's precision).

--
--Guido van Rossum (home page: http://www.python.org/~guido/)

[Guido]
It looks like if you pass in a context, the Decimal constructor still ignores that context:
>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.Decimal("1.2345678901234567890123456789012345678901234567890000", d.getcontext())
Decimal("1.2345678901234567890123456789012345678901234567890000")
I think this is contrary to what some here have claimed (that you could pass an explicit context to cause it to round according to the context's precision).
I think Michael Chermside said that's how a particular Java implementation works. Python's Decimal constructor accepts a context argument, but the only use made of it is to possibly signal a ConversionSyntax condition.
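[Editor's note: a short sketch of the behavior Tim describes, using the decimal module's current API; the strings and trap settings are only illustrative choices. The context passed to the constructor affects only how a malformed string is signaled; it never applies the context's precision.]

>>> import decimal
>>> quiet = decimal.Context(traps=[])              # no signals trapped
>>> print(decimal.Decimal("not a number", quiet))  # malformed string -> quiet NaN
NaN
>>> strict = decimal.Context(traps=[decimal.InvalidOperation])
>>> decimal.Decimal("not a number", strict)        # same string now raises
Traceback (most recent call last):
  ...
decimal.InvalidOperation: [<class 'decimal.ConversionSyntax'>]
>>> small = decimal.Context(prec=4)
>>> print(decimal.Decimal("1.23456789", small))    # the context's precision is NOT applied
1.23456789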

[Guido]
It looks like if you pass in a context, the Decimal constructor still ignores that context:
>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.Decimal("1.2345678901234567890123456789012345678901234567890000", d.getcontext())
Decimal("1.2345678901234567890123456789012345678901234567890000")
I think this is contrary to what some here have claimed (that you could pass an explicit context to cause it to round according to the context's precision).
[Tim]
I think Michael Chermside said that's how a particular Java implementation works.
Python's Decimal constructor accepts a context argument, but the only use made of it is to possibly signal a ConversionSyntax condition.
You know that, but Raymond seems confused. From one of his posts (point (k)):

"Throughout the implementation, the code calls the Decimal constructor to create intermediate values. Every one of those calls would need to be changed to specify a context."

But passing a context doesn't help for obtaining the desired precision.

PS I also asked Cowlishaw and he said he would ponder it over the weekend. Maybe Raymond can mail him too. ;-)

--
--Guido van Rossum (home page: http://www.python.org/~guido/)

On 5/20/05, Guido van Rossum <gvanrossum@gmail.com> wrote:
Python's Decimal constructor accepts a context argument, but the only use made of it is to possibly signal a ConversionSyntax condition.
You know that, but Raymond seems confused. From one of his posts (point (k)):
"Throughout the implementation, the code calls the Decimal constructor to create intermediate values. Every one of those calls would need to be changed to specify a context."
The point here, I think, is that intermediate Decimal objects are created, and the whole module assumes that the context does not affect those intermediate values. If you change this and start using the context at Decimal creation time, you'll have to be aware of that in a lot of parts of the code.

OTOH, you can change that and run the test cases, and see how bad it explodes (or not, ;).

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
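[Editor's note: a minimal sketch, again with today's decimal module and values chosen only for illustration, of the difference the module's internals rely on: the Decimal constructor is exact, while Context.create_decimal() applies the context, so routing the internal Decimal(...) calls through a context-aware construction would start rounding intermediate values.]

>>> import decimal
>>> decimal.getcontext().prec = 4
>>> print(decimal.Decimal("1.23456789"))                      # exact: what internal code relies on
1.23456789
>>> print(decimal.getcontext().create_decimal("1.23456789"))  # context-aware: rounds immediately
1.235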

[Guido]
You know that, but Raymond seems confused. From one of his posts (point (k)):
[Raymond]
"Throughout the implementation, the code calls the Decimal constructor to create intermediate values. Every one of those calls would need to be changed to specify a context."
[Facundo]
The point here, I think, is that intermediate Decimal objects are created, and the whole module assumes that the context does not affect those intermediate values. If you change this and start using the context at Decimal creation time, you'll have to be aware of that in a lot of parts of the code.
OTOH, you can change that and run the test cases, and see how bad it explodes (or not, ;).
Bingo! That is point (k) from the big missive.

Raymond

It looks like if you pass in a context, the Decimal constructor still ignores that context:
>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.Decimal("1.2345678901234567890123456789012345678901234567890000", d.getcontext())
Decimal("1.2345678901234567890123456789012345678901234567890000")
I think this is contrary to what some here have claimed (that you could pass an explicit context to cause it to round according to the context's precision).
That's not the way it is done. The context passed to the Decimal constructor is *only* used to determine what to do with a malformed string (whether to raise an exception or set a flag). To create a decimal with a context, use the Context.create_decimal() method:
>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.getcontext().create_decimal("1.2345678901234567890123456789012345678901234567890000")
Decimal("1.235")
Raymond
participants (5):
- Facundo Batista
- Guido van Rossum
- Michael Chermside
- Raymond Hettinger
- Tim Peters