I don't know if this is the right place for this somewhat metaphysical question... I have read the Python docs for each version, but I can't find the real answer. I've been stuck on it for two days :(
Here is the question:
In the Python 3.4 docs it is written:
"Historically, the Python prompt and built-in repr() function would choose
the one with 17 significant digits, 0.10000000000000001. Starting with
Python 3.1, Python (on most systems) is now able to choose the shortest of
these and simply display 0.1."
The question is: HOW does Python choose it?
Here are examples I put in the shell:
No problem here...
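Re-typed here for reference, the kind of session I mean (CPython on a typical IEEE 754 machine):

```python
# Default display uses the shortest repr for these literals...
print(0.1)        # 0.1
print(0.2)        # 0.2
# ...but the sum shows the full noise, with the trailing 4:
print(0.1 + 0.2)  # 0.30000000000000004
```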
The problem is NOT the 4 that appears at the end. I know it comes from the IEEE 754 representation, and that Python used to display the first 17 significant digits, as the docs say.
The problem is why it appears here and why not for 0.2...
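To make the IEEE 754 side concrete, the exact binary64 value stored for each literal can be inspected with the `decimal` module (a side check I did, not part of the docs):

```python
from decimal import Decimal

# Decimal(float) shows the exact binary64 value behind each literal.
print(Decimal(0.1))        # 0.1000000000000000055511...
print(Decimal(0.2))        # 0.2000000000000000111022...
print(Decimal(0.1 + 0.2))  # 0.3000000000000000444089...
```

So none of these values is exactly 0.1, 0.2, or 0.3; the display is always an approximation of what is stored.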
Here is why it is a problem for me:
The first 17 significant digits of 0.1 are 0.10000000000000001,
so OK for the display 0.1.
The first 17 significant digits of 0.1 + 0.2 are 0.30000000000000004,
so OK for that display too.
But the first 17 significant digits of 0.2 are 0.20000000000000001,
so why is just 0.2 displayed?
And the same for the first 17 digits of 0.3, which are 0.29999999999999999:
why does Python display only 0.3?
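The 17-digit values I am comparing against come from formatting with 17 significant digits (`'.17g'`); this is my own verification of the walkthrough above:

```python
# Compare the default (shortest) repr with a fixed 17-significant-digit view
# of the same doubles.
for value in (0.1, 0.2, 0.3, 0.1 + 0.2):
    print(repr(value), '->', format(value, '.17g'))
```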
So how does Python choose the correct display, please?
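If it helps clarify what I'm asking: my guess is that "shortest" means the shortest decimal string that converts back to exactly the same double (round-tripping). A naive sketch of that idea (my own guess, certainly not CPython's actual algorithm):

```python
# Naive sketch of "shortest round-tripping repr": try more and more
# significant digits until the decimal string parses back to the same double.
def shortest_repr(x):
    for digits in range(1, 18):
        candidate = format(x, '.{}g'.format(digits))
        if float(candidate) == x:  # does the string round-trip?
            return candidate
    return format(x, '.17g')

print(shortest_repr(0.2))        # 0.2
print(shortest_repr(0.3))        # 0.3
print(shortest_repr(0.1 + 0.2))  # 0.30000000000000004
```

Is this roughly what Python does, and if so, how does it do it efficiently?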
Thank you so much!