[Tutor] Decimals 'not equal to themselves' (e.g. 0.2 equals 0.200000001)
Kent Johnson
kent37 at tds.net
Sun Aug 3 17:35:49 CEST 2008
On Sun, Aug 3, 2008 at 10:04 AM, CNiall <cniall at icedcerulean.com> wrote:
> I want to make a simple script that calculates the n-th root of a given
> number (e.g. 4th root of 625--obviously five, but it's just an example :P),
> and because there is no nth-root function in Python I will do this with
> something like x**(1/n).
>
> However, with some, but not all, decimals, they do not seem to 'equal
> themselves'. This is probably a bad way of expressing what I mean, so I'll
> give an example:
>>>> 0.125
> 0.125
>>>> 0.2
> 0.20000000000000001
>>>> 0.33
> 0.33000000000000002
>
> As you can see, the last two decimals are very slightly inaccurate. However,
> it appears that when n in 1/n is a power of two, the decimal does not get
> 'thrown off'. How might I make Python recognise 0.2 as 0.2 and not
> 0.20000000000000001?
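[A minimal sketch of the nth-root idea described above. Note one trap the original `x**(1/n)` hides: in Python 2, `1/n` with integer arguments is integer division and evaluates to 0, so use a float exponent. The function name `nth_root` is mine, not from the post.]

```python
def nth_root(x, n):
    # Use a float exponent: in Python 2, 1/n would be integer
    # division and silently give x ** 0 == 1 for n > 1.
    return x ** (1.0 / n)

# Close to 5, but not guaranteed to be exactly 5.0 in binary
# floating point -- which is the poster's underlying question.
print(nth_root(625, 4))
```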
This is a limitation of floating point numbers. A discussion is here:
http://docs.python.org/tut/node16.html
Your root calculator can only find answers that are as accurate as the
representation allows.
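[An illustrative sketch of the usual workaround, not part of Kent's reply: since many decimals have no exact binary representation, compare floats within a tolerance (or round them) rather than with `==`.]

```python
a = 0.1 + 0.2

# Exact comparison fails: neither 0.1, 0.2 nor 0.3 is stored exactly.
print(a == 0.3)              # False

# Compare within a small tolerance instead.
print(abs(a - 0.3) < 1e-9)   # True

# Or round to fewer decimal places before comparing.
print(round(a, 10) == 0.3)   # True
```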
Kent