Raymond Hettinger firstname.lastname@example.org wrote:
> Hogwash. The only issues with decimal are ease-of-use and speed.
I suggest that you get hold of a good 1960s or 1970s book on computer arithmetic and read up on "wobbling precision". While it is not a big deal nowadays, it was regarded as one then, and it is important enough to cause significant numerical problems for the unwary - which means 99.99% of modern programmers :-(
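For the curious, here is a minimal Python sketch of the effect (my own illustration, not anything from the books above): with a fixed number of significant digits in base 10, the *relative* spacing between adjacent representable numbers wobbles by up to a factor of the base - 10 for decimal, against only 2 for binary.

```python
from decimal import Decimal, getcontext

getcontext().prec = 3  # three significant decimal digits

# Anywhere in [1, 10), neighbouring 3-digit decimals are 0.01 apart,
# but the relative gap depends on where you are in the decade:
ulp = Decimal("0.01")
rel_lo = ulp / Decimal("1.00")   # ~1%   relative spacing near 1.00
rel_hi = ulp / Decimal("9.99")   # ~0.1% relative spacing near 9.99

# The ratio is the "wobble" - close to 10 for decimal:
wobble = float(rel_lo) / float(rel_hi)
```

The same computation for a binary format gives a wobble of only 2, which is one reason binary was preferred for serious numerical work.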
And, as I am sure Aahz could point out, there are significant reliability issues associated with frequent base changes in contexts where any loss of precision is unacceptable. Yes, it can always be done, but only a few people are likely to do it correctly in all cases.
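A small sketch of the base-change hazard, using Python's decimal module (again my own illustration): converting a binary float to Decimal is exact and exposes the representation error, while converting a Decimal back to a binary float can silently lose information.

```python
from decimal import Decimal

# float -> Decimal is exact, so it reveals that the binary float 0.1
# is not the mathematical 0.1:
exact = Decimal(0.1)          # a long value slightly above 0.1

# Decimal -> float is a base-10 -> base-2 conversion; the round trip
# does not return the original value:
d = Decimal("0.1")
round_trip = Decimal(float(d))
lossy = (round_trip != d)     # the base change lost information
```

Getting such conversions right in all cases - choosing when they are exact, when they round, and in which direction - is exactly the kind of detail that trips people up.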
Regards,
Nick Maclaren,
University of Cambridge Computing Service,
New Museums Site, Pembroke Street, Cambridge CB2 3QH, England.
Email: email@example.com
Tel.: +44 1223 334761    Fax: +44 1223 334679