On Monday, March 3, 2014 2:44:12 PM UTC-6, David Mertz wrote:
> However, I haven't seen a case presented for why decimals are generically
> better as a default.
hi David, here is one, right out of the Python 3.3 docs:
"Decimal “is based on a floating-point model which was designed with people in mind, and necessarily has a paramount guiding principle – computers must provide an arithmetic that works in the same way as the arithmetic that people learn at school.” – excerpt from the decimal arithmetic specification. Business apps require precision (banking, sales, marketing, finance, & on and on). One big issue that is going to confront everyone sooner than later is cryptography. Fast bignum support, fast factoring, and fast transcendentals are going to become more important as firms and individuals move into their own on crypto; not too far fetched, really. We have got to come up with this on our own, because we can not trust others to get it right for us. NSA and GCHQ have made that clear. PGP was one thing, but we have got to invent something better. But, really the real reason is the first paragraph. marcus