# [Python-ideas] Python-ideas Digest, Vol 103, Issue 29

Stephen J. Turnbull stephen at xemacs.org
Thu Jun 4 10:33:16 CEST 2015

```
u8y7541 The Awesome Person writes:

> Yeah, either Decimal becomes default or 13/10 is a fraction. If
> Decimal becomes default, we could have Decimal(13 / 10) =
> Decimal(13) / Decimal(10). We would have "expected" results.

I gather you haven't read anybody's replies, because, no, you don't
get expected results with Decimal: Decimal can still violate all the
invariants that binary floats can.  Binary actually has better
approximation properties if you care about the *degree* of inexactness
rather than the *frequency* of inexactness.  Fractions can quickly
become very inefficient, because numerators and denominators grow with
each operation.  Of course you could try approximate fractions with
fixed-slash or floating-slash arithmetic to get bounds on the
complexity of "simple" arithmetic, but then you're back in a world
with approximations.

Footnotes:
[1] In some sense.  One important sense is "how often humans would
care or even notice", which is likely to be even less frequent than
how often inexactness is introduced, for Decimal.  But that varies by
human.

[2] It's in Knuth, Seminumerical Algorithms, IIRC.

```
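Both of Turnbull's points can be checked directly at a Python prompt. The sketch below is my own illustration, not from the original thread: it shows Decimal losing an invariant that binary floats happen to preserve in this case, and uses a Newton iteration for sqrt(2) (an assumed example) to show how quickly exact `Fraction` operands blow up.

```python
from decimal import Decimal
from fractions import Fraction

# Decimal is still inexact: 1/3 has no finite decimal expansion, so
# dividing and multiplying back does not return 1.
x = Decimal(1) / Decimal(3)
print(x * 3 == Decimal(1))          # False (0.9999999999999999999999999999)

# Binary floats happen to round this particular case back to 1.0 --
# a consequence of round-to-nearest-even, not a general guarantee.
print((1 / 3) * 3 == 1.0)           # True

# Fractions are exact, but the price is operand growth: each Newton
# step for sqrt(2) roughly doubles the number of digits involved.
y = Fraction(1)
for _ in range(10):
    y = (y + 2 / y) / 2             # exact rational arithmetic
print(y.denominator.bit_length())   # well over 1000 bits after 10 steps
```

The `Fraction` result is still mathematically exact, but arithmetic on thousand-bit integers is far slower than a hardware float operation, which is the inefficiency the post is pointing at.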