Stephen J. Turnbull wrote:
Richard Musil writes:
After some thinking, it seems that float is the only case where this "loss of precision" happens.
This is true of Unicode normalized forms as well as floats.
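To make the Unicode case concrete, here is a minimal sketch using the standard library's unicodedata module: once a string is normalized, the original code-point sequence is unrecoverable, even though the two forms compare equal after normalization.

```python
import unicodedata

# 'é' written two ways: precomposed, and base letter + combining accent
composed = "\u00e9"
decomposed = "e\u0301"

# NFC maps both spellings to the same precomposed form...
assert unicodedata.normalize("NFC", decomposed) == composed

# ...so after normalizing you can no longer tell which form arrived,
# just as a deserialised float no longer tells you the original digits.
normalized = unicodedata.normalize("NFC", decomposed)
assert normalized != decomposed
```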
But it's also true in Python, where you at least have the option of not normalising your deserialised JSON strings. What's missing is the ability to deserialise JSON floats to a non-lossy type.
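The float loss is easy to demonstrate with the standard library's json module; this is a minimal sketch, the digits of pi chosen only as an example of a JSON number carrying more significant digits than a binary64 float can hold:

```python
import json

# A JSON number with more digits than a double can represent.
text = "3.141592653589793238462643383279"

# json parses numbers as Python floats, rounding to the nearest double.
value = json.loads(text)

# Re-serialising shows the trailing digits are gone for good.
print(json.dumps(value))  # 3.141592653589793
```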
This seems like a reasonable thing to want, even if you're not intending to round-trip anything.