There is no "normalized" representation for JSON. If you look at the "standard" (json.org), it is pretty simple: a JSON value is defined solely by its textual representation, a string of characters.
How different parsers choose to represent it in binary form so they can process it is an implementation detail; the JSON format neither stipulates any particular (binary) representation nor requires that all parsers use the same one.
From the JSON point of view, 0.6441726684570313 is a perfectly valid float (or, better said, a _number_, since that is the term JSON uses), and 0.6441726684570312 is a perfectly valid _and different_ number, because it differs in the last digit.
The fact that both numbers map to the same value when represented in the IEEE-754 floating-point format is a property of that particular binary representation and has nothing to do with JSON itself. The underlying JSON parser may just as well choose a different representation and preserve arbitrary precision (for example decimal.Decimal).
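A quick sketch of this in Python: the standard `json` module parses numbers as IEEE-754 doubles by default, but its `parse_float` hook lets you substitute `decimal.Decimal`, which preserves the distinction between the two texts.

```python
import json
from decimal import Decimal

a = "0.6441726684570313"
b = "0.6441726684570312"

# With the default float parser, both texts round to the same
# IEEE-754 double, so the distinction is lost.
assert json.loads(a) == json.loads(b)

# Parsing the same texts as decimal.Decimal preserves the last digit,
# so the two numbers stay different.
assert json.loads(a, parse_float=Decimal) != json.loads(b, parse_float=Decimal)
```

The loss of precision happens in the parser's choice of binary representation, not in JSON itself.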
From the JSON point of view there is no ambiguity and no doubt. Even the number 0.64417266845703130 (note the trailing 0) is a different JSON text from 0.6441726684570313 (without the trailing 0). Yes, both represent the same value and will give the same results if used in calculations, but they are different JSON texts.
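The trailing-zero case can be illustrated the same way: `decimal.Decimal` keeps trailing zeros, so the two texts remain distinguishable even though they compare numerically equal.

```python
import json
from decimal import Decimal

x = json.loads("0.6441726684570313", parse_float=Decimal)
y = json.loads("0.64417266845703130", parse_float=Decimal)

# Numerically the two are equal ...
assert x == y

# ... yet Decimal preserves the trailing zero from the original text,
# so the two JSON texts are still distinguishable after parsing.
assert str(x) != str(y)
```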