Martin v. Löwis wrote:
I think it's a bug that this works. The definition of the float builtin says
I think that's a documentation bug rather than a coding bug. If Python wishes to limit the digits allowed in numeric *literals* to ASCII 0...9, that's one thing, but I think that the digits allowed in numeric *strings* should allow the full range of digits supported by the Unicode standard.
The former ensures that literals in code are always readable; the latter allows users to enter numbers in their own number system. How could that be a bad thing?
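To make the distinction concrete, here is a sketch of the behavior I'm describing, assuming current CPython 3 semantics: `int()` and `float()` accept any character with Unicode category Nd (decimal digit), while the tokenizer rejects the same digits in a literal.

```python
# ARABIC-INDIC digits U+0661 U+0662 U+0663, i.e. the string "123"
# written in Eastern Arabic numerals.
s = "\u0661\u0662\u0663"

# Numeric *strings*: any Unicode decimal digit (category Nd) is accepted.
print(int(s))    # -> 123
print(float(s))  # -> 123.0

# Numeric *literals*: only ASCII 0-9 are valid, so the same digits
# in source code are a SyntaxError.
try:
    compile(s, "<example>", "eval")
except SyntaxError:
    print("literal rejected")
```

So the asymmetry already exists in practice; the question is only whether the documentation for `float()` should be updated to promise it.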