[issue6632] Include more fullwidth chars in the decimal codec

Martin v. Löwis report at bugs.python.org
Tue Sep 22 13:35:44 CEST 2009


Martin v. Löwis <martin at v.loewis.de> added the comment:

> int()/float() use the decimal codec for numbers - this only supports
> base-10 numbers. For hex numbers, we'd need a new hex codec (only
> the encoder part, actually); otherwise, int('a') would start to return
> 10.

That's not true. PyUnicode_EncodeDecimal could happily accept hex digits,
and int(u'a') would still be rejected. In fact, PyUnicode_EncodeDecimal
*already* accepts arbitrary Latin-1 characters, whether they represent
digits or not. I suppose this is to support non-decimal bases, so it
would only be consistent to widen this to all characters that have the
Hex_Digit property (although I'm unsure which ones are currently
excluded).
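
To illustrate the point, here is a minimal sketch in Python (not the
actual PyUnicode_EncodeDecimal, and encode_decimal_like is a hypothetical
helper): the transcoding step can map fullwidth digits to ASCII and pass
hex letters through unchanged, while int() itself still rejects
non-decimal digits in base 10.

    import unicodedata

    def encode_decimal_like(s):
        # Map any character with a decimal digit value (e.g. fullwidth
        # U+FF11) to its ASCII digit; pass everything else through.
        out = []
        for ch in s:
            d = unicodedata.digit(ch, None)
            out.append(str(d) if d is not None else ch)
        return "".join(out)

    print(int(encode_decimal_like("１２３")))      # 123 -- fullwidth digits accepted
    print(int(encode_decimal_like("ff"), 16))      # 255 -- hex letters pass through
    try:
        int(encode_decimal_like("a"))              # still rejected in base 10
    except ValueError as e:
        print(e)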

----------

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue6632>
_______________________________________


More information about the Python-bugs-list mailing list