[Python-ideas] Improve readability of long numeric literals
Oscar Benjamin
oscar.j.benjamin at gmail.com
Tue Feb 9 20:35:29 EST 2016
On 10 February 2016 at 01:07, Ethan Furman <ethan at stoneleaf.us> wrote:
>
> and in talking about int() accepting underscored inputs:
>
>> It seems entirely harmless here. Also for float().
I don't agree with either of those. The syntax accepted by int() is
less permissive than that for int literals (e.g. int('0x1') raises),
which is good because int() is often used to process data from
external sources. In this vein I'm not sure how I feel about int()
accepting non-ASCII characters - perhaps there should be a separate
int.from_ascii function for that purpose, but that's a different
subject.
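To make both points concrete, here's a minimal sketch. The first part
demonstrates current behaviour (int() rejecting literal prefixes but
accepting non-ASCII digits); the int_from_ascii helper at the end is
purely hypothetical - it doesn't exist in CPython, it's just one way
the suggested function might look:

```python
# int() is stricter than literal syntax: 0x1 is a valid literal,
# but int('0x1') with the default base 10 raises ValueError.
assert 0x1 == 1
try:
    int('0x1')
except ValueError:
    pass
else:
    raise AssertionError("int('0x1') unexpectedly accepted")

# An explicit base (or base=0) is needed to accept the prefix:
assert int('0x1', 16) == 1
assert int('0x1', 0) == 1

# int() does accept non-ASCII decimal digits, e.g. Arabic-Indic:
assert int('\u0661\u0662\u0663') == 123

# Hypothetical int.from_ascii-style helper (sketch only, not a real
# CPython API): parse an integer but reject non-ASCII input.
def int_from_ascii(s, base=10):
    if any(ord(c) > 127 for c in s):
        raise ValueError('non-ASCII character in %r' % s)
    return int(s, base)

assert int_from_ascii('123') == 123
try:
    int_from_ascii('\u0661\u0662\u0663')
except ValueError:
    pass
else:
    raise AssertionError('non-ASCII digits unexpectedly accepted')
```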
Having float() accept underscored inputs violates IEEE 754. That
doesn't mean it's impossible, but why bother?
--
Oscar