[Python-Dev] Decimal(unicode)

Nick Coghlan ncoghlan at gmail.com
Wed Mar 26 08:06:34 CET 2008


Martin v. Löwis wrote:
>> For binary representations, we already have the struct module to handle 
>> the parsing, but for byte sequences with embedded ASCII digits it's 
>> reasonably common practice to use strings along with the respective type 
>> constructors.
> 
> Sure, but why can't you write
> 
>  foo = int(bar[start:stop].decode("ascii"))
> 
> then? Explicit is better than implicit.

Yeah, this thread has convinced me that it would be better to start 
rejecting bytes in int() and float() as well, rather than implicitly 
assuming an ASCII encoding.
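
For concreteness, here's a minimal sketch of the two spellings side by 
side (the buffer and offsets are made up for illustration; variable 
names borrowed from Martin's example above):

    bar = b"count: 12345\r\n"
    start, stop = 7, 12

    # Implicit: int() quietly assumes the bytes are ASCII digits.
    foo = int(bar[start:stop])

    # Explicit: decode first, then convert (Martin's spelling).
    foo = int(bar[start:stop].decode("ascii"))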

If we decide the fast path for ASCII is still important (e.g. to solve 
3.0's current speed problems in decimal), then it would be better to add 
separate methods on int that expose the old 2.x str->int and int->str 
optimisations (e.g. an int.from_ascii class method and an int.to_ascii 
instance method).
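
Those methods don't exist yet, so purely as an illustration of the 
idea, here's roughly what the pair might look like as pure-Python 
stand-ins (the real versions would be C-level fast paths on int itself; 
the names and bodies here are just a sketch of this proposal):

    def int_from_ascii(data):
        # Hypothetical stand-in for the proposed int.from_ascii():
        # parse an integer directly from ASCII bytes, rejecting
        # anything that isn't bytes up front.
        if not isinstance(data, (bytes, bytearray)):
            raise TypeError("expected bytes, not %r" % type(data).__name__)
        return int(data.decode("ascii"))

    def int_to_ascii(n):
        # Hypothetical stand-in for the proposed int.to_ascii():
        # render an integer as ASCII bytes.
        return str(n).encode("ascii")

so that int_from_ascii(b"12345") == 12345 and 
int_to_ascii(12345) == b"12345", with the bytes<->text boundary kept 
explicit in the method names rather than hidden in the constructor.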

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://www.boredomandlaziness.org

