newbie: precision question

MRAB google at mrabarnett.plus.com
Sat Mar 21 06:57:12 EDT 2009


Lada Kugis wrote:
[snip]
> Normal integers are up to 10 digits, after which they become long
> integers, right?
> 
> But if integers can be exactly represented, then why do they need two
> types of integers (long and ... uhmm, let's say, normal). I mean,
> their error will always be zero, no matter what kind they're of.
> 
'int' is limited to the machine word size (say, 32 bits), but is
faster. 'long' is slower, but virtually unlimited in size. The decision
was made that, with the speed of modern CPUs, we could simplify things
in Python 3.x by dropping 'int' and using only 'long', renamed to
'int'.
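
For example, here's roughly what a Python 2.x session looks like (a
quick sketch, assuming a 32-bit build, where sys.maxint is 2**31 - 1):

>>> import sys
>>> sys.maxint                  # largest value a plain 'int' can hold
2147483647
>>> type(sys.maxint)
<type 'int'>
>>> type(sys.maxint + 1)        # one past the limit is promoted to 'long'
<type 'long'>
>>> (sys.maxint + 1) - 1 == sys.maxint   # the value is still exact
True

In 3.x the same session reports <class 'int'> for both, and the value
just keeps growing; the int/long distinction is gone.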


