newbie: precision question

MRAB google at
Sat Mar 21 11:57:12 CET 2009

Lada Kugis wrote:
> Normal integers are up to 10 digits, after which they become long
> integers, right ?
> But if integers can be exactly represented, then why do they need two
> types of integers (long and ... uhmm, let's say, normal). I mean,
> their error will always be zero, no matter what kind they're of.
In Python 2, 'int' is limited to the machine word size (say, 32 bits), but
is faster; 'long' is slower, but has virtually unlimited range. Both are
exact. The decision was made that with the speed of modern CPUs we could
simplify things by forgetting about 'int' and using just 'long', though
renamed to 'int', in Python 3.x.
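A quick sketch of what that unification means in practice (Python 3):
there is a single arbitrary-precision 'int' type, so arithmetic stays
exact no matter how large the value gets.

```python
# Python 3: one 'int' type, arbitrary precision, always exact.
n = 2 ** 100
print(n)           # well past any 32- or 64-bit limit
print(type(n))     # <class 'int'> - no separate 'long' type

# Exactness holds even at this size: no rounding error creeps in.
assert (2 ** 100) - (2 ** 100 - 1) == 1
```

Under Python 2 the same expression would have been promoted from 'int'
to 'long' automatically; Python 3 simply drops the distinction.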

More information about the Python-list mailing list