Is 100,000 entries big for a dictionary?

Tim Peters at
Sun Dec 31 23:02:12 CET 2000

> The implementation limit is currently one billion objects.

> It occurs to me, in light of the earlier thread on "is", that the
> limit is likely to hit even faster than that, depending on how id() is
> implemented.

Yes, it will be hit much faster.  id() is a 32-bit int on most machines, and
is simply the address of the object.  Since all Python objects contain at
least a refcount and a type pointer, on most machines a PyObject occupies a
bare minimum of 8 bytes, so divide your virtual address space by at least 8.
If you use nothing but ints, at least 12 <wink>.  Python also ensures that a
hash table is never more than 2/3 occupied.  So 60 million is a more
realistic-- but still wildly optimistic --upper bound (e.g., on Win32 the
user VM space spans 31 bits, not 32, so lose another factor of 2 right
off the top).
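The overhead is easy to see for yourself.  A sketch on a modern CPython
(the sizes below are from a 64-bit build, so they're larger than the
8-byte floor cited above for the 32-bit machines of the era):

```python
import sys

# In CPython, id() returns the object's memory address, and every
# object carries at least a reference count and a type pointer.
# On a 64-bit build that header alone is 16 bytes.
print(sys.getsizeof(object()))  # a bare-minimum object
print(sys.getsizeof(1))         # ints carry payload on top of the header

# Dicts keep themselves no more than about 2/3 full; watch the
# allocated size jump as entries are added and the table resizes.
d, last = {}, sys.getsizeof({})
for i in range(50):
    d[i] = None
    if sys.getsizeof(d) != last:
        last = sys.getsizeof(d)
        print(i + 1, last)   # entry count at which a resize happened
```

The exact numbers vary by CPython version and platform, but the shape
of the result -- fixed per-object header, stepwise dict growth -- is
what the bound above is built from.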

> ...
> Incidentally, I think you were insufficiently pedantic in that thread
> when you suggested "id(a) == id(b)".  It really should be
> import __builtin__
> if __builtin__.id(a) == __builtin__.id(b):
>     ....

Actually, since __builtin__ is writable, you should really arrange to
capture the then-current binding of __builtin__.id at startup time.  Watch out for
import hooks too:  "import" may not do what you expect <wink>.
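A sketch of what capturing the then-current binding looks like.  The
module is spelled __builtin__ in the Python 2 of this post and builtins
today; same_object is just an illustrative name, not anything from the
post:

```python
import builtins  # spelled __builtin__ in the Python 2 of this post

# Grab the binding at import time, before any later code gets a
# chance to shadow or rebind the name "id".
_id = builtins.id

def same_object(a, b):
    """Pedantic identity check via the captured id() binding."""
    return _id(a) == _id(b)

id = lambda obj: 42          # someone rudely shadows the builtin...
x = []
print(same_object(x, x))     # ...but the captured binding still works: True
```

Two distinct live objects always have distinct ids, so
same_object(x, []) is False -- though note that id() values can be
reused once an object is garbage-collected.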

the-only-python-program-i'm-sure-about-is-"x=1"-ly y'rs  - tim
