memory efficient set/dictionary

Steven D'Aprano steve at
Sun Jun 10 17:29:05 CEST 2007

On Sun, 10 Jun 2007 07:27:56 -0700, koara wrote:

> What is the best way to go about using a large set (or dictionary)
> that doesn't fit into main memory? What is Python's (2.5, let's say)
> overhead for storing an int in a set, and how much for storing an
> int -> int mapping in a dict?

How do you know it won't fit in main memory if you don't know the
overhead? A guess? You've tried it and your computer crashed?
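One way to estimate that overhead empirically is a rough sketch like the
following. It uses sys.getsizeof, which was only added in Python 2.6, so
it won't run on 2.5 as-is, and the exact figures vary by Python version
and platform:

```python
import sys

N = 1_000_000

ints = set(range(N))
mapping = {i: i for i in range(N)}

# sys.getsizeof reports the container's own allocation only;
# it excludes the int objects the container references, so the
# true per-entry cost is higher than these figures.
print("set:  %.1f bytes/entry" % (sys.getsizeof(ints) / N))
print("dict: %.1f bytes/entry" % (sys.getsizeof(mapping) / N))
```

Either way, expect tens of bytes per entry rather than the 4 or 8 bytes
a C array of ints would need, because the hash table keeps spare slots
and stores pointers to boxed int objects.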

> Please recommend a module that allows persistent set/dict storage +
> fast query that best fits my problem, 

Usually I love guessing what people's problems are before making a
recommendation, but I'm feeling whimsical so I think I'll ask first.

What is the problem you are trying to solve? How many keys do you have?
Can you group them in some way, e.g. alphabetically? Do you need to search
on random keys, or can you queue them and process them in an order of your
choosing?

> and as lightweight as possible.
> For queries, the hit ratio is about 10%. Fast updates would be nice,
> but I can rewrite the algorithm so that the data is static, so update
> speed is not critical.
> Or am I better off not using Python here? Cheers.
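For the persistent-storage part of the question, one standard-library
option is shelve, which wraps a dbm database in a dict-like interface.
A minimal sketch, assuming disk-backed lookups are acceptable for the
90% of queries that miss (shelve keys must be strings, so the int keys
are converted with str(); the filename "intmap.db" is just an example,
and the with-statement syntax is modern Python rather than 2.5):

```python
import shelve

# Build the mapping once; per the post, update speed is not critical.
with shelve.open("intmap.db") as db:        # hypothetical filename
    for key, value in [(1, 10), (2, 20), (42, 7)]:
        db[str(key)] = value                # int keys stored as strings

# Query phase: only the entries actually accessed are read from disk,
# so the whole mapping never has to fit in memory at once.
with shelve.open("intmap.db", flag="r") as db:
    print(str(42) in db)        # membership test, like a set -> True
    print(db.get(str(42)))      # int -> int lookup -> 7
```

If only set membership is needed, storing a dummy value (or using the
underlying dbm module directly) avoids pickling a payload per key.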


More information about the Python-list mailing list