huge persistent dictionary

Dale Strickland-Clark dale at out-think.NOSPAMco.uk
Tue Nov 7 08:42:32 EST 2000


"gunter " <gb at a3design.de> wrote:

>Hello!
>
>My problem is that I want to store a huge amount of data in a
>dictionary and make it persistent.
>
>The data consists of a dictionary storing dictionaries of
>string/string pairs.
>
>My first solution was straightforward, using cPickle and dump/load.
>This works, but it is far too slow!
>
>I found another way:
>writing a large text file like this:
>
>f = open("data.py","w")
>f.write("data=" + str(data))
>f.close()
>
>Now I have a big file (about 2 MB).
>I compile it with compileall.compile_dir(".") and get a
>data.pyc (a little smaller than the pickled version!)
>
>And:
>after an "import data" I have my data back.
>And this goes very fast! (50 times faster than the cPickled
>version!)
>
>PROBLEM:
>Compiling this large file "data.py" takes about 5 minutes and
>uses about 50 MB of memory.
>
>QUESTION:
>Is there any other way to make such structures of primitive types
>persistent?
>The compiling trick is a little crude, I think!
>
>yours
>gunter
>
>

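Gunter's two approaches can be sketched roughly like this. The sample dict is a stand-in for the real data, and exec is used in place of a module import so the round trip runs in one process; this is an illustration, not code from the thread:

```python
import pickle  # cPickle in Python 2

# Hypothetical sample standing in for the real string/string data.
data = {"key%d" % i: {"a": "1", "b": "2"} for i in range(1000)}

# Approach 1: pickle dump/load.
with open("data.pkl", "wb") as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
with open("data.pkl", "rb") as f:
    restored = pickle.load(f)
assert restored == data

# Approach 2: write the dict out as Python source, then evaluate it
# (the thread compiles and imports it instead; exec keeps the sketch
# self-contained).
with open("data_src.py", "w") as f:
    f.write("data = " + repr(data))
ns = {}
with open("data_src.py") as f:
    exec(f.read(), ns)
assert ns["data"] == data
```

Note that a binary pickle protocol, as used above, is typically much faster than the default text protocol of that era.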
Use a database and write your own cached front-end with a dictionary.

If the key being accessed isn't in the dictionary, get it out of the
database and store it in the dictionary for next time.
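A minimal sketch of such a cached front end, using the standard shelve module as the on-disk database (the CachedDict class and the sample keys are illustrative assumptions, not from the thread):

```python
import shelve

class CachedDict:
    """Dictionary-style front end over a shelve database.

    Values are fetched from the database on first access and
    cached in an in-memory dict for subsequent lookups.
    """

    def __init__(self, path):
        self._db = shelve.open(path)
        self._cache = {}

    def __getitem__(self, key):
        if key not in self._cache:
            # Raises KeyError if the key is in neither place.
            self._cache[key] = self._db[key]
        return self._cache[key]

    def __setitem__(self, key, value):
        self._cache[key] = value
        self._db[key] = value

    def close(self):
        self._db.close()

# Populate, close, and read back from disk.
store = CachedDict("records")
store["user1"] = {"name": "gunter", "lang": "de"}
store.close()

store = CachedDict("records")
print(store["user1"]["name"])  # fetched from disk, then cached
store.close()
```

This keeps startup instant (nothing is loaded up front) while repeated lookups stay at in-memory dictionary speed.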




--
Dale Strickland-Clark
Out-Think Ltd
Business Technology Consultants




