huge persistent dictionary

gunter gb at a3design.de
Tue Nov 7 14:05:10 CET 2000


Hello!

My problem is that I want to store a huge amount of data in a
dictionary and make it persistent.

The data consists of a dictionary storing dictionaries of
string/string pairs.

My first solution was the straightforward one, using cPickle with
dump/load. This works, but it is far too slow!
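For reference, the pickle approach looks roughly like this (the file name and sample data are my own; cPickle was the fast C version in Python 2, and in Python 3 plain pickle is already C-accelerated):

```python
try:
    import cPickle as pickle   # Python 2: the fast C implementation
except ImportError:
    import pickle              # Python 3: pickle is already fast

# a dictionary of dictionaries of string/string pairs, as described above
data = {"row1": {"key": "value"}, "row2": {"foo": "bar"}}

# dump the structure to disk ...
f = open("data.pkl", "wb")
pickle.dump(data, f)
f.close()

# ... and load it back
f = open("data.pkl", "rb")
data2 = pickle.load(f)
f.close()
```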

I found another way:
writing a large text file like this:

f = open("data.py", "w")
f.write("data = " + repr(data))  # repr() emits a valid Python literal
f.close()

Now I have a big file (about 2 MB).
I compile it with compileall.compile_dir(".") and get a
data.pyc (a little smaller than the pickled version!)

and:
after an "import data" I have my data back.
And this goes very fast (50 times faster than the cPickled
version!)
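To make the trick concrete, here is a self-contained sketch of the whole roundtrip (the temporary directory and module name are my own choices; in modern Python the import step itself compiles the module and caches the bytecode, so an explicit compileall call is not even needed):

```python
import os
import sys
import tempfile

data = {"row1": {"key": "value"}, "row2": {"foo": "bar"}}

# write the structure out as a Python source module
tmpdir = tempfile.mkdtemp()
f = open(os.path.join(tmpdir, "data.py"), "w")
f.write("data = " + repr(data))  # repr() emits a valid Python literal
f.close()

# importing the module parses/compiles it (and caches the bytecode);
# afterwards data_module.data is the original structure again
sys.path.insert(0, tmpdir)
import data as data_module

restored = data_module.data
```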

PROBLEM:
Compiling this large file "data.py" takes about 5 minutes and
about 50 MB of memory.

QUESTION:
Is there any other way to make such structures of primitive types
persistent?
The compiling trick feels like a bit of a hack, I think!
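One candidate worth trying (my suggestion, not from the post itself): the marshal module writes the same binary format that .pyc files contain internally, so it should give a comparably fast load without the slow compile step. Caveat: marshal's format is documented as version-dependent, so it suits caches, not long-term archives; for those, something like shelve may be a better fit.

```python
import marshal

data = {"row1": {"key": "value"}, "row2": {"foo": "bar"}}

# marshal handles exactly this kind of structure: dicts, lists,
# strings, numbers -- i.e. structures of primitive types
f = open("data.marshal", "wb")
marshal.dump(data, f)
f.close()

# loading reads the binary format directly, no parsing or compiling
f = open("data.marshal", "rb")
restored = marshal.load(f)
f.close()
```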

yours
gunter





