huge persistent dictionary
gb at a3design.de
Tue Nov 7 14:05:10 CET 2000
My problem is that I want to store a huge amount of data in a
dictionary and make it persistent.
The data consists of a dictionary storing dictionaries of
My first solution was straightforward: cPickle with dump/load.
This works, but it is far too slow!
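For reference, the dump/load approach looks like this. A minimal sketch, assuming a toy nested dictionary; note that in modern Python the old cPickle module became the C-accelerated pickle, and choosing a binary protocol (rather than the default text protocol 0 of that era) already helps a lot:

```python
import pickle  # Python 2's cPickle is the C-accelerated pickle in Python 3

# toy stand-in for the real nested-dictionary data
data = {i: {"x": i, "y": str(i)} for i in range(1000)}

# dump/load round trip; a binary protocol is considerably faster
# than the text protocol 0 that cPickle defaulted to
with open("data.pkl", "wb") as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

with open("data.pkl", "rb") as f:
    restored = pickle.load(f)
```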
I found out another way:
Writing a large text file like this:
f = open("data.py","w")
f.write("data=" + str(data))
Now I have a big file (about 2 megs).
I compile it with compileall.compile_dir(".") and I get a
data.pyc (a little bit smaller than the pickled version!).
After an "import data" I have my data back.
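The whole trick can be sketched as follows. Assumptions: the data is a toy dictionary, repr() is used instead of str() to guarantee the written text is parseable Python, and the current directory is put on sys.path so the generated module can be imported; the explicit compileall step is optional, since the first import compiles data.py to cached bytecode anyway:

```python
import importlib
import os
import sys

data = {i: {"x": i} for i in range(100)}  # toy stand-in for the real data

# write the structure out as Python source; repr() guarantees the
# text parses back to an equal object
with open("data.py", "w") as f:
    f.write("data = " + repr(data) + "\n")

# make the current directory importable, then import the module;
# the first import compiles to bytecode, later imports load the
# cached .pyc directly, which is what makes this approach fast
sys.path.insert(0, os.getcwd())
mod = importlib.import_module("data")
```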
And this goes verrrrrrrry fast! (50 times faster than the cPickled version!)
But compiling this large file "data.py" takes about 5 minutes and
about 50 megs of memory.
Is there any other way to make such structures of primitive types persistent?
The compiling trick is a little crude, I think!
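One candidate worth trying is the standard marshal module, which is the serializer the .pyc format itself uses under the hood, so it should deliver similar load speed without generating and compiling source code. A sketch, assuming a toy nested dictionary; marshal only handles Python's primitive types and containers, which matches this use case, but note its format is not guaranteed stable across Python versions:

```python
import marshal

data = {i: {"x": i, "y": [1, 2, 3]} for i in range(1000)}  # toy data

# marshal handles only primitive types (numbers, strings, lists,
# tuples, dicts, ...) but serializes and deserializes them very fast
with open("data.marshal", "wb") as f:
    marshal.dump(data, f)

with open("data.marshal", "rb") as f:
    restored = marshal.load(f)
```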