Dictionaries as records
Skip Montanaro
skip at pobox.com
Wed Dec 19 17:49:05 EST 2001
Bill> I have a file with 200K records and 16 fields. This file is
Bill> parsed and each row is put into a dictionary and the dictionary is
Bill> added to a list. The raw file is only about 50MB. I was shocked
Bill> to see that my memory use jumped to 500MB!
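(For the record, the pattern being described presumably looks something
like the sketch below; the field names, delimiter, and filename are made
up. The blowup is unsurprising: each row gets its own dictionary, and
200K dictionaries of 16 entries apiece carry far more overhead than the
raw text.)

    # A rough sketch of the memory-hungry pattern -- names hypothetical.
    FIELDS = ['field%d' % i for i in range(16)]

    records = []
    for line in open('data.txt'):
        row = dict(zip(FIELDS, line.rstrip('\n').split('\t')))
        records.append(row)
    # 200K per-row dicts (a hash table plus boxed keys and values each)
    # is how a 50MB file becomes ~500MB of live objects.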
Skip> You might want to consider storing it in an on-disk mapping
Skip> object. Check out the anydbm, bsddb, gdbm modules and the like.
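For example, here's a minimal sketch using the shelve module, which
layers pickling on top of anydbm so the row dicts can be stored
directly. The filename and the choice of key field are assumptions:

    import shelve

    FIELDS = ['field%d' % i for i in range(16)]   # hypothetical names

    # Build the on-disk mapping once; shelve pickles each dict for us.
    db = shelve.open('records.db')
    for line in open('data.txt'):
        row = dict(zip(FIELDS, line.rstrip('\n').split('\t')))
        db[row['field0']] = row   # key on whichever field is unique
    db.close()

    # Reopen later and fetch rows one at a time instead of holding
    # 200K dicts in memory.
    db = shelve.open('records.db')
    row = db.get('some_key')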
Yitz> That will solve the memory problem, but it may be slow due to disk
Yitz> access.
That obviously depends on his access patterns. If disk latency turns out
to be a problem, putting a simple LRU cache in front of the on-disk
mapping is a reasonable compromise: repeated lookups stay fast while
memory consumption remains bounded.
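Something along these lines would do (a sketch only -- the linear scan
of the key list is O(n), so it's sensible only for modest cache sizes):

    import shelve

    class LRUCache:
        # A small LRU cache in front of an on-disk mapping.
        def __init__(self, backing, maxsize=10000):
            self.backing = backing   # e.g. a shelve or dbm object
            self.maxsize = maxsize
            self.cache = {}
            self.order = []          # keys, least recently used first

        def __getitem__(self, key):
            if key in self.cache:
                self.order.remove(key)        # O(n) scan
            else:
                if len(self.cache) >= self.maxsize:
                    oldest = self.order.pop(0)
                    del self.cache[oldest]
                self.cache[key] = self.backing[key]
            self.order.append(key)
            return self.cache[key]

    cached = LRUCache(shelve.open('records.db'), maxsize=5000)
    row = cached['some_key']   # touches the disk only on a cache miss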
--
Skip Montanaro (skip at pobox.com - http://www.mojam.com/)