Dictionaries as records
bwilk_97 at yahoo.com
Wed Dec 19 00:58:40 CET 2001
I have been happily using a list of dictionaries to hold table data for some
time. For the first time, this method is proving less than efficient because
of the memory overhead the dictionaries produce. I have a file with 200K rows
of 16 fields each. The file is parsed, each row is put into a dictionary, and
the dictionary is added to a list. The raw file is only about 50 MB.
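For concreteness, the pattern described above looks roughly like this (the
sample data and field names are invented for illustration; the real file has
200K rows of 16 fields):

```python
import csv
import io

# A tiny stand-in for the real 50 MB file; field names are hypothetical.
raw = "id,name,price\n1,widget,9.99\n2,gadget,4.25\n"

rows = []
reader = csv.DictReader(io.StringIO(raw))
for record in reader:      # each row becomes its own dictionary
    rows.append(record)    # 200K of these adds up quickly
```

Every row gets its own dictionary, and each dictionary carries its own hash
table on top of the keys and values it holds.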
I was shocked to see that my memory use jumped to 500MB! When I delete the
list, the memory is returned to the system, so I know that the memory is
being consumed by the dictionaries.
What strikes me as odd is that I can create a list of 200K dictionaries with
test data (a copy of the same record over and over) and the amount of memory
used is only half.
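One plausible explanation for that halving, assuming the test list was built
by copying one record: copying a dictionary duplicates its structure but not
its values, so all 200K copies share the same 16 value objects, while parsing
the real file produces fresh string objects for every row. A small sketch:

```python
import sys

# A stand-in record with 16 fields, like one parsed row (names invented).
record = {"field%02d" % i: "value for field %d" % i for i in range(16)}

# Copies of the same record share their value strings: each copy gets a
# new dict structure, but the 16 value objects are the same ones.
copies = [dict(record) for _ in range(1000)]
shared = copies[0]["field00"] is copies[1]["field00"]

# The dict structure itself still costs this much per row, values excluded.
per_dict = sys.getsizeof(copies[0])
```

With real data, each row pays for its own strings on top of the per-dict
overhead, which would account for the difference.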
Having read many of the articles on this newsgroup about how dictionaries
are sized, I am aware of some of the memory issues involved in using as
many dictionaries as I am.
Can someone who has faced this issue and found a workaround please fill me
in? I know one can use a list of lists or a list of tuples, but I would
rather stick with dictionaries because of some library issues.
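One middle ground, sketched here as an assumption about what the library code
needs: store each row as a tuple but wrap it in a small class that supports
dict-style `row["field"]` lookup, so the calling code need not change. The
field names below are invented for illustration.

```python
class Row(object):
    """Tuple-backed record with dict-style access.  One tuple per row
    replaces one dict per row, sharing a single field index per table."""
    __slots__ = ("_values",)                       # no per-instance dict
    FIELDS = ("id", "name", "price")               # hypothetical field list
    INDEX = {name: i for i, name in enumerate(FIELDS)}

    def __init__(self, values):
        self._values = tuple(values)

    def __getitem__(self, key):
        return self._values[Row.INDEX[key]]

row = Row(("1", "widget", "9.99"))
```

The field-name-to-position mapping exists once per table instead of once per
row, which is where the dictionary-per-record overhead goes.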
Thanks in advance,