memory use with regard to large pickle files
castironpi at gmail.com
Sun Oct 19 05:00:17 CEST 2008
Catherine Moroney wrote:
> I'm writing a python program that reads in a very large
> "pickled" file (consisting of one large dictionary and one
> small one), and parses the results out to several binary and HDF files.
> The program works fine, but the memory load is huge. The size of
> the pickle file on disk is about 900 Meg so I would theoretically
> expect my program to consume about twice that (the dictionary
> contained in the pickle file plus its repackaging into other formats),
> but instead my program needs almost 5 Gig of memory to run.
> Am I being unrealistic in my memory expectations?
> I'm running Python 2.5 on a Linux box (Fedora release 7).
> Is there a way to see how much memory is being consumed
> by a single data structure or variable? How can I go about
> debugging this problem?
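On the question of sizing a single structure: sys.getsizeof (added in Python 2.6, so it would mean moving off 2.5) reports only the container's own overhead, not the objects it holds. A rough recursive sketch (the helper name and sample dict are just illustrative) looks like:

```python
import sys

def total_size(obj, seen=None):
    """Recursively sum sys.getsizeof over an object and its contents,
    counting each object only once even if it is referenced twice."""
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

d = {"a": [1, 2, 3], "b": "x" * 1000}
# The recursive total is larger than the bare dict overhead,
# because it includes the keys and values as well.
print(total_size(d), sys.getsizeof(d))
```

It won't see objects hidden behind C extensions, but it's usually enough to spot which dictionary is eating the memory.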
There's always the 'shelve' module, which keeps the dictionary on disk and unpickles values only as you access them, instead of loading the whole thing at once.
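A minimal sketch of the idea, assuming the pickle can be loaded once (on a big-memory machine, or before the data grew) to convert it; the filenames here are made up for the example:

```python
import pickle
import shelve

# Build a small pickle standing in for the large file (hypothetical data).
data = dict(("key_%d" % i, list(range(10))) for i in range(100))
with open("big.pkl", "wb") as f:
    pickle.dump(data, f)

# One-time conversion: load the pickle, write each key into a shelf on disk.
with open("big.pkl", "rb") as f:
    loaded = pickle.load(f)
db = shelve.open("big_shelf")
for key, value in loaded.items():
    db[key] = value
db.close()

# Later runs open the shelf and fetch values one key at a time, so only
# the entries actually touched get unpickled into memory.
db = shelve.open("big_shelf")
print(db["key_0"])  # -> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
db.close()
```

The trade-off is that each lookup hits the disk, but the resident memory stays proportional to the keys you're working on rather than the whole 900 Meg dictionary.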