memory leak with large list??
hemoglobea at hotmail.com
Wed Jan 29 18:12:25 CET 2003
Thanks for all the good pointers. I did try using the array module,
initializing with the 'd' type, and based on my 'ps' measurements, it
was taking up just about twice as much memory as a list with the same
number of elements, which made me think that the array module wasn't
what I understood it to be. In retrospect, I realize that I was
comparing the size of a list of 12e6 pointers to 0.0 to an array of 8
byte floats, which I guess is like comparing apples to apple seeds.
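To make the apples-vs-apple-seeds point concrete, here's a small sketch of what the two containers actually store. (It uses sys.getsizeof and struct.calcsize, which are just my way of illustrating the layout; the pointer size of 4 bytes on a 32-bit build vs 8 on a 64-bit build is what explains the "twice as much" ps reading.)

```python
import struct
import sys
from array import array

n = 100_000

lst = [0.0] * n              # n references, all to the single float object 0.0
arr = array('d', [0.0] * n)  # n raw C doubles in one contiguous buffer

ptr_size = struct.calcsize('P')  # 4 on a 32-bit build, 8 on 64-bit
assert arr.itemsize == 8         # 'd' is always an 8-byte C double

# The list's payload is n pointers; the array's payload is n doubles.
# On a 32-bit Python, n doubles take twice the space of n pointers,
# which matches what ps was showing.
print("list payload ~", n * ptr_size, "bytes")
print("array payload ~", n * arr.itemsize, "bytes")
print("list container reported by getsizeof:", sys.getsizeof(lst))
```

So the array really is storing the floats compactly; the list was only cheap because all 12e6 slots pointed at the same 0.0 object.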
This brings up two questions for me:
1. Since several people suggested it, I've been looking at Numeric
python, which looks really great. The question is: are there any
disadvantages to using Numeric instead of the array module? It seems
that array is just a poor cousin of Numeric, so I assume that either
the array module has some advantage (speed- or memory-wise) over
Numeric, or it exists simply because Numeric is not part of the base
Python distribution. Is there any reason not to use Numeric?
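For what it's worth, the practical gap I've noticed so far is that array gives you compact storage but no arithmetic: every elementwise operation is a Python-level loop. A small sketch (the Numeric lines are commented out since it isn't part of the stdlib and may not be installed):

```python
from array import array

a = array('d', [1.0, 2.0, 3.0])

# With the stdlib array module, elementwise math needs an explicit
# Python-level loop or comprehension, paying interpreter overhead
# on every one of the 12e6 elements:
doubled = array('d', (x * 2.0 for x in a))

# With Numeric, the same thing is a single vectorized call that
# runs in C (sketch, assuming Numeric is importable):
#   import Numeric
#   a = Numeric.array([1.0, 2.0, 3.0], 'd')
#   doubled = a * 2.0

print(list(doubled))
```

So both store doubles compactly; Numeric additionally moves the per-element work out of the interpreter.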
2. Is there any nice way to profile memory in Python? Using ps seems
like a very crude tool, the gc module only seems to help with finding
leaks, and the profiler doesn't seem to deal with memory at all.
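One option worth noting, for anyone reading this in a later Python: the tracemalloc module (added in Python 3.4, so long after this thread) tracks allocations from inside the interpreter rather than eyeballing ps. A minimal sketch:

```python
import tracemalloc

tracemalloc.start()

# Allocate something sizable so there is something to measure.
data = [0.0] * 1_000_000

current, peak = tracemalloc.get_traced_memory()
print("currently allocated:", current, "bytes; peak:", peak, "bytes")

tracemalloc.stop()
```

It won't show memory held outside Python's allocator, but it's a big step up from watching a process's total footprint.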
I do understand that it's easiest to just buy more memory, but since
I'm eventually going to want several data structures with 12e6 floats
each (and possibly more), it seems to make sense to economize a bit
before I run out of addressable memory.