Garbage collection on strike?
Mark blobby Robinson
m.1.robinson at herts.ac.uk
Thu May 10 05:37:53 EDT 2001
Hey,
I wonder if anyone can throw some enlightenment my way, because I obviously
need some. I am currently working on a program tackling a huge
combinatorial problem, and the results held in memory are getting pretty
big. My solution was to dump the data to a file whenever the list I was
generating reached a length of a million, then explicitly delete the
data structure and start over, but the memory doesn't seem to get
released. In fact, sometimes even after the whole program has finished,
the memory still doesn't seem to have been cleaned up, and at times I've
had to do a hard reboot of the machine. I am running this on a 700 MHz
machine with 730MB of RAM under Red Hat 7.0. Any suggestions as to the
error of my ways?
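
The chunk-and-flush approach described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the generator, file path, and chunk size are all hypothetical stand-ins (a smaller chunk than the post's million is used here for brevity). One detail worth noting is that emptying the list in place, rather than deleting and rebinding the name, keeps a single list object alive and lets its storage be reused; an explicit gc.collect() can also reclaim reference cycles promptly, though CPython may still not return freed memory to the operating system, which is one reason tools like top can keep reporting high usage.

```python
import gc
import os
import tempfile

CHUNK = 100_000  # the post flushes at a million; smaller here for illustration


def generate(n):
    # Hypothetical stand-in for the real combinatorial result generator.
    for i in range(n):
        yield i


def run(n, path):
    results = []
    with open(path, "w") as out:
        for item in generate(n):
            results.append(item)
            if len(results) >= CHUNK:
                # Flush the chunk to disk, then empty the list in place
                # instead of rebinding the name, so the same list object
                # (and its storage) is reused for the next chunk.
                out.writelines(f"{r}\n" for r in results)
                results.clear()
                gc.collect()  # reclaim any reference cycles right away
        out.writelines(f"{r}\n" for r in results)  # flush the remainder
    return path


path = os.path.join(tempfile.gettempdir(), "results.txt")
run(250_000, path)
```

Whether this helps depends on where the memory is actually going; if the items themselves hold references elsewhere (caches, memoization tables), clearing the list alone won't free them.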
Blobby