Memory usage, in droves
Martin von Loewis
loewis at informatik.hu-berlin.de
Fri Jul 27 16:42:03 CEST 2001
"Danyel Fisher" <danyelf at ics.uci.edu> writes:
> So I want to figure out the memory leaks--what objects am I not letting go?
> What is the GC missing?
> I can certainly search for the inefficiencies and the errors by hand, but it
> seems to me that there ought to be a way to generate a table of the form
> <object name> <object size>
When you configure Python with --with-pydebug, the interpreter keeps
track of all objects and prints a list of the remaining ones at exit.
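A minimal sketch of such a build, assuming a Unix source checkout of CPython (the exact output depends on the debug options compiled in):

```shell
# Build a debug interpreter that keeps extra bookkeeping on objects
./configure --with-pydebug
make

# The resulting ./python prints reference-count totals as it runs;
# builds with object tracing enabled can also dump the objects that
# are still alive when the interpreter exits.
```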
My suspicion is that you are putting objects into global containers
(dictionaries, lists) and never removing them. So check all caches,
lookup tables, etc.: are they constantly growing, and do they perhaps
keep objects that you had already considered garbage?
If you suspect that certain types of objects keep increasing in
number, you can increment a counter in __init__ and decrement it in
__del__ to track how many instances are alive. Be careful not to add a
__del__ to an object that takes part in a reference cycle, though.
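A minimal sketch of such a counter. In CPython, reference counting runs __del__ as soon as the last reference to an object disappears, provided the object is not part of a cycle:

```python
class Tracked:
    live = 0  # how many instances currently exist

    def __init__(self):
        type(self).live += 1

    def __del__(self):
        type(self).live -= 1

a = Tracked()
b = Tracked()
print(Tracked.live)  # -> 2
del a                # refcount drops to zero, __del__ runs immediately
print(Tracked.live)  # -> 1
```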
If you find such objects and wonder why they are never released, you
can pass them to sys.getrefcount() to see how many "unexplained"
references to them exist.
If you need to find out where these references come from, you can try
my gc.getreferents patch, which finds all containers containing a
given object.
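Later versions of Python ship a comparable facility in the standard library as gc.get_referrers(); a sketch using it (the class and variable names are invented for illustration):

```python
import gc

class Leaky:
    pass

suspect = Leaky()
forgotten = [suspect]  # a container we "forgot" still holds a reference

# get_referrers() returns every gc-tracked object that refers to
# `suspect`, including this module's globals dict, which binds the
# name `suspect` itself.
referrers = gc.get_referrers(suspect)
print(any(r is forgotten for r in referrers))  # -> True
```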
Hope this helps,