python memory analysis
aleaxit at yahoo.com
Mon Sep 6 23:26:06 CEST 2004
Alexander Hoffmann <alexander.hoffmann at netgenius.de> wrote:
> I wrote a daemon-like program in Python and after a while I realized that it
> consumes more and more memory over time. So my question is: how do I debug
> memory leaks in Python?
> Of course I found the built-in profiler, but it only helps with
> performance analysis. Is there a way (or a tool) to view all the instances
> that are kept in memory by the interpreter?
It's actually quite a problem, sigh. gc.get_objects() gives you a list
of all objects tracked by the garbage collector (pretty close to "all
the objects", period), but that's just the beginning: now you have to find
out what each object IS, why it is around (functions gc.get_referents and
gc.get_referrers can help), and, hardest of all, how much memory that
particular object IS actually consuming.
How to estimate the latter is a headache. E.g., consider a list of
integers. A list with 1,000,000 items which are all worth 1438397
presumably has a single copy of that particular int shared by all slots
(but that's not certain... you'd need to check for that), so you're
paying for a million slots plus one int (and there is no clear doc as to
how much that means in bytes on a given platform, btw). If the items
are all different, then you're paying for a million slots plus a million
ints (unless some of those ints are also referenced, and still needed,
elsewhere).
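The sharing, at least, can be checked directly; and later Pythons grew sys.getsizeof (2.6+), which reports the shallow cost of a single object, so a rough version of the slots-plus-ints arithmetic looks like this:

```python
import sys

# A list built by repetition shares one int object across all slots...
shared = [1438397] * 1000000
assert all(x is shared[0] for x in shared)  # every slot points at the same int

# ...while a list of distinct values pays for a million separate ints.
distinct = list(range(1000000, 2000000))

# getsizeof reports only the shallow size: the list header plus its
# slot array, NOT the int objects the slots point to.
shallow = sys.getsizeof(shared)
per_int = sys.getsizeof(1438397)

# Rough deep totals: slots plus one int (shared case) versus
# slots plus a million ints (distinct case).
deep_shared = shallow + per_int
deep_distinct = sys.getsizeof(distinct) + per_int * len(distinct)
```

The exact byte counts are platform- and version-dependent, which is precisely the "no clear doc" complaint above.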
Still, I really wish I had a call gc.memory_howmuch(objs) returning a
tuple of two ints X <= Y, telling me that the memory taken up by all the
objs together (the memory I could get back for other purposes if all
references to all of the objs went away) is somewhere between X and Y
bytes -- i.e. if all of the objs went away I'd save at least X bytes but
surely no more than Y. How small one could make Y-X, how tight the
bounds could be, may be harder to gauge, and may depend on how much
one wants to invest in digging deep down...
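No such call exists; gc.memory_howmuch is the author's wished-for API. But a crude pair of bounds in its spirit can be sketched by hand: take X as just the objects' own shallow sizes, and Y as everything transitively reachable from them via gc.get_referents (an overestimate, since referents may be shared with the outside world):

```python
import gc
import sys

def memory_bounds(objs):
    """Crude (X, Y) estimate: X counts only the given objects themselves,
    Y adds everything transitively reachable from them, so the truly
    reclaimable amount lies somewhere in between (shared referents
    make Y an overestimate, unshared ones make X an underestimate)."""
    x = sum(sys.getsizeof(o) for o in objs)
    seen = {id(o) for o in objs}
    stack = list(objs)
    y = 0
    while stack:
        o = stack.pop()
        y += sys.getsizeof(o)
        for ref in gc.get_referents(o):
            if id(ref) not in seen:
                seen.add(id(ref))
                stack.append(ref)
    return x, y

x, y = memory_bounds([[1438397] * 10, "hello"])  # X excludes, Y includes the int
```

Tightening these bounds would mean deciding, per referent, whether it is reachable from anywhere else -- gc.get_referrers can answer that, at considerable cost, which is exactly the "digging deep down" trade-off above.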