weakrefs and bound methods

Alex Martelli aleax at mac.com
Sun Oct 7 21:55:29 CEST 2007

Steven D'Aprano <steve at REMOVE-THIS-cybersource.com.au> wrote:
> Without __del__, what should I have done to test that my code was 
> deleting objects and not leaking memory?

See module gc in the Python standard library.
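For instance, one way to check that your objects really are being freed (without writing a __del__) is to combine gc.collect() with a weak reference used as a probe -- a minimal sketch (the Node class and leak_check function are just illustrative names, not anything from the standard library):

```python
import gc
import weakref

class Node(object):
    pass

def leak_check():
    # build a reference loop
    a = Node(); b = Node()
    a.other = b; b.other = a
    probe = weakref.ref(a)   # weak ref: does not keep 'a' alive
    del a; del b
    gc.collect()             # collect the loop explicitly
    return probe() is None   # True means the objects were really freed

assert leak_check()
```

If the probe still resolves after gc.collect(), something is keeping the object alive and you have a leak to hunt down.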

> What should I do when my objects need to perform some special processing
> when they are freed, if I shouldn't use __del__?

The solid, reliable way is:

from __future__ import with_statement

and use module contextlib from the Python standard library (or handcode
an __exit__ method, but that's rarely needed), generating these special
objects that require special processing only in 'with' statements.  This
"resource acquisition is initialization" (RAII) pattern is the RIGHT way
to ensure timely finalization (particularly but not exclusively in
garbage-collected languages, and particularly but not exclusively to
ease portability to different garbage collection strategies -- e.g.,
among CPython and future versions of IronPython and/or Jython that will
support the with statement).
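For example, contextlib.closing guarantees that close() runs when the block exits, with no __del__ anywhere -- a minimal sketch (the Resource class is a made-up stand-in for any object with a close() method):

```python
from __future__ import with_statement   # needed on 2.5; a no-op later
from contextlib import closing

class Resource(object):
    """Hypothetical resource with a close() method."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

def use_resource():
    # closing() wraps the object in a context manager that calls
    # close() on exit from the with block, even on an exception
    with closing(Resource()) as r:
        assert not r.closed   # still open inside the block
        return r

r = use_resource()
assert r.closed               # close() ran when the block exited
```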

An alternative that will work in pre-2.5 Python (and, I believe but I'm
not sure, in Jython and IronPython _today_) is to rely on the weakref
module of the standard Python library.  If your finalizer, in order to
perform "special processing", requires access to some values that depend
on the just-freed object, you'll have to carefully stash those values
"elsewhere", because the finalizer gets called _after_ the object is
freed (this crucial bit of sequencing semantics is what allows weak
references to work while "strong finalizers" [aka destructors] don't
play well with garbage collection when reference-loops are possible).
E.g., weakref.ref instances are hashable, so you can keep a per-class
dict keyed by them to hold the special values that are needed for
special processing at finalization, and use accessors as needed to make
those special values still look like attributes of instances of your
class.

E.g., consider:

import weakref

class ClosingAtDel(object):
    _xs = {}
    def __init__(self, x):
        self._r = weakref.ref(self, self._closeit)
        self._xs[self._r] = x
    @property
    def x(self):
        return self._xs[self._r]
    @classmethod
    def _closeit(cls, theweakref):
        cls._xs[theweakref].close()
        del cls._xs[theweakref]

This will ensure that .close() is called on the object 'wrapped' in the
instance of ClosingAtDel when the latter instance goes away -- even when
the "going away" is due to a reference loop getting collected by gc.  If
ClosingAtDel had a __del__ method, that would interfere with the garbage
collection.  For example, consider adding to that class the following
test/example code:

class Zap(object):
    def close(self): print 'closed', self

c = ClosingAtDel(Zap())
d = ClosingAtDel(Zap())
print c.x, d.x
# create a reference loop
c.xx = d; d.xx = c
# garbage-collect it anyway
import gc
del c; del d; gc.collect()
print 'done!'

you'll get a pleasant, expected output:

$ python wr.py
<__main__.Zap object at 0x6b430> <__main__.Zap object at 0x6b490>
closed <__main__.Zap object at 0x6b430>
closed <__main__.Zap object at 0x6b490>
done!

Suppose that ClosingAtDel was instead miscoded with a __del__, e.g.:

class ClosingAtDel(object):
    def __init__(self, x):
        self.x = x
    def __del__(self):
        self.x.close()

Now, the same test/example code would emit a desolating...:

$ python wr.py
<__main__.Zap object at 0x6b5b0> <__main__.Zap object at 0x6b610>
done!

I.e., the assumed-to-be-crucial calls to .close() have NOT been
performed, because __del__ inhibits collection of reference-looping
garbage.  _Ensuring_ you always avoid reference loops (in intricate
real-life cases) is basically unfeasible (that's why we HAVE gc in the
first place -- for non-loopy cases, reference counting suffices;-), so
the best strategy is to avoid coding __del__ methods, just as Marc
suggested.
