[Python-Dev] Weakref design questions

Guido van Rossum guido@python.org
Sat, 19 Oct 2002 06:54:44 -0400


[Brian]
> > That is definitely one possible way to do it. However, I am
> > wrapping a complete DOM, with dozens of objects containing,
> > collectively, hundreds of methods.
> > 
> > Adding an explicit check to each method seemed like a lot more
> > pain than using proxies.

[Martin]
> You don't have to add it to every method. You can perform the check in
> tp_getattro before performing the method lookup. Alternatively, you
> can change the ob_type of the object to simply drop the methods that
> are not available anymore.
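
For illustration, a rough Python-level equivalent of that tp_getattro
check (the Node class and its _alive flag below are made up, not taken
from Brian's code) would be a __getattribute__ override that refuses
lookups once the wrapped object is gone:

  class Node:
      """Hypothetical stand-in for one of Brian's DOM wrapper objects."""

      def __init__(self):
          self._alive = True          # cleared when the C object goes away

      def invalidate(self):
          self._alive = False

      def __getattribute__(self, name):
          # Python-level analogue of checking in tp_getattro before the
          # method lookup: refuse every lookup once the object is dead.
          if name != "_alive" and not object.__getattribute__(self, "_alive"):
              raise ReferenceError("underlying DOM object is gone")
          return object.__getattribute__(self, name)

      def meth(self):
          return "still here"

  # n = Node(); n.invalidate(); n.meth  now raises ReferenceError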

No, his second example defeated that:

  def evil_python_callback(o):
      global evil_method
      evil_method = o.meth

  def called_later():
      evil_method()  # SegFault

This does the getattr while the object is still alive, but makes the
call later, when the object may already be gone.

To Brian: I really don't want to make bound methods proxyable.  (And
it wouldn't help you anyway until Python 2.3 is released.)  If you
really don't want to modify each method, you can write your own
callable type and use that to wrap the methods; this can then check
whether the object is still alive before passing on the call.  If you
think this is too expensive, go back to checking in each method --
since a check is needed anyway, you can't get cheaper than that.  (A
weakref proxy isn't free either.)  To signal invalidity, the Python
wrapper would have to set the 'o' pointer to NULL.
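
For concreteness, here is a rough pure-Python sketch of such a callable
wrapper (CheckedMethod is a made-up name, and the _alive flag plays the
role of the 'o' pointer being set to NULL):

  class CheckedMethod:
      """Hypothetical wrapper: check that the target object is still
      alive before passing on each call."""

      def __init__(self, obj, func):
          self._obj = obj           # the wrapper object (e.g. a Node)
          self._func = func         # the underlying plain function

      def __call__(self, *args, **kwds):
          # Stand-in for the C-level test "self->o != NULL".
          if not self._obj._alive:
              raise ReferenceError("method called on a dead object")
          return self._func(self._obj, *args, **kwds)

  # Usage, with a Node like the one sketched above:
  #   m = CheckedMethod(n, Node.meth)
  #   m()               # fine while n is alive
  #   n.invalidate()
  #   m()               # now raises ReferenceError instead of crashing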

I also agree with Martin's observation that weakref proxies aren't
intended to make the original object undiscoverable.

--Guido van Rossum (home page: http://www.python.org/~guido/)