[Python-Dev] Evil setattr hack
Guido van Rossum
guido@python.org
Tue, 15 Apr 2003 14:33:48 -0400
[Guido]
> >I've checked in what I believe is an adequate block for at least
> >this particular hack. wrap_setattr(), which is called in response
> >to <type>.__setattr__(), now checks whether the C function it is
> >about to call is the same as the C function in the built-in base
> >class closest to the object's class. This means that if B is a
> >built-in class and P is a Python class derived from B,
> >P.__setattr__ can call B.__setattr__, but not A.__setattr__ where
> >A is an (also built-in) base class of B (unless B inherits
> >A.__setattr__).
> From: "Phillip J. Eby" <pje@telecommunity.com>
> Does this follow __mro__ or __base__?
It follows __base__, like everything concerned with C-level instance
layout.
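
Concretely, the hack being blocked looked like this from the Python
side (a minimal sketch; the exact TypeError wording may differ across
versions):

    >>> list.spam = 42          # normally refused for built-in types
    Traceback (most recent call last):
      ...
    TypeError: can't set attributes of built-in/extension type 'list'
    >>> object.__setattr__(list, 'spam', 42)   # the evil detour
    Traceback (most recent call last):
      ...
    TypeError: can't apply this __setattr__ to type object

The second call used to succeed, because object's tp_setattro doesn't
perform type's read-only check; wrap_setattr() now rejects it.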
> I'm specifically wondering about the implications of multiple
> inheritance from more than one C base class; this sort of thing
> (safety checks relating to heap vs. non-heap types and the "closest"
> method of a particular kind) has bitten me before in relation to
> ZODB4's Persistence package.
It is usually impossible to inherit from more than one C base class,
unless all but one are mix-in classes, meaning they add nothing to the
instance layout of a common base class.
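
For example (a sketch of the layout conflict):

    >>> class C(dict, list):    # two C bases with incompatible layouts
    ...     pass
    ...
    Traceback (most recent call last):
      ...
    TypeError: multiple bases have instance lay-out conflict
    >>> class Mixin(object):    # adds behavior, not instance fields
    ...     def helper(self):
    ...         return 'mixed in'
    ...
    >>> class D(dict, Mixin):   # fine: Mixin shares object's layout
    ...     pass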
> In that context, mixing 'type' and 'PersistentMetaClass' makes it
> impossible to instantiate the resulting metaclass, because neither
> type.__new__ nor PersistentMetaClass.__new__ is considered "safe" to
> execute.
You're referring to this error message from tp_new_wrapper(), right?
"%s.__new__(%s) is not safe, use %s.__new__()"
> My "evil hack" to fix that was to add an extra PyObject *
> to PersistentMetaClass so that it has a larger tp_basicsize than
> 'type' and Python then considers it the '__base__' type, thus
> causing its '__new__' method to be accepted as legitimate.
Is this because the algorithm in best_base() picks the wrong base
otherwise?
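
For what it's worth, the selection is observable from Python: among
multiple bases, __base__ ends up being the one whose solid base (the
base contributing the largest instance layout) is most derived (a
sketch, using __slots__ to enlarge the layout):

    >>> class A(object):
    ...     pass                    # adds no ivars beyond __dict__
    ...
    >>> class B(object):
    ...     __slots__ = ('x',)      # adds a real ivar to the layout
    ...
    >>> class C(A, B):
    ...     pass
    ...
    >>> C.__base__
    <class '__main__.B'>

If best_base() only prefers PersistentMetaClass once its size exceeds
type's, that would explain why the padding flips the choice.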
--Guido van Rossum (home page: http://www.python.org/~guido/)