is decorator the right thing to use?

Dmitry S. Makovey dmitry at athabascau.ca
Fri Sep 26 18:38:46 EDT 2008


Bruno Desthuilliers wrote:
> Hem... I'm afraid you don't really take Python's dynamic nature into
> account here. Do you know that even the __class__ attribute of an
> instance can be rebound at runtime ? What about 'once and for all' then ?

That must have been poor wording on my part. The dynamic nature is exactly
what I wanted to use :) except that I don't expect clients to take advantage
of it while using my classes ;)

>>> Your original question was "is decorator the right thing to use?"  For
>>> this application, the answer is "no".
>> 
>> yeah. seems that way. in the other fork of this thread you'll find my
>> conclusion which agrees with that :)
>> 
>>> It sounds like you are trying
>>> to force this particular solution to your problem, but you are
>>> probably better off giving __getattr__ interception another look.
>> 
>> __getattr__ implies constant lookups and checks (for filtering purposes)
> 
> Unless you cache the lookups results...

sure

>> - I
>> want to do them once, attach generated methods as native methods
> 
> What is a "native method" ? You might not be aware of the fact that
> method objects are usually built anew from functions on each method
> call...

Again, poor wording on my part. By "native" I meant: make as much use as
possible of the existing machinery and override default behaviour only when
it's absolutely necessary (hopefully my wording is not off this time ;) ).
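
(A quick illustration of Bruno's point that bound methods are built afresh on
each attribute access -- made-up class Foo:)

class Foo(object):
    def ping(self):
        pass

f = Foo()
print f.ping is f.ping   # False: each access creates a new bound method object
print f.ping == f.ping   # True: both wrap the same function and instance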

>> and be
>> done with it. That is why I do not like __getattr__ in this particular
>> case.
> 
> There's indeed an additional penalty using __getattr__, which is that
> it's only called as a last resort. Now remember that premature
> optimization is the root of evil... Depending on effective use (ie : how
> often a same 'proxied' method is called on a given Proxy instance, on
> average), using __getattr__ to retrieve the appropriate bound method on
> the delegate then adding it to the proxy instance *as an instance
> attribute* might be a way better (and simpler) optimization.
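
(A rough sketch of that caching idea, assuming a hypothetical Proxy class
wrapping a single delegate; filtering of which names get proxied is left out:)

class Delegate(object):
    def work(self, x):
        print "Delegate.work", x

class Proxy(object):
    def __init__(self, delegate):
        self._delegate = delegate

    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        meth = getattr(self._delegate, name)
        # cache the bound method on the instance so the next access
        # finds it directly and never reaches __getattr__ again
        setattr(self, name, meth)
        return meth

p = Proxy(Delegate())
p.work('first')    # goes through __getattr__, result gets cached
p.work('second')   # found in the instance dict, __getattr__ not called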

I actually ended up rewriting things (loosely based on George's suggested
code) with descriptors, using neither metaclasses nor decorators (so much for
my desire to use them).

With the following implementation (unpolished at this stage, but already
functional) I can have several B instances inside an A object and proxy
certain methods to one or another of them (I may well end up with A.b1 and
A.b2, both of class B, with some methods forwarded to b1 and others to b2,
maybe even multiplexed). This also seems fairly light, with no need to scan
instances (well, except for one getattr, but I couldn't get around that).
Maybe I didn't account for some shoot-in-the-foot scenarios, but I can't come
up with any. Last time I played with __getattr__ I shot myself in the foot
quite well, BTW :)

# Python 2 code: ProxyMethod is a descriptor that forwards a method call
# to a delegate object stored on the owning instance.
class ProxyMethod(object):

    def __init__(self, ob_name, meth):
        self.ob_name = ob_name   # attribute name of the delegate on the owner
        self.meth = meth         # unbound method to forward to, e.g. B.bmethod

    def my_call(self, instance, *argv, **kw):
        # used when the descriptor is accessed on the class itself;
        # the owner instance is then passed explicitly, like an unbound method
        ob = getattr(instance, self.ob_name)
        cls = self.meth.im_class
        return self.meth.__get__(ob, cls)(*argv, **kw)

    def __get__(self, instance, owner):
        if instance is None:     # accessed via the class, e.g. A.bmethod
            return self.my_call
        # accessed via an instance: bind the target method to the delegate
        ob = getattr(instance, self.ob_name)
        cls = self.meth.im_class
        return self.meth.__get__(ob, cls)

class B:
    def __init__(self):
        self.val = 'bval'

    def bmethod(self, a):
        print "B::bmethod",
        print a, self.val

class A:
    b = None

    def __init__(self, b=None):
        self.val = 'aval'
        self.b = b
        if b is not None:   # guard so A() without a delegate still works
            b.val = 'aval-b'

    def mymethod(self, a):
        print "A::mymethod, ", a

    # forward a.bmethod(...) to the B instance stored in self.b
    bmethod = ProxyMethod('b', B.bmethod)

b = B()
b.bmethod('foo')        # B::bmethod foo bval
a = A(b)                # a.b is this first B; its val becomes 'aval-b'
b = B()                 # rebind the local name; a.b still refers to the first B
b.val = 'newval'
a.mymethod('baz')       # A::mymethod,  baz
a.bmethod('bar')        # B::bmethod bar aval-b   (dispatched to a.b)
A.bmethod(a, 'zoom')    # B::bmethod zoom aval-b  (class-level access also works)
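
(And the A.b1/A.b2 case mentioned above would just be more of the same; the
class C and its attribute names below are made up for illustration:)

class C:
    def __init__(self, b1, b2):
        self.b1 = b1
        self.b2 = b2

    # some methods forwarded to b1, others to b2, both delegates of class B
    bmethod1 = ProxyMethod('b1', B.bmethod)
    bmethod2 = ProxyMethod('b2', B.bmethod)

c = C(B(), B())
c.b2.val = 'b2val'
c.bmethod1('x')    # B::bmethod x bval
c.bmethod2('y')    # B::bmethod y b2val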



