[Types-sig] Instances vs. Classes [vs. Types]
Just van Rossum
just@letterror.com
Tue, 8 Dec 1998 01:47:00 +0100
Gordon McMillan wrote:
>For example, you get rid of the difference between an instance and a
>subclass, then reimpose one simply for the sake of having one.
>(Whatsamatta, Just, you chicken? Huh??).
I'm afraid of losing the Python I love. I'd hate to call the neighbors and
ask whether they've seen it. On the other hand, if I were a chicken, I'd be
Python poop by now.
>Disposing of that distinction (depending on how it's done) could
>have quite a few consequences. Conceptually, current Python
>fits with class models that say an instance is fundamentally
>different than a class. (Python's dynamic nature does make it
>somewhat more difficult to classify than most languages). Classes
>hold behavior, instances hold state, and an instance's class can be
>regarded as one, unified thing, created by munging together all of
>the bases. (The fact that __bases__ is a tuple of a bunch of
>different objects is an implementation detail; behavior is pretty
>much a done deal once __bases__ has been specified).
Ok, I'm here.
[ ... ]
>To tell the truth (generally a bad idea on Usenet <wink>) I'm not
>quite sure what I'm whining about. Is it:
>
>1) In current Python, all the mutable stuff (in the absence of
>sufficient perversity) is confined to the instance, which makes me
>feel safe and comfortable, (stop laughing, Tim!).
>
>2) The fact that it appears this proposal will mean that the
>inheritance DAG will actually have something to do with the runtime
>structure.
Well, it's possible to muck with it at runtime now, too:
    class A: pass
    class B: pass
    print B.__bases__
    B.__bases__ = (A,)
    print B.__bases__
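(For the record, reassigning __bases__ still works in today's Python, as
long as the old and new bases are layout-compatible -- a hedged sketch,
with the Base/A/B/greet names mine:)

```python
class Base:
    pass

class A(Base):
    def greet(self):
        return "hello from A"

class B(Base):   # starts out unrelated to A
    pass

b = B()
# b.greet() would raise AttributeError here.
B.__bases__ = (A,)       # rewire the inheritance DAG at runtime
print(b.greet())         # existing instances pick up the new base
```

Note the common Base: CPython refuses some __bases__ assignments when the
old base is object itself, so sharing a non-trivial base sidesteps that.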
>Coming from the C++ world,
Talk to Andrew Dalke, he goes to these meetings that seem to help him a lot.
>I'm used to inheritance being
>simply a way to structure code, not objects.
It's the same in Python, but the resolution is done at runtime. Python's
slowness has to come from *something*, doesn't it? Since Guido mentioned
that Python 2.0 will probably initially be even slower than it is now, we
might as well help him a little in materializing that goal by demanding
ridiculously expensive attribute lookup schemes.
>3) Confusion about "static" methods vs. "class" methods; what the
>latter are useful for, and if the 2 flavors can co-exist.
[ ... ]
>Is this what "class" methods (with a __class_self__) are for?
No, they're class methods: they work on class objects.
>How do they behave when subclassed?
That's the issue. I don't know yet. My proposal sucks eggs big time in this
area.
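(For what it's worth, the Python that eventually shipped grew both
flavors, so the distinction can at least be sketched in today's terms --
classmethod/staticmethod postdate this thread, and the Counter names are
mine:)

```python
class Counter:
    count = 0

    @classmethod
    def make(cls):
        # a class method receives the class object itself, so a
        # subclass calling it gets subclass behavior for free
        cls.count += 1
        return cls()

    @staticmethod
    def describe():
        # a static method receives nothing implicitly: it is just a
        # plain function that happens to live in the class namespace
        return "a counter"

class SubCounter(Counter):
    count = 0

obj = SubCounter.make()          # cls is SubCounter here, not Counter
print(type(obj).__name__)        # SubCounter
print(Counter.describe())        # a counter
```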
>I'm also concerned with the installation of __getattr__ hooks. You're
>currently looking for *some* __getattr__ hook in the bases, and using
>that. But is that a depth first search? Or just a search of
>__bases__?
Depth first: Python's standard behavior. It should take the first it finds.
But since the version of the article you've read, I've changed my mind. I
called it __raw_getattr__ instead, and that one will get called for all
attribute access if available. The old and trusty __getattr__ could do the
same as it does now: act as a fallback when the builtin mechanism has
failed. That means if you want a __raw_getattr__ *and* a __getattr__, your
__raw_getattr__ is the boss: it has to take care of finding and executing
the custom __getattr__. If it wants. It's the boss.
>If I have:
>
>class A(Trace):
> #blah
>
>class B(A)
> def mymethod(self, arg): #blah
>
>b = B()
>
>Will "b.mymethod(1)" trace?
Of course! Unless B implements its own __raw_getattr__ that _doesn't_ trace.
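(A hedged sketch of such a Trace base, using __getattribute__ -- today's
rough equivalent of the __raw_getattr__ hook; the wrapper details are
mine:)

```python
class Trace:
    def __getattribute__(self, name):
        value = object.__getattribute__(self, name)
        # wrap ordinary methods so every call gets reported
        if callable(value) and not name.startswith("__"):
            def traced(*args, **kwargs):
                print("calling", name, "with", args)
                return value(*args, **kwargs)
            return traced
        return value

class A(Trace):
    pass

class B(A):
    def mymethod(self, arg):
        return arg * 2

b = B()
print(b.mymethod(1))   # traces the call, then prints 2
```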
>And more generally (in anybody's scheme), how do you handle multiple
>__getattr__ hooks? Just take the first? The last?
If we have an inheritance graph (technically, a Directed Acyclic Graph, or
Deep Acid Graph) like this:
D(C(B(A))), i.e. B is a subclass of A, etc.
C might define a raw_getattr but D doesn't; A defines one too, but B doesn't.
If we ask D for an attribute, it will call C's raw_getattr, which might
want to ask B for an attribute, which will invoke A's raw_getattr. Etc.
This sounds more inefficient than it is: we have to recursively traverse
the bases of each anyway, probably by calling some function. Whether that
function is the same one or a different one each time doesn't matter that
much. Of course a C implementation will always beat a Python version hands
down, and the more tricks like this you do, the more you will pay. These
__raw_getattr__ methods could obviously be cached, although that sacrifices some
dynamic-ness: what if we want to muck with __bases__? (And I do...) So
you're right. It stinks.
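(In today's Python the "each hook calls the next" chain falls out of
cooperative super() calls; a hedged sketch of the D(C(B(A))) situation
above, with __getattribute__ standing in for __raw_getattr__:)

```python
class A:
    def __getattribute__(self, name):
        print("A's hook sees", name)
        return super().__getattribute__(name)

class B(A):
    pass          # no hook of its own: lookup passes straight through

class C(B):
    def __getattribute__(self, name):
        print("C's hook sees", name)
        return super().__getattribute__(name)

class D(C):
    pass          # no hook either: inherits C's

d = D()
d.x = 1
print(d.x)        # C's hook fires first, then A's, then the value
```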
And what will happen if I do this:
    class A:
        def __raw_getattr__(self, attr): # blah

    class B:
        # no __raw_getattr__
        def foo(self): # blah

    a = A()
    b = B()
    a.__bases__.append(b) # cheap delegation!?
???
That's an idea I have: since any object can have more bases, instances can
have more bases, too. In the above example I may want to fetch methods
through 'a' that belong to 'b', and which could therefore be bound to 'b'! Using my
__name__ artifact (shut it Gordon!) we could maybe stop re-binding in the
getattr chain at the last nameless object. a.foo() would be bound to b.
Catch my drift?
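(Instances can't grow extra bases in any Python that exists, but the
cheap-delegation idea can be emulated with a __getattr__ fallback -- a
hedged sketch, and Delegator is my name for it:)

```python
class Delegator:
    def __init__(self, *pseudo_bases):
        self.pseudo_bases = list(pseudo_bases)

    def __getattr__(self, name):
        # only called once normal lookup has failed
        for base in self.pseudo_bases:
            try:
                # the result stays bound to the pseudo-base,
                # just as proposed for a.foo() above
                return getattr(base, name)
            except AttributeError:
                pass
        raise AttributeError(name)

class B:
    def foo(self):
        return "foo, bound to " + type(self).__name__

a = Delegator(B())     # roughly: a.__bases__.append(b)
print(a.foo())         # foo, bound to B
```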
>Let one wrap the
>attribute and hand it to the next to wrap again?
Maybe not wrap again but rather _bind_ again.
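(Re-binding is exactly what today's descriptor protocol allows: a plain
function's __get__ binds it to any object -- a hedged sketch with names of
my own:)

```python
class A:
    def whoami(self):
        return type(self).__name__

class B:
    pass

b = B()
# Take A's function and re-bind it to an unrelated instance.
rebound = A.whoami.__get__(b)
print(rebound())    # B -- same function object, different binding
```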
everything-is-a-getattr-function-ly y'rs -- just