Learning OOP...
Glyph Lefkowitz
glyph at twistedmatrix.com
Sun Jun 10 18:14:17 EDT 2001
On Sun, 10 Jun 2001, Alex Martelli wrote:
> "Glyph Lefkowitz" <glyph at twistedmatrix.com> wrote in message
> news:mailman.992131396.542.python-list at python.org...
> ...
> > Furthermore, I'd say that creating reusable code is *wrong*.
>
> Interesting strawman, but you haven't come anywhere close
> to 'proving' it in the following.
I'm sorry, I should have been more clear. "reusable" code is wrong;
"usable" code is right. Re-use implies recycling, dragging something out
of the garbage to use it again. For example, when you put on your new
pair of shoes for a second time, you usually don't think of it as re-use
(although you are technically "Using them again"), but it *is* re-use when
those shoes are a hand-me-down from an older relative, and have thus
exceeded their intended usefulness.
> ... "The unit of reuse is the unit of release" ... Robert Martin ...
Excellent! Where can I find some writings by this fellow? :)
> > projects. Keep your systems small and have them communicate clearly and
> > sparsely over encapsulation boundaries, and your code will be useful for
> > years; and you'll never have to re-cycle it.
>
> "Interface segregation principle", "dependency inversion principle",
> etc. Again, www.objectmentor.com has good papers on this:-).
I hardly think that object orientation has a monopoly on this approach :).
The reason that I don't think that these are really OO principles is that
the most successful component architecture and reuse framework that I've
ever seen is UNIX, which says nothing about objects; only executables,
files, and streams.
The most revolutionary development in software engineering was the
subroutine, closely followed by the data structure. The notion of
"objects" has been a useful tool to think with, but much more of a small
incremental improvement than the revolution of the subroutine :)
> > Insofar as OO helps, it is because OO has a simple and relatively
> > universal metaphor for communicating between programs written with vastly
> > different design assumptions, even in different languages: message sending
> > between objects. However, OO also has the notion of inheritance, which is
> > a *huge* impediment to reuse between projects (but can be useful for other
>
> Except that it's not, when applied well. Python shows well how
> suitably designed inheritance can make for smooth framework-code
> reuse, e.g. in sgmllib/htmllib and the whole server hierarchy. Code
> generators should generate base-classes that are specialized by
> inheritance, *NOT* by editing generated sources (a round-trip
> nightmare). Mixins allow seamless reuse. Etc, etc...
I haven't had good experiences with inheritance in either of those
examples you mentioned. "Aggregate, don't inherit". Especially in Python,
when one can easily forget the oh-so-subtle method chain:
    def __init__(self, *args, **kw):
        apply(self.__class__.__bases__[0].__init__, (self,)+args, kw)
        self.initOtherStuff()
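A minimal sketch of the aggregation alternative (class names here are my
own invention, for illustration only): instead of chaining up to a base
class's __init__, keep the collaborator as an attribute and delegate to
it explicitly; there is no method chain to forget:

```python
class Connector:
    """The behavior we want to use."""
    def __init__(self, host):
        self.host = host

    def connect(self):
        return "connected to %s" % self.host


class Client:
    """Aggregates a Connector instead of inheriting from it."""
    def __init__(self, host):
        # No base-class __init__ chain to remember: the
        # collaborator is just an attribute.
        self.connector = Connector(host)

    def connect(self):
        # Explicit delegation over an encapsulation boundary.
        return self.connector.connect()
```

Client("example.com").connect() gives back "connected to example.com",
and a careless subclass of Client can never silently skip Connector's
initialization.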
> It's true that implementation inheritance CAN too easily create
> unintended strong coupling. Scott Meyer's rule of thumb to
> "never inherit from a concrete class" (in a C++ setting) shows
> to what extremes one could go to ward off that danger...
A number of programmers I've met adhere to this rule (Moshe Zadka in
particular, I believe ^_^), and I aspire to it now. Twisted Python (PLUG:
http://twistedmatrix.com ) began life with a rather entangled inheritance
structure and I've almost completely deconstructed it... and it's much
easier to use for it.
> > things). Also, some systems which claim to be "OO" also have static
> > typing, which partially negates the value of sending messages.
> ...and partially enhances it.
Inheriting across language boundaries (where this communication benefit is
felt the most) is almost *never* a good idea. If there were a neater way
to say "I implement this" than inheriting a base class in Jython, I'd use
it. After all, if I'm inheriting a 'final' method from a Jython class and
I redefine it (or assign a similarly-named attribute to an instance), what
semantics should that have? What about multiple inheritance? Better yet,
trying to do cross-language metaclass hacks? :) It's a semantic accident
if any of that stuff works, which I think indicates the fragility of
inheritance.
> Again, it demands very solid design (I wouldn't be Pythoning if I
> didn't think the flexibility of dynamic typing was more often useful
> than the rigidity of static typing, but I do have to play devil's
> advocate on this thread, I think:-) but when you have it you're well
I can see the advantages of static typing and efficiency as well, but I
don't feel the need to play devil's advocate for them. What would that
make me? "God's advocate"? We can all see that Python has the divine
will on its side now... ;-)
> placed. Templates (or other mechanisms for compile-time genericity)
> recover a lot of that flexibility at no runtime cost -- Coplien's
> "pattern without a name" is IMHO mostly about that (template <class X>
> class Y: public X, etc). Basically, templates give you
> signature-based polymorphism, like Python's, at compile-time only
> (thus less flexible, but easier to generate speedy code for, and
> sometimes allowing somewhat earlier error diagnosis).
The "error diagnosis" argument I can't really argue with (C++ compilers
*do* describe more errors at compile-time), but it smells funny to me.
I'd guess that C++ just introduces more errors so it can diagnose them.
After all, which do you find is usually more error-ridden on the first run
-- C++ code or Python code you've written? :)
> > This is also, IMHO, against the spirit of object orientation. Smalltalk,
>
> I'd like to see you say that to Bertrand Meyer's face, given how
> intensely he maintains that "the spirit of OO" is statically typed!-)
> Me, I might quote the Bard by saying "a pox on both their houses",
> but I'll take the reverse tack, praising BOTH dynamic and static
> typing. I think OO works well with both, in different ways.
Mr. Meyer's thoughts notwithstanding, static typing is a way of making OO
more linearly isomorphic to structured programming. For example:
---
class Foo:
    def x(self):
        blah()
...
f = Foo()
f.x()
---
means that we have to figure out what kind of thing 'f' is before we go to
send the message 'x' to it. However,
---
class Foo {
public:
    void x() {
        blah();
    }
};
...
Foo* f = new Foo();
f->x();
---
is morally equivalent to
---
typedef struct _Foo { /* no data members */ } Foo;
void Foo_x(Foo* self) { blah(); }
...
Foo* f = malloc(sizeof(Foo));
Foo_x(f);
---
So the compiler figures out that it's Foo_x we want to call. If OO is the
difference between thinking of data as "behaviorally" rather than
"structurally" composed, then non-virtual functions (and the static type
information that makes them possible) are against the spirit of OO.
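To make the contrast concrete, here is a small sketch of my own (invented
names, not from the thread): with dynamic typing, the call site never
decides ahead of time which function gets called; the message is resolved
against whatever object actually shows up:

```python
class Socket:
    def describe(self):
        return "Socket.describe"


class Pipe:
    # Not related to Socket by inheritance at all.
    def describe(self):
        return "Pipe.describe"


def poke(obj):
    # The message 'describe' is looked up on the object at the
    # moment of the call -- there is no Socket_describe /
    # Pipe_describe decision baked in by a compiler.
    return obj.describe()
```

Both poke(Socket()) and poke(Pipe()) work; the dispatch is by behavior,
not by declared type.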
When giving this argument in the past, I've been told that my C example is
just code written "in an object oriented style". If that's all it takes
to have object oriented style, then OO is nothing but a technically
sophistocated way of saying "Look! Our programs can deal with both code
*AND* data!"
> > have anything to do with object orientation. Certainly, provable
> > correctness and static typing are at odds with reuse, since they make your
> > software more resistant to change.
>
> Making software resistant to change is neither an intrinsically
> bad thing, nor one at odds with software reuse. I DO want the
> software that I'm reusing to be unchangeable from the point
> of view of behavior that I, the reuser, observe.
IMHO making software resistant to change *is* an intrinsically bad thing.
Embrace change! :) Although I appreciate the desire for the behavior to
remain constant across project boundaries, I don't see that static typing
helps with this. Python's "file" interface convention has done a lot
towards aiding re-use; I believe that's partially because it's so easy to
implement.
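For instance (a sketch of the convention, with names of my own choosing,
not any particular library's API): anything that answers read() can stand
in for a real file, which is exactly why the interface is so cheap to
implement:

```python
class StringSource:
    """A minimal file-like object: it only needs to answer read()."""
    def __init__(self, text):
        self.text = text
        self.pos = 0

    def read(self, size=-1):
        # Mimic file.read(): return the rest, or up to 'size' chars.
        if size < 0:
            chunk = self.text[self.pos:]
            self.pos = len(self.text)
        else:
            chunk = self.text[self.pos:self.pos + size]
            self.pos += len(chunk)
        return chunk


def shout(fileobj):
    # Works on real files and on anything file-like.
    return fileobj.read().upper()
```

shout() neither knows nor cares whether it got a real file or a
StringSource; implementing one method was enough to join the convention.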
> > The easiest way to have a responsive, efficient program is to prototype
> > and prototype and prototype in a flexible, "slow" language, until you've
> > got the fastest possible high-level algorithm, then optimize for constant
>
> Nolo contendere here, but it's SO important it's worth quoting
> just to increase the likelihood that it gets read:-).
Yes, let's just leave it here :)
I also think I should throw in my favorite quote about object-orientation
here. From Ian Joyner's paper (book?) "C++??", at
http://www.elj.com/cppcv3/
In his conclusions:
Perhaps the most important realisation I had while developing this
critique is that high level languages are more important to
programming than object-orientation. [...]
In a nutshell, an object-oriented language that lacks the
qualities of a high level language entirely misses the point of
why we have progressed from machine coding to symbolic assembler
and beyond. **Without the essential high level qualities, OO is
nothing but hype.** Eiffel shows that it is important to be high
level as well as OO, and I hope that the lesson to be learned by
any programming paradigm, not just OO, is that the fundamental is
to make the task of programming (that is system development as a
whole) easier by the removal of the burden of bookkeeping.
(emphasis mine)
The irony, of course, is that he used this to pitch Eiffel; but my
argument is that Eiffel is not high-level *enough*, so we need Python ;-)
in-real-life-you-only-have-inheritance-when-people-die-ly y'rs,
______ __ __ _____ _ _
| ____ | \_/ |_____] |_____|
|_____| |_____ | | | |
@ t w i s t e d m a t r i x . c o m
http://twistedmatrix.com/users/glyph