Prothon Prototypes vs Python Classes
Hung Jung Lu
hungjunglu at yahoo.com
Mon Mar 29 07:02:25 CEST 2004
Harald Massa <cpl.19.ghum at spamgourmet.com> wrote in message news:<Xns94BBF152E6BBcpl19ghumspamgourmet at 220.127.116.11>...
> Can you explain to me in easy words, why it is NOT possible to integrate
> prototypes into Python to stand "side by side" with classes?
It is possible, but you will not be able to retroactively apply it to
many existing objects. You will only be able to do things with your
new customized objects.
For instance, there is a built-in class for modules, and in Python you
cannot add attributes to it. Similarly, there is the metaclass 'type',
and you cannot add attributes or insert hooks into it.
Either you start afresh with prototypes from ground up, or you won't
be able to modify the behavior of existing Python objects.
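The restriction can be seen directly in today's CPython. A small sketch (the class Point and the attribute names are invented purely for illustration):

```python
import types

# Your own classes and instances happily accept new attributes:
class Point:
    pass

Point.dims = 2          # fine: adding an attribute to your own class
p = Point()
p.x = 1                 # fine: adding an attribute to your own instance

# But built-in types such as 'type' and the module type are sealed;
# trying to add attributes raises a TypeError.
try:
    type.hook = lambda cls: None
except TypeError as e:
    print("type:", e)

try:
    types.ModuleType.hook = lambda mod: None
except TypeError as e:
    print("module:", e)
```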
I believe there were already some previous attempts along the lines
you describe.
> I never had a problem to "add an attribute" to an existing object; I really
> can't see why it should be more than some small hacks to allow "adding a
> function to an existing object".
Sure, adding an attribute to *your* objects is not an issue. Adding
attributes to and modifying the behavior of other people's objects is
the issue. These "other people's objects" include system objects, and
objects created by third parties.
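The asymmetry is easy to demonstrate: attaching a function to your own object works, attaching the same function to a built-in object does not. (The Dog class and the speak function here are invented for illustration.)

```python
import types

class Dog:
    pass

def speak(self):
    return "woof"

rex = Dog()
# Adding a function to *your* object: bind it as a method.
rex.speak = types.MethodType(speak, rex)
print(rex.speak())      # woof

# Adding the same function to somebody else's (built-in) object fails:
try:
    (42).speak = speak
except AttributeError as e:
    print(e)
```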
The "other people" often include yourself. It is hard to explain.
Maybe I can suggest reading my previous posting:
There are quite a few software development needs that one only
discovers when one goes to large projects, with various components,
maybe even in different languages.
It is only when things get complex that you wish you had a clean and
pure foundation. When your projects are small, deficiencies and
impurities in your language don't matter too much.
I think the current way OOP is taught is kind of bad. The lectures
would start with definitions of classes, inheritance, virtual
functions, and so on.
As I have mentioned a few times in this mailing list, software
engineering, and all human intellectual activities, ultimately come
down to factorization (of code, or of tasks). From simple algebra to
supersymmetric quantum field theory, it has been so. From goto
statements to OOP to metaclasses to AOP to prototype-based languages,
it has been so.
Instead of starting with dogmas and axioms, people can probably better
focus on factorization and how it happened. People don't just invent
OOP or prototype-based languages out of the blue, nor did they come up
with database normalization rules out of the blue. People arrived at
these
devices because they observed: (1) similar tasks or code spots all
over the place, that is, they discovered a symmetry, a repetitive
pattern, which was often becoming painful to deal with; (2) they then
figured out a way to factorize the code or organize the tasks, so as
to factor out the common part and make their lives less painful.
It's only after (2) that they invented a new concept or technology;
from then on they knew that in the future they could start right away
with the new approach, instead of writing the same code in two spots
and later having to factor it out.
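The two-step process can be sketched in a few lines of Python (the pricing functions are invented purely for illustration):

```python
# Step (1): the same discount logic appears in two spots -- a
# repetitive pattern that becomes painful as the code grows.
def price_book(base):
    return base * 0.9 if base > 100 else base

def price_toy(base):
    return base * 0.9 if base > 100 else base

# Step (2): the common part is factored out once, and from now on
# new pricing code starts from the factored form right away.
def discounted(base, threshold=100, rate=0.9):
    return base * rate if base > threshold else base

print(discounted(200))   # 180.0
print(discounted(50))    # 50
```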
I often don't know what to make of it when I see people talk about OOP
using definitions like polymorphism, data hiding, etc., as if these
definitions were of utmost importance. To me, OOP is just a
tool for factorizing code, just like using for-loops and using
functions to factor out repetitive code. Polymorphism, data hiding,
etc. are all secondary features: code factorization is the heart and
soul of OOP. Class-based OOP is a way of factorizing. Prototype-based
is just another way of factorizing, which seems to be more elegant:
instead of two concepts (classes and instances), you unify them and
have only one concept (objects). Moreover, in a prototype-based
language like Io, even scopes and objects are unified.
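Here is a toy sketch of the prototype idea in plain Python, not how Prothon or Io actually implement it: there are no classes involved in the lookup, only objects delegating to a parent object.

```python
class Proto:
    """Toy prototype object: failed attribute lookups delegate
    to a parent object instead of going through a class."""
    def __init__(self, parent=None, **slots):
        object.__setattr__(self, "_parent", parent)
        object.__setattr__(self, "_slots", dict(slots))

    def __getattr__(self, name):
        # Called only when normal lookup fails.
        slots = object.__getattribute__(self, "_slots")
        if name in slots:
            return slots[name]
        parent = object.__getattribute__(self, "_parent")
        if parent is not None:
            return getattr(parent, name)
        raise AttributeError(name)

    def __setattr__(self, name, value):
        # All writes go into the object's own slots.
        self._slots[name] = value


animal = Proto(legs=4)
dog = Proto(parent=animal)   # no class: dog simply delegates to animal
dog.sound = "woof"
print(dog.legs)    # 4, found on the parent by delegation
print(dog.sound)   # woof, stored on dog itself
```

There is only one kind of thing here, objects; "inheritance" is just one object deferring to another, which is the unification of classes and instances the paragraph above describes.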
In C++, many new programmers get confused about the usage of macros,
templates (generics in Java and C#), and multiple inheritance (mix-ins).
Sure, they are harder to read. But behind each device, the simple and
ultimate goal is nothing but code factorization. Metaprogramming in
Python? The same thing.
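A mix-in in Python makes the same point: the goal is nothing but factoring the common part out once. (ComparableMixin and Money are invented for illustration; the standard library's functools.total_ordering does the same job.)

```python
class ComparableMixin:
    # The common part, factored out once: derive every comparison
    # from the single primitive __lt__ supplied by the subclass.
    def __gt__(self, other):
        return other < self
    def __le__(self, other):
        return not (other < self)
    def __ge__(self, other):
        return not (self < other)

class Money(ComparableMixin):
    def __init__(self, cents):
        self.cents = cents
    def __lt__(self, other):
        return self.cents < other.cents

print(Money(100) > Money(50))    # True
print(Money(50) <= Money(50))    # True
```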
A CS professor friend of mine once said: "all problems in CS are
solved by just one more level of indexing," which has been very true
in my experience. I would like to say further that if someone truly
understands factorization and applies it at every moment, then he/she
should not only be awarded a Ph.D. in CS but perhaps also a Nobel
Prize.