Encapsulation, inheritance and polymorphism
steve+comp.lang.python at pearwood.info
Thu Jul 19 14:59:56 CEST 2012
On Wed, 18 Jul 2012 23:09:13 -0700, rusi wrote:
> Its not so much a question of language as in programming as language as
> in layman-speak.
> One characteristic with our field is that we take ordinary words and
> then distort them so much the original meaning is completely lost.
All technical fields have jargon. Those jargon terms are often more
precise than the ordinary terms they are derived from, or have a slightly
different meaning, or both.
This is not unique to programming, nor is it anything to be disturbed by.
Words change. What matters is whether the new meanings cause confusion or
not.
> Take 'computer' for example. For Turing a computer was a mathematician
> doing a computation with a pen and paper. He then showed how to
> de-skill the mathematician so much that a machine could do what he was
> doing. In trying that he also hit upon the limits of such 'de-skilling'
> -- human-computers routinely detect infinite loops, whereas
> machine-computers can never do so (in full generality).
Do they really? I doubt that. Human-computers *sometimes* detect infinite
loops, but there's no evidence that they can detect infinite loops in full
generality, or even that they are better at it than electrical computers.
In fact, the sheer number of accidental infinite loops programmed by human
beings suggests that people *cannot* routinely detect them, at least not
without special training, and even then not *routinely*.
Generally people detect infinite loops with an extremely simple-minded
heuristic: "if the loop hasn't finished by now, it will never finish".
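That heuristic can be sketched in a few lines of Python. (The
`looks_infinite` helper and its convention of signalling completion by
returning None are illustrative inventions, not anything from the thread.)

```python
import time

def looks_infinite(step, state, seconds):
    # The simple-minded human heuristic: run the loop body until it
    # signals completion (by returning None) or the time budget runs
    # out, at which point we give up and guess "never finishes".
    deadline = time.monotonic() + seconds
    while state is not None:
        if time.monotonic() > deadline:
            return True   # gave up: "probably infinite"
        state = step(state)
    return False          # the loop genuinely terminated
```

A countdown that reaches None is judged finite; a step that never
changes its state exhausts the budget and is judged "infinite", even
though the verdict is only a guess.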
The fact that we don't, as a general rule, program our computers to
detect infinite loops does not mean that they cannot do so at least as
well as humans, and probably better, when we bother to program them to.
For example, both ML and Haskell can, under some circumstances, report a
type-error for an infinite loop, *at compile time*.
If you think that people can routinely detect infinite loops, then
perhaps you would care to tell me whether this is an infinite loop or not:
i = 1
while not is_perfect(i):
    i += 2
print "odd perfect number discovered"
where is_perfect() returns True if the integer argument is perfect, and
False otherwise.
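For readers who want to try the snippet, a minimal is_perfect() along the
usual definition (a number that equals the sum of its proper divisors)
might look like this; the implementation is an illustrative sketch in
modern Python, not part of the original post:

```python
def is_perfect(n):
    # A positive integer is perfect when it equals the sum of its
    # proper divisors, e.g. 6 = 1 + 2 + 3 and 28 = 1 + 2 + 4 + 7 + 14.
    if n < 2:
        return False
    total = 1          # 1 divides every n > 1
    d = 2
    while d * d <= n:  # divisors come in pairs (d, n // d)
        if n % d == 0:
            total += d
            if d != n // d:
                total += n // d
        d += 1
    return total == n
```

Whether the loop above ever terminates is exactly the open question of
whether an odd perfect number exists, which is the point of the example.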
> 'Object' (and OO) are similar messes.
> In layman-speak and object is well, a thing.
Is a cloud a thing? What about a fog bank? An ocean?
A desert? The atmosphere?
Is the paint on your car a thing?
I have an axe that has been passed down for generations through my
family, from my father, his father before him, and his father, and his
father before him. Occasionally we replace the handle, or put on a new
head, but that axe is almost as good as the day my great-great-
grandfather made it.
Is that axe a thing?
Just because laymen toss around a word, doesn't mean that it is a well-
defined, meaningful word.
> When it doesn't the success is poorer. eg a programmer writing math
> software in/on a OO system may for example 'clone' a matrix. This may
> be good science-fiction; its bad math.
In what way is it bad maths?
> And one of the most pervasive (and stupidist) metaphors is the parent-
> child relation of classes.
> Just for the record, in the normal world 'creatures/beings' reproduce
> and therefore inherit.
Incorrect. In the normal world, "inherit" is used in at least four
distinct senses:
1) to inherit wealth, property, a title, debts etc. from an ancestor
upon their death;
2) to receive or take by birth, to have by nature, physical or mental
qualities, e.g. "he inherited his tendency to melancholy from his
father";
3) to come into possession of, e.g. "the meek shall inherit the earth";
4) to receive from a predecessor, e.g. "the Prime Minister has inherited
an economy in dire straits".
It is clear that the sense of inheritance used in OO programming is sense
#2, to have by nature.
> Objects dont.