Python 1.6 The balanced language

Alex Martelli aleaxit at yahoo.com
Fri Sep 1 17:18:07 EDT 2000


"Darren New" <dnew at san.rr.com> wrote in message
news:39AFD71D.134D0FB7 at san.rr.com...
> Alex Martelli wrote:
> > So, this needs to be specified anyway, quite apart from the "safe
> > to collect" issue; it's a more general question of whether it's
> > safe to *alter* objects through certain references ("collecting"
> > being just one way to perform such alterations, albeit a very
> > particular one).
>
> Depends on what you mean by "safe". If by "safe" you mean "according to
> specification of the problem", then yes. If by "safe" you mean what
> language designers mean, which is "defined behavior according to the
> language specification", then no.

"safe" as in "will not make the program do things it shouldn't".  Whether
a program crashes because the language specification is violated, or
because a semantic constraint (precondition) is, it's still a crash (and
not quite as bad as a program NOT crashing but corrupting persistent
data in subtly-wrong ways...).


> In other words, if you change an object to which I'm holding a reference,
> I'll still do what you told me to. If you GC an object to which I'm still
> holding a reference, you can't define the semantics of that operation. It's

But neither can I change the semantics of what happens in other
cases, e.g. a divide by zero; whether that raises a catchable exception
or not depends on the language (and possibly on the implementation,
if the language leaves it as implementation-defined).

Will you therefore argue that a language, to be "object-oriented", must
give certain fixed semantics to divide-by-zero...?
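
Just to make that concrete: Python's own choice here happens to be
a catchable exception, as in this trivial sketch:

try:
    print 1/0
except ZeroDivisionError:
    print "caught it"

Another language could just as legitimately crash, saturate, or
leave the behaviour undefined, without thereby being any less (or
more) object-oriented.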

Note that this ties back to the issue of whether I can mutate an
object after giving out a reference to it -- i.e. whether the
receiving class will hold on to that reference (and, if so, whether
it will rely on it staying as it was) or make its own copies.  That's
part of the receiving class's semantics; I need to know the contract
obligations of the receiving class to perform my part of the bargain.
It's NOT an issue of having to know the _implementation_ of the
receiving class, but one of design-by-contract... and please don't
tell me, or B.M., that dbc is not OO:-).

Consider:

class PreconditionViolation(Exception):
    pass

def validateformat(formatstring):
    # minimal format-string validation: must accept one number
    formatstring % 1
    return formatstring

class datastuff:
    def __init__(self, denominator, formatstring="%s"):
        self.denominator = denominator
        self.formatstring = formatstring

class divider:
    def __init__(self, datastuff):
        # save a private, validated copy of the denominator...
        self.__denom = getattr(datastuff, 'denominator', 0)
        if not self.__denom:
            raise PreconditionViolation
        self.format = validateformat(getattr(
            datastuff, 'formatstring', "%s"))
        # ...but also keep a reference to the instance itself
        self.theds = datastuff
    def oper1(self, numer):
        # divides by the value checked and saved at __init__ time
        return self.format % (numer / self.__denom)
    def oper2(self, numer):
        # divides by whatever the shared instance holds right now
        return self.format % (numer / self.theds.denominator)

Can I rely on divider.oper1 not dividing-by-zero?  Yes,
presumably (assuming I haven't violated contracts e.g.
by messing with the private __denom field): __init__
tests to ensure the __denom it saves is not zero, and
that's what oper1 divides by.  But can I rely on oper2?
No, that's subject to what I may have been doing to
the .denominator field of the datastuff -- because the
divider class is keeping a reference to that specific
instance and using it without re-testing.
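
For instance, a made-up interactive fragment using the classes above
(and the little helpers defined with them) might go:

ds = datastuff(4, "%.2f")
d = divider(ds)
print d.oper1(12)     # "3.00": divides by the privately-saved __denom
ds.denominator = 0    # client mucks with the shared instance...
print d.oper1(12)     # still "3.00": oper1 never re-reads datastuff
print d.oper2(12)     # ZeroDivisionError: oper2 trusts the live reference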


So, I need to know what divider's contract says about
keeping a reference to the datastuff instance passed to
__init__, or not.  If divider is free to keep such a
reference, and to assume it stays valid by its semantic
criteria (non-zero denominator), then the client code is
not free to muck with that field; and vice versa.  The
situation is not in the least different if the "mucking" is
the 'freeing' of the instance, or if it's changing the
nature of the object or its fields (a del on the
denominator field could cause an AttributeError, etc).
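
Continuing the made-up fragment above, for instance:

del ds.denominator    # change the very nature of the shared object
print d.oper1(12)     # unaffected: works from its own saved __denom
print d.oper2(12)     # AttributeError: no attribute 'denominator'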

It's sure nice if all errors are specified to cause some
trappable exception rather than an outright crash, but
surely that's not a part of "object-orientedness" -- it's
just a nice feature that a language can specify (albeit at
a cost, possibly a huge one, for ports to platforms without
good hardware/OS support for trapping certain kinds of
invalid operations).

But the point is that "having to know how a class is
implemented" is not what determines whether one may
free/dereference/change an object, its fields, etc.
Rather, such freedom is granted to client software by
a mix of language behaviour *and the semantic contract
of the class being used*.  Not *implementation*, note:
*contract*.  An important difference.


> like the difference between saying "indexing off an array throws an
> exception" and "indexing off an array overwrites the interrupt vectors with
> random garbage". The first is still "safe", while the second isn't.

But neither issue defines whether a language is OO.

> > but tying the "OO"ness
> > of a language to it seems to be a typical case of "hidden agenda"
>
> Uh... As I said, that was an Alan Kay quote.  If you don't know, Alan Kay
> was arguably the guy who *invented* OO. It's hard to see how he could have a
> hidden agenda against languages that weren't invented at the time.

He co-invented Smalltalk, but Simula had been around for a while (and
Kay had used it at University).  And does the quote date from before
the birth of the other languages I'd call object-oriented?  It's so
suspiciously similar to the FAQ that some guy from Geodesic used (no
more than 10 years ago, I'm sure, and _without_ quoting Kay) in a
not-so-hidden attempt to sell his company's product (rather than just
making his cases, for GC and against the RC approach to GC, he
claimed [and didn't prove] that GC was *NECESSARY* [note: there is
a HUGE gulf between "extremely useful, handy, and all-around good",
and "necessary"; I get peeved very easily by such attempts to play
semantic legerdemain...] to object-orientation).


> > "subtyping exists" is another contentious issue, of course; if I
> > removed "extends" from Java, requiring all polymorphous use to
> > go through "interface" and "implement" (as, e.g., Coad has suggested
> > as best design practice), would I "totally lose OO-ness"?  Are
> > languages based on 'prototype objects' (with no classes at all)
> > "not OO"?  Pah...
>
> Well, Kay figured it was dynamic dispatch and GC. I personally think you
> also need some sort of "class" concept, but I'm ambivalent on how much
> inheritance you'd need. I've never used a system with classes and without
> inheritance.

See, e.g., http://www-staff.mcs.uts.edu.au/~cotar/proto/shah.txt
"""
The terms class-based approach and object-oriented paradigm have
become almost synonymous in the  computer science community and
many people think of every object-oriented system in terms of the
class-based approach. ... in fact the prototype-based approach is
simpler, and more flexible and powerful than the class-based approach.
"""

You may disagree, but I think the onus of proof is definitely on you.

Why do you think the "Self" programming language is not
object-oriented, although it describes itself as such and is
generally accepted as such?  See
http://www.sun.com/research/self/language.html for more.

I don't mind the class-ic approach to OO, but prototype-based OO
has its own advantages, and it seems peculiar to rule it out as
OO at all merely because it lacks 'some sort of class concept'.

Dynamic dispatch, polymorphism, is what I'd call *the* one and
only real discriminant of 'OO'.


Alex





