Basic inheritance question

Lie Lie.1296 at gmail.com
Mon Jan 21 12:26:35 EST 2008


> > Please stop taking my words to its letters.
>
> So we're supposed to actually guess what you really mean ???

That's what humans do; otherwise you'd "fail the Turing Test".

> >> Personally, I've seen many C++ programs with complex class designs
> >> where it definitely helps to consistently use "this->". I cannot
> >> remember all local (and global) variables in bigger methods.
>
> > In that case, you have the _option_ to do it.
>
> Makes no sense when maintaining code written by someone who didn't use
> this 'option'.
>
> (snip)
>
>
>
> >>>> it's the first argument of the function - which usually happens to be
> >>>> the current instance when the function is used as a method.
> >>> And that's the point, self (or anything you name it) is almost always
> >>> the current instance
> >> # this is a plain function. In this function,
> >> # 'obj' can be whatever that happens to have a (numeric)
> >> # 'stuff' attribute
> >> def func(obj, arg):
> >>    return (obj.stuff + arg) / 2.0
>
> >> # this is a class with an instance attribute 'stuff'
> >> class Foo(object):
> >>     def __init__(self, bar):
> >>       self.stuff = bar + 42
>
> >> # this is another (mostly unrelated) class
> >> # with a class attribute 'stuff'
> >> class Bar(object):
> >>    stuff = 42
>
> >> # this is a dummy container class:
> >> class Dummy(object): pass
>
> >> # now let's play:
> >> import new
>
> >> d = Dummy()
> >> d.stuff = 84
> >> print func(d, 1)
>
> >> d.baaz = new.instancemethod(func, d, type(d))
> >> print d.baaz(2)
>
> >> f = Foo(33)
> >> print func(f, 3)
> >> Foo.baaz = func
> >> f.baaz(4)
>
> >> print func(Bar, 5)
> >> Bar.baaz = classmethod(func)
> >> Bar.baaz(6)
>
> >>>  and that makes it functionally the same as Me and
> >>> this in VB and Java.
> >> Depends on the context, cf above !-)
>
> > Please again, stop taking letters to the words, I don't meant them to
> > be exactly the same, rather the same would meant that they generally
> > can be considered equal,
>
> If you still think that way after having read the above code, then I
> can't help. We obviously don't share the same mental model here.

I don't get what you're trying to say; in that piece of code, self is
used in the regular way, in a regular context.

> > exceptions exists of course. And btw, I don't
> > understand what you meant by your example, they seemed to be a
> > completely OK program for me,
>
> Indeed it's ok (even if totally useless by itself). The point is that
> 'self' (or whatever you name it) is just and only the first argument of
> a function, period.

And it is... but I still don't get what you mean.
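For anyone following the code upthread: the quoted example uses the Python 2 `new` module, which no longer exists. A minimal sketch of the same idea in modern Python 3 (the names `func`, `Dummy`, `stuff` and `baaz` are taken from the quoted example) might look like this:

```python
import types

# a plain function: 'obj' can be anything that happens
# to have a numeric 'stuff' attribute
def func(obj, arg):
    return (obj.stuff + arg) / 2.0

# a dummy container class
class Dummy(object):
    pass

d = Dummy()
d.stuff = 84

print(func(d, 1))    # used as a plain function: prints 42.5

# bind the very same function object to the instance as a method;
# types.MethodType is the Python 3 stand-in for new.instancemethod
d.baaz = types.MethodType(func, d)
print(d.baaz(2))     # used as a bound method: prints 43.0
```

The point the quoted code makes is the one stated above: 'self' (or 'obj', or whatever) is just the first argument of a function.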

> > even though it's a bit confusing to
> > follow[2].
>
> Nothing in it should be confusing to anyone having a decent knowledge of
> Python's object model IMHO.
>
> > [2] btw, the reason it's a bit confusing to follow is one of my
> > points: It is a Bad Thing(tm) to use the same name for different
> > variables
>
> Where do I "use the same name for different variables" here ?

It's not confusing in terms of the object model, but in the way you
used meaningless names in an overloaded manner.

> > even in a language like Python that enforce explicit naming
> > of classes
>
> Python doesn't "enforce" explicit naming of classes - IIRC, there are
> ways to instantiate anonymous class objects. But I definitely don't see
> how this relates to this discussion.

We're not talking about anonymous class objects here, and I'm sure you
actually understand what I meant by "naming the class" from previous
discussions.
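For completeness, here is what "instantiating class objects without a class statement" can look like; this is a generic illustration using the built-in `type`, not something from the discussion above, and the name string is arbitrary:

```python
# type() with three arguments builds a class object dynamically;
# the name string is just a label, not a bound identifier
Anon = type('WhateverName', (object,), {'stuff': 42})

obj = Anon()
print(obj.stuff)             # prints 42
print(type(obj).__name__)    # prints WhateverName
```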

> Yes, I know, "please guess what I mean" !-) but sorry, there's no sense
> discussing a technical point without using accurate and appropriate
> technical naming of the technical concepts involved.

Well, it's not my fault that I was born a non-native speaker of
English; sometimes what I mean may drift a bit from what I write.
Anyway, human languages aren't designed to be fully unambiguous, so it
is natural for humans to allow a margin of error when speaking to each
other, well, unless you're not human...

> >>>>> Most other languages
> >>>>> 1) automatically assign the containing class' object
> >>>> s/containing class' object/current instance/
> >>>>> in a keyword
> >>>>> (Java: this, VB: Me) behind the screen,
> >>>> That's not very far from what a Python method object does -
> >>>> automatically assign the current instance to something. The difference
> >>>> is that Python uses functions to implement methods (instead of having
> >>>> two distinct contructs), so the only reliable way to "inject" the
> >>>> reference to the current instance is to pass it as an argument to the
> >>>> function (instead of making it pop from pure air).
> >>> It isn't very far, but Python makes it obvious about the assignment
> >>> (not behind the screen).
> >> Exactly. And given both the simplicity of the solution and what it let
> >> you do, that's a *very* GoodThing(tm) IMHO.
>
> > I agree, it's a Good Thing but it doesn't make the point less pointy,
> > the difference between Me/this and self is just the explicit
> > assignment. Other things that is possible because of the explicit
> > assignment is just a "coincidence" of design choice.
>
> Are you sure ? As far as I'm concerned, I think that the design choice
> somehow results from what it makes possible.

That's correct IF you look at it from the past, when many design
choices were being made for every feature that was required. If you
set the past aside for a moment, it is just a "coincidence" of design
choice.

> >>> And it is always a Bad Thing(tm) to use the same name for two
> >>> variable in the class and in function (which is the main and only
> >>> source of possible ambiguity) in ANY language, even in Python.
> >> Ho, yes.... Like, this would be bad ?
>
> >> class Person(object):
> >>    def __init__(self, firstname, lastname, birthdate, gender):
> >>      self.firstname = firstname
> >>      self.lastname = lastname
> >>      self.birthdate = birthdate
> >>      self.gender = gender
>
> >> C'mon, be serious. It's often hard enough to come with sensible names,
> >> why would one have to find synonyms too ? Try to come with something
> >> more readable than the above, and let us know. Seriously, this braindead
> >> rule about  "not using the same name for an attribute and a local var"
> >> obviously comes from languages where the "this" ref is optional, and
> >> FWIW it's obviously the wrong solution to a real problem (the good
> >> solution being, of course, to use the fully qualified name for
> >> attributes so there's no possible ambiguity).
>
> > The code fragment you've given way above (about the Foo, Bar, bazz,
> > and func) also suffers from the bad habits of using the same name for
> > different variables.
>
> Where ? And how does this answer the question above ?



> > And it's not a "braindead" rule
>
> The way you express it, and as far as I'm concerned, it is, definitely.

Perhaps I forgot to say that "always" might still have exceptions.

> > The example you've given IS the most readable form since the function
> > is _simple_, consider a function that have complex codes, possibly
> > calculations instead of simply assigning initial values I'm sure you'd
> > slip up between the self.* variables and the * variables once or
> > twice,
>
> *you* would perhaps have this problem. And you would indeed have this
> problem in Java or C++. In Python, this problem just doesn't exist.

No, it exists in any language; the way to avoid it is good class
design.
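For what it's worth, a small hypothetical sketch of the slip being discussed: in Python, forgetting the explicit `self.` qualifier doesn't reference the attribute as it would in Java or C++; it silently creates a local variable instead:

```python
class Counter(object):
    def __init__(self):
        self.count = 0

    def broken_increment(self):
        # without the 'self.' qualifier this assignment creates a
        # *local* variable; the attribute is silently left untouched
        count = self.count + 1

    def increment(self):
        self.count += 1

c = Counter()
c.broken_increment()
print(c.count)   # still 0: the local assignment was lost
c.increment()
print(c.count)   # prints 1
```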

> > possibly becoming the source of hard-to-find bugs.
>
> Consistent and intelligible naming is quite another problem. And it's
> too dependent on the context, language, domain and whatnot for any
> rule like yours above to be universally applicable.

It is universally applicable, with some exceptions of course, such as
in a field where there is no agreed naming convention yet.

>
> > And in languages that doesn't enforce explicit naming of classes,
>
> I still don't understand how all this relates to the naming of class
> objects ?
>
> Oops, sorry: you meant "in languages that have an implicit instance
> reference available in methods" ? Python doesn't have it, so any rule
> deriving from this "feature" is out of scope here.
>
> >  when
> > there is the two or more same names, the one with the smallest scope
> > is picked, so in _simple_ functions, the trick of using full qualified
> > names and overloaded local names is still possible and feasible. In
> > complex functions, the trick fails even in Python, because even if
> > Python and our full-concentration-brain is aware of the difference
> > between self.* and *, our spreaded-concentration-brain that is
> > scanning the code for the source of bugs might get stumbled on the
> > confusing use of self.* and *.
>
> Here again, *you* may have this problem. I don't, since I always used
> explicit instance reference.

I don't have that much of a problem; I only pointed out that it is
possible to miss the name overloading when quickly scanning code. The
reason you never stumble is possibly that your brain memorizes the
names without even understanding what a name actually means. That
halves the reason for spending time finding a meaningful name. With
that mindset, it would be possible to write a program entirely with
names like foo and bar and never stumble. In that case it excels, but
it's a poor man's way of trying to be smart.

> >>>> Anyway, I actually know 3 languages (4 if C# works the same) that has
> >>>> this implicit 'this' (or whatever the name) 'feature', and at least 5
> >>>> that don't. So I'm not sure that the "most other languages" qualifier
> >>>> really applies to point 2 !-)
> >>> What's this 5 languages?
> >> Smalltalk, Python, PHP, javascript, Ruby. I don't remember how Scheme,
> >> CLOS and OCaml handle the case.
>
> > Among all them, only Javascript is considerably mainstream.
>
> Is that a joke ? PHP is probably the most used language for web apps
> (server side of course). And Python is certainly no longer an obscure
> geek-only language. But anyway: you were talking about "most other
> languages" - not "most mainstream languages".

Yes, I'm aware that Python and PHP have larger user bases than, say,
Smalltalk and Ruby, but they're still middle-sized. What I mean by
mainstream is large as in large, not medium. PHP's domain has lots of
competition (ASP, JSP, CGI, etc.) and that market is very fragmented;
even though server-side scripting plays an important role on the
Internet, the role of each language is lessened by the fragmentation,
making none of them mainstream enough. Python is overshadowed by Java
in the market for ultra-portable (VM) languages. And I'm not talking
about Smalltalk and Ruby here.

> >>> Are they a mainstream, high-level languages
> >>> or lesser known, low-level languages? C-family, Java, and Basic are
> >>> the Big Three of high-level programming language.
> >> None of C, C++, Java nor Basic qualify as "hi-level". C is the lowest
> >> possible level above assembly, C++ is often refered to as an "object
> >> oriented assembler", Java is way too static, crippled, verbose an
> >> unexpressive to qualify as "hi-level" (even if it suffers from some
> >> problems usually associated with higher level languages). I won't even
> >> comment on basic (is that really a language at all ?).
>
> > Your criteria on being high-level is simply just odd.
>
>
> My criteria on being hi-level seems quite common: automatic memory
> management, portability, "rich" builtin data types, functions as first
> class citizens, lexical closures, strong introspection features, strong
> metaprogramming support, etc...

We're talking language-wise here, not implementation-wise, and those
criteria concern implementations. It's a case of blindfolding
ourselves to "the things behind" and looking at the language only from
the front. Yes, perhaps "the things behind" have some effect on the
front side, but it would be possible to create a C with automatic
memory management with little change to the language itself, and
possible to have more built-in datatypes than the standard set without
major changes to the language itself. So such things are poor criteria
for a language-wise comparison.

> > The rest of the
> > world recognizes C-family, Java, and Basic as high-level languages.
>
> C was "hi-level" wrt/ assembler, indeed. And Java is "hi-level" wrt/
> C++. Ho, and _please_ don't get me started on basic !-)

And the world is content with that. C is probably the lowest shade of
high-level a language could be, but it is still high-level.

> > If I have to say it, Python is actually lower level than Basic.

Language-wise, indeed it is. Implementation-wise, Python might be
higher-level than Basic.

> >  While
> > Java is just below Python and C and C++ is just below Java. Why do I
> > consider Basic the highest-level? Because it is the cleanest to scan
> > (no confusing symbols, i.e. no curly braces, no confusing use of
> > parens (Python uses (), [], and {}, VB only use ()[3]),
>
> Basic != VB. There are quite a few other basics here.

I'm aware of that, but VB is the one with the largest user base, and I
think it can be used to represent the language as a whole. And VB.NET
is possibly the highest-level among the Basics.

> Now if you choose this criteria, you may want to learn some Lisp dialect.

> > In what way C++ resembles an assembler?

> C++ is some OO stuff bolted on a very close-to-the-metal language itself
> designed to write operating systems, drivers and other low-level stuff.

In what way is C++ close to the metal? It's very far, actually; the
C family doesn't have a one-to-one relationship with assembly or plain
executable binaries.

> > Have you ever programmed in
> > assembly?
>
> I did.

And surely you'd realize that the gap between C and assembly is very
wide, perhaps the furthest leap in programming-language level.

> > How hard is it to create a simple program in assembly? And
> > how hard is it to create a complex program in C++

> Roughly the same, thanks to scoping rules, dependencies hell, lack of
> automatic memory management and overcomplex features.

By saying that, you've revealed that you missed my point by far. There
is no point continuing to talk about A when you're talking about B.

> > (which AFAIK is used
> > by hundreds of mega projects including CPython)?

> CPython - as the name implies - is written in C.

And by saying that, I'm sure you'll agree that even C, which is
lower-level than C++, is high-level enough to be used in large
projects like CPython. I'd be interested in seeing Python implemented
in pure assembly; asmPython, perhaps?

> > And have you ever used Basic at all?

> I did. And not only VB.

> > Some programmers would instantly frown upon Basic, simply because they
> > don't know that Basic is "just another language".

> I've used at least four different variants of Basic.

Then with such experience in Basic you should realize that Basic isn't
that far off from other languages.

> >>>>> In VB, Me is extremely rarely used,
> >>>> I used to systematically use it - like I've always systematically used
> >>>> 'this' in C++  and Java.
> >>> And that is what reduces readability. A proficient VB/C/Java
> >>> programmer
> >> There are quite a few proficient C/C++/Java programmers here. As far as
> >> I'm concerned, I would not pretend being one - I just have a good enough
> >> knowledge of C, Java and (alas) VB to be able to get up to speed in a
> >> reasonnable time frame.
> >> As a side note, the problem just doesn't exists in C, which has
> >> absolutely no support for OO.
> > When I said C, it might mean C and C-family,
> When you say "C", it means "C" to everyone reading you.

Depending on the context, it's easy to work out which means what.

> > so please stop
> > misunderstanding me.

> Please learn to express yourself clearly. If you say "X" when you mean
> "Y", *you* are the one responsible for misunderstandings. This is human
> communication 101 skill.

Even computers are designed to tolerate errors in data transfer; are
you saying humans are less intelligent than computers?

> >>> would frown upon the extra, unneeded garbage as they
> >>> thought it was clear already that the variable refers to a class-level
> > >>> variable.
> >> In C++, the canonical way to make this "clear" is to use the m_name
> >> convention. There must be some reason C++ programmers feel a need for
> >> this "extra, unneeded garbage" ?-)
>
> > In some cases, an extremely complex class that can't be fragmented any
> > further, the m_ convention is surely useful, but in most cases you
> > could skip them out.
>
> You "could" skip this convention, but it's really considered bad
> practice by quite a lot of C++ programmers.

Bad, because a lot of them are already accustomed to it and it has
become a de facto rule.

> > And the canonical way to make this "clear" is not
> > the m_ convention, it's the name itself. A well-designed class would
> > choose names that is recognizable instantly from the name itself, even
> > without the pseudo-name appended to it (or prepended).
>
> I await for your exemples.
>
> > btw you must have been memorizing names braindeadly, because the only
> > way you could stumble on that is by memorizing names braindeadly.
> > Names shouldn't be memorized, it should be inferred and memorized. For
> > example, when you met a variable name firstname and lastname inside a
> > class called Person, you'd immediately realize that it is Class Level
> > variable because you know that the function you're currently working
> > on use the name initialfirstname and initiallastname.
>
> Fine, I now have four names to handle, each of them being possibly an
> argument, a local variable, a member variable or a global variable. Great.
>
> Sorry but I won't buy this.

THAT'S the point. You are unable to infer that initialfirstname and
initiallastname are local variables, even though it is quite
impossible for something "initial" to be a member variable or a
global. If it were a global, it shouldn't be named initial (perhaps
default); if it were a member variable, it should be plain vanilla
firstname and lastname. This is called inferring scope from names.

> >>> As I've pointed out, there is little harm in class-level variable's
> >>> implicit reference.
> >> Have some working experience on any non-trivial C++ project ?
>
> > No
>
> I would have been surprised if you had answer otherwise.
>
> > (you could say I'm a student so I've never "worked"[1]). But I've
> > done some medium-sized projects in other languages.
>
> > [1] If you understand the irony, you'd realized I was deliberately
> > misunderstanding you
>
> Not sure.

Don't cut my words apart; they're meant to be read together, or they
lose their sense of humor.

>
> >>>>> Compare the following codes:
> >>>>> VB.NET:
> >>>>> Public Class A
> >>>>>     Dim var
> >>>>>     Public Function aFunction()
> >>>>>         return var
> >>>> Add three levels of inheritence and a couple globals and you'll find out
> >>>> that readability count !-)
> >>> It's the mental model that have to be adapted here, if the current
> >>> class is inheriting from another class, you've got to think it as
> >>> names from parent class as it is a native names, so you don't actually
> >>> need to know where the variable comes from
> >> In C++ (and VB IIRC), it might as well be a global So sorry but yes, I
> >> have to know where it comes from.
>
> > How many times would you use globals, it is a Bad Thing(tm) to use
> > globals in the first case.
>
> It is, most of the time, indeed.
>
> The problem is that you rarely start a project from scratch - most of
> the time, you have to work on legacy code. And you really seldom have
> the possibility to do a major rewrite to fix all warts.

But it is always possible to reduce the most critical of them.

> > In some exceptional cases globals might be
> > unavoidable, but it is trivial to work out that you have to reduce the
> > amount of globals to a minimum, in almost any cases to a number you
> > can use a hand to count with.
>
> That very nice, from a theoretical POV. That's alas just not how it
> works in real life.

A bit utopian, I agree, but it's always possible.

> > And applying the hacks mentioned, why
> > don't you use the m_ convention for globals, and retains the
> > convenience of m_-free variables in your class variable. You use class
> > variable much more often than globals, and in most projects class-
> > level variable is used just as often as local-variable.
>
> The problem is not what *I* (would) do, but how is the existing code.
>
> >>> since knowing where it
> >>> comes from is breaking the encapsulation
> >> Nope, it's knowing what you're doing and how the piece of software at
> >> hand is working. And FWIW, while data hiding is one possible mean of
> >> encapsulation, it's by no way a synonym for encapsulation.
>
> > I agree that knowing an underlying class's implementation is useful
> > (in fact, very useful) but what I'm talking is about should-ness,
> > we
> > shouldn't _need_ to know the underlying implementation,
>
> How can you hope to extend a class without _any_ knowledge of it's
> implementation ?

Knowing its interface is usually enough to extend a class.
(USUALLY!)

> >>> (which, in Python is very
> >>> weakly implemented, which favors flexibility in many cases[1]).
> >>> [1] In Python, it is impossible to create a completely private
> >>> variable, which is the reason why the mental model of these other
> >>> languages doesn't fit Python.
> >> Never heard about the infamous '#define private public' hack in C++ ?
> >> And don't worry, there are also ways to get at so called 'private' vars
> >> in Java.
>
> > No, but it's violating the language's rule.
>
> Nope. It's just making use of some part of the language's rules.

No, it's a hack that should be illegal, a kind of abuse of the rules.

> > Python OTOH, provides
> > formal ways to got to private vars.
>
> Python doesn't have "private vars" at all.

THAT'S the point. Even __vars and _vars, which are supposed to be
(semantically) Python's private variables, are accessible from outside
while still using a formal way of accessing them (instead of a hack).
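A short illustration of that point: single-underscore names are private only by convention, and double-underscore names are merely mangled by the compiler, so both remain reachable through documented behaviour rather than a hack (the class and attribute names below are made up for the example):

```python
class Secret(object):
    def __init__(self):
        self._hint = "single underscore: convention only"
        self.__token = "double underscore: name-mangled"

s = Secret()
print(s._hint)               # accessible, merely flagged as internal

# outside a class body there is no name mangling, so the
# plain name fails...
try:
    s.__token
except AttributeError:
    print("no plain '__token' attribute")

# ...but inside the class the compiler rewrote it to
# '_Secret__token', and that name is perfectly accessible:
print(s._Secret__token)      # prints the mangled attribute's value
```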

> > It's easy to keep track of globals, as you shouldn't have a
> > lot of them even in a huge project.
>
> Can't you understand that starting a new project afresh is *not* the
> common case ?

But I'm sure even old, dirty code wouldn't usually have as many
globals as a twenty-page listing of them. If it does, then just quit
that job.

> >>> As a final note:
> >>> I don't think implicit class reference is superior to explicit class
> >>> reference, neither
> >> ...
>
> > I'm sure you don't believe it since I'm talking on implicit's side,
> > but that's the fact, I just pointed you out that implicits do have its
> > positive side (even if you don't consider them positive in _your_
> > book) but that doesn't meant I believe it is better than the other.
>
> > To clear things up:
> > As a final note:
> > I don't think implicit class reference is superior to explicit class
> > reference, but I don't think the vice versa is true either.
>
> Once again : what classes have to do with this ?
>
> Seriously, how can you hope to be taken seriously when you're obviously
> confusing such important concepts as instance and class ?

I'm struggling with English as it is, so please don't make me even
more confused by forcing me to choose the most precise terms when it
is easy to work out what I meant.

This is going to be my last two cents on this topic. I'm tired of this
pointless discussion. We can both like a language even though we have
fundamental differences in how we see it and how we think about it.


