
At 08:45 AM 1/20/2009 -0800, kirby urner wrote:
On Tue, Jan 20, 2009 at 1:52 AM, David MacQuigg wrote:
There may be some misunderstanding as to my purpose in writing this chapter. ...
On a scale of 1-10, overriding __del__ in the process of doing subclass instance counting in a parent class, is like 8 or 9, especially where you bring __mro__ into it. So yes, I see your desire to include arcana. You're being respectful of your students' already high level of initiation.
All this is in the "Advanced Topics" section at the end. I probably should make that a separate chapter to avoid any distraction. The part I expect students at the CS2 level to understand is in the first 12 pages, and that should be free of arcana. ...
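For readers keeping score, the arcana in question might look something like the following sketch. This is my guess at the technique, not David's actual chapter code: a parent class counts live instances per subclass, with __del__ decrementing the count (the prompt decrement relies on CPython's reference counting).

```python
# Hypothetical sketch (not the chapter's actual code): per-subclass
# instance counting in a parent class, decremented in __del__.
# The prompt decrement on del relies on CPython's reference counting.
class Counted:
    def __init__(self):
        cls = type(self)                       # the concrete subclass
        cls.count = getattr(cls, 'count', 0) + 1

    def __del__(self):
        type(self).count -= 1

class Widget(Counted):
    pass

w1, w2 = Widget(), Widget()
print(Widget.count)   # 2
del w1
print(Widget.count)   # 1 (under CPython)
```

Overriding __del__ this way is exactly the kind of thing that belongs in an "Advanced Topics" appendix: finalizer timing is implementation-dependent, which is part of what makes it an 8 or 9 on the arcana scale.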
But in Python, everything is a snake, a live object, including numbers like 1 and 2, and they have knowledge of addition and multiplication "in their bones" (__ribs__). When you go 1 * 2, that's not a space ship, bristling with static methods from on high, from the alien "knows operations" land, that's invoking an innate ability of the integer type.
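The "in their bones" point is easy to demo at the interpreter: the * in 1 * 2 dispatches to a method the int type itself carries.

```python
# Integers carry their own arithmetic: 1 * 2 dispatches to int.__mul__,
# an innate ability of the type, not an external static routine.
print((1).__mul__(2))           # 2, same result as 1 * 2
print(hasattr(int, '__mul__'))  # True: the method lives on the type itself
```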
It's a change in gestalt, especially if you start with the old one.
...
As for making OOP a "revelation", I think that is the wrong way to teach it, especially for non-CS students. I much prefer to treat it as just another way to encapsulate pieces of a program, a way that is very natural when you are modeling objects in the real world, and can be beneficial even when there is no parallel in the real world - the benefit flowing from the extra flexibility you have in encapsulation. That is something all engineers will understand. The "revelation" approach completely turned me off when I first read about OOP in the early '90s. As a skeptical engineer, I was left with the impression that OOP, polymorphism, etc., were a pile of hype, and that there was nothing there I couldn't do in FORTRAN, BASIC, or C. It wasn't until I started using Python in 2002 that I saw the benefits. I like that Python doesn't mystify OOP, or push it the way Java does, but just makes it a natural thing to do.
Yes, that transition to OOP was upsetting as a paradigm change. I went through it on the xBase path, following dBase II (derived from some JPL language) through its winged migration, fork off to Clipper, FoxPro, buyout by Microsoft and Borland, FoxPro for Windows, Visual FoxPro. The full OO paradigm came across, unlike for the VB crowd, who didn't get that until later. This all happened in my little world *before* I ever wrote a line of Python (plus I got to Java first). APL was the first computer language I ever learned (loved it), then FORTRAN (yawn).
However, the real benefit of OO is that it hooks into natural language, where students of human languages, like Sumerian, like Greek, already know about nouns, verbs and adjectives as parts of speech. OO is all about noun.verb() and noun.adjective. Then we have these little cartoons about sharing pointers (names) to objects and we're done, they're convinced there's a real grammar here, a logic, and dive in, happy to still be in the humanities, no talk of "computer hardware" need apply (though it's handy to have). This isn't "hyping" so much as "bridging". In industry, it's about town-gown relations, where "town" is an administrative layer atop cube farmers, and "gown" is IT. With ORM, we're finally starting to get admin enrolled in seeing it from IT's point of view, whereas SQL by itself was just not friendly enough.
OO makes computing more personable. To have busy administrators see 'self' as like "a little ego with a personal __dict__ clipboard" is not "dumbing it down" but rather taking advantage of well-worn patterns in grammar, leveraging what we've already got. It's efficient. Don't just say "is a", say "I am a". This isn't just cute, it's a conceptually smart way to think about a hospital, school or airport (objects all). But that's me to busy administrators, not you to enrolled electrical engineers. Different audiences = different patter, we both know that.
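A minimal sketch of that picture, with made-up names of my own: each instance is a little "self" carrying its own __dict__ clipboard, and the grammar is noun.verb().

```python
# Sketch (hypothetical class and names): the "little ego with a personal
# __dict__ clipboard" picture, plus the noun.verb() grammar.
class Hospital:
    def __init__(self, name):
        self.name = name           # a note written on this self's clipboard
    def admit(self, patient):      # a verb this noun knows how to do
        return "%s admits %s" % (self.name, patient)

h = Hospital("General")
print(h.__dict__)                  # {'name': 'General'}
print(h.admit("Ada"))              # General admits Ada
```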
I think many of us who grew up with the "old gestalt" had a hard time with the transition to OOP. Maybe the old gestalt still includes administrators, but I would be surprised if it includes high-school students. It did include engineering students a few years ago, when we had a junior-level course in OOP taught in Smalltalk, so the students would be forced to break out of their old habits and do everything in the style of OOP. We still have that course, but now the pre-requisite is a course in Java, and it has morphed to more software-engineering and less pounding on the necessity of OOP.

It seems that the "new gestalt" should not require breaking old habits. Objects should be introduced early, the noun.verb syntax is simple, and there is little need for motivation, revelation, or elaborate analogies by the time the students get to crafting their own objects. That seems to be the approach taken by newer texts, and it would have been beneficial for me also. It really wasn't the difficulty of the OOP paradigm that held us back, but the awkwardness of the languages, which perhaps created a need to force-feed and over-sell the idea of OOP, which turned off pragmatic folks like myself. We still have that awkwardness in our intro Java courses, but it seems to have been accepted as a minor inconvenience ("Just memorize this stuff, we'll explain it later.").

I put together a sequence of Python examples for the transition to OOP, and added them to my collection for CS2 ( http://ece.arizona.edu/~edatools/index_classes.htm ). The emphasis is practical benefits, not any attempt to sell OOP. The principle is that you choose the level of structure that is right for your problem, bearing in mind that real-world problems tend to grow, so a little extra structure at the beginning often saves time in the end. The final example shows how Python can do everything that Java does, including private parts, and interface definitions.
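The last point, that Python can do Java-style private parts and interface definitions, might be sketched as follows; this is my illustration, not the example from the CS2 collection, assuming name mangling for "private" and abstract base classes for "interface":

```python
# Hedged sketch of the Java comparison: double-underscore names give
# "private parts" via name mangling (a convention, not enforcement),
# and an abstract base class stands in for an interface definition.
from abc import ABC, abstractmethod

class Meter(ABC):                  # plays the role of a Java interface
    @abstractmethod
    def read(self):
        ...

class Voltmeter(Meter):
    def __init__(self):
        self.__reading = 0.0       # mangled to _Voltmeter__reading
    def read(self):
        return self.__reading

v = Voltmeter()
print(v.read())                               # 0.0
print('_Voltmeter__reading' in v.__dict__)    # True: the mangled name
```

Instantiating Meter directly raises TypeError, which is about as close to "implements" as Python cares to get.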
Thank you for making your work public and open source in this way. You are kind to offer it up for comments and to handle the feedback graciously, especially given my many different biases.
And thank you for your constructive criticism, which is always appreciated. -- Dave

I think many of us who grew up with the "old gestalt" had a hard time with the transition to OOP. Maybe the old gestalt still includes administrators, but I would be surprised if it includes high-school students. It did include engineering students a few years ago, when we had a junior-level course in OOP taught in Smalltalk, so the students would be forced to break out of their old habits and do everything in the style of OOP. We still have that course, but now the pre-requisite is a course in Java, and it has morphed to more software-engineering and less pounding on the necessity of OOP.
Yes, reminds me of the New Math of the 1960s and the subsequent backlash, teachers bitter about having their authority undermined by all this newfangled set notation, unions and intersections of this and that, a lotta b.s. some of 'em thought, made 'em hate Sputnik, all that bleeping noise. But then a lot from that era did stick around, a lot of reforms in early math curriculum writing (e.g. Dolciani). Newer textbooks helped glue together our current algebra-through-calculus pipeline, which colleges still treat as a superhighway through admissions, aka ye old "Calculus Mountain" of Pycon 2008.

Besides just OO, lambda calculus made the leap to machine executable, with its functional concepts ("big lambda"). Then we've got Haskell and like that, so it's not like OO is all alone in this brave new world of executable logics, anticipated by Leibniz, Ada, and Dr. Vannevar Bush, the latter writing about hypertext and search engines in the 1940s if not in those terms.

Left behind on the far side of the digital divide were the calculator crowd with their "white board" logics -- stuff that only "works" on paper. Mathematica rescued a lot of that stuff, forcing long overdue house-cleaning. In the old typography, f(x) is quite ambiguous, as it might as well mean f * x or f eats x. In Python, those aren't so different, in that __mul__ eats a name as its argument. a * b and a(b) are really quite similar, in that () triggers __call__, just another __rib__, like __mul__.

We think different now, thanks not just to Smalltalk, but to Apple and NeXT, all helping to make OO real in the workplace. High schools languish because there's this "you can't make me" attitude on the part of professionals, who perceive evolutionary pressures as "other people telling me what to do". Pride gets in the way. Not saying this is a unique case. We all tend to get stuck in the mud of our own generation. That's why we've each got this TTL I suppose, like in TCP/IP -- a sensible design.
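The a * b versus a(b) parallel can be made concrete with a toy class of my own devising: both spellings just trigger special methods, __mul__ for * and __call__ for ().

```python
# Toy sketch: * triggers __mul__ and () triggers __call__, both just
# "ribs" (special methods) of whatever type defines them.
class Scale:
    def __init__(self, factor):
        self.factor = factor
    def __mul__(self, x):
        return self.factor * x
    def __call__(self, x):
        return self.factor * x

f = Scale(3)
print(f * 4)   # 12, dispatched to Scale.__mul__
print(f(4))    # 12, dispatched to Scale.__call__
```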
As a result of this languishing, many future engineers must do their serious studying after hours, in cyberspace, where we yak about SQL, RSA, Unicode, Ruby... -- all "the good stuff" they're likely not getting from Portland Public (yet).
It seems that the "new gestalt" should not require breaking old habits. Objects should be introduced early, the noun.verb syntax is simple, and there is little need for motivation, revelation, or elaborate analogies by the time the students get to crafting their own objects. That seems to be the approach taken by newer texts, and it would have been beneficial for me also. It really wasn't the difficulty of the OOP paradigm that held us back, but the awkwardness of the languages, which perhaps created a need to force-feed and over-sell the idea of OOP, which turned off pragmatic folks like myself. We still have that awkwardness in our intro Java courses, but it seems to have been accepted as a minor inconvenience ("Just memorize this stuff, we'll explain it later.").
An early algebra class, lesson plan: Ms. Jones shows her students how to use their totatives function to populate a list with Modulo Numbers. Their "closure checker", written as a Python generator, inspires discussion about groups. Use a prime modulus next time, talk about fields.

Source code (algebra.py):

from random import choice

def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

def totatives(n):
    return [x for x in range(1, n) if gcd(x, n) == 1]

class Modulo:

    m = 12

    def __init__(self, n):
        self.n = n % Modulo.m

    def __mul__(self, other):
        return Modulo(self.n * other.n)

    def __eq__(self, other):
        return self.n == other.n

    def __repr__(self):
        return str(self.n)

def closure_checker(pool):
    while True:
        a = choice(pool)
        b = choice(pool)
        result = a * b
        assert result in pool
        yield "%s * %s == %s" % (a, b, result)
>>> from algebra import *
>>> p = totatives(12)
>>> thegroup = [Modulo(x) for x in p]
>>> thegroup
[1, 5, 7, 11]
>>> g = closure_checker(thegroup)
>>> next(g)
'5 * 7 == 11'
>>> next(g)
'5 * 7 == 11'
>>> next(g)
'11 * 11 == 1'
>>> next(g)
'5 * 5 == 1'
>>> next(g)
'7 * 7 == 1'
>>> next(g)
'11 * 7 == 5'
The Modulo class is somewhat subtle in that the n values in each instance's __dict__ are plain integers. So while __mul__ intercepts the * operator, it doesn't do so recursively: the body computes Modulo(self.n * other.n), an int * int product passed as the argument, not Modulo(self * other), which would go into a recursive tailspin. We'll use these same Modulo Numbers to talk about their powers (the ** operator, i.e. __pow__) and to assert Euler's theorem.
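One plausible sketch of that follow-up lesson, with details guessed rather than quoted: give Modulo a __pow__ and check Euler's theorem, that a ** phi(m) == 1 for every totative a of m (here phi(12) == 4).

```python
# Hedged sketch of the promised follow-up: a __pow__ for Modulo Numbers,
# used to assert Euler's theorem (a ** phi(m) == 1 when gcd(a, m) == 1).
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

def totatives(n):
    return [x for x in range(1, n) if gcd(x, n) == 1]

class Modulo:
    m = 12
    def __init__(self, n):
        self.n = n % Modulo.m
    def __pow__(self, exponent):
        # three-argument pow does the modular exponentiation
        return Modulo(pow(self.n, exponent, Modulo.m))
    def __eq__(self, other):
        return self.n == other.n
    def __repr__(self):
        return str(self.n)

phi = len(totatives(Modulo.m))             # phi(12) == 4
for a in totatives(Modulo.m):
    assert Modulo(a) ** phi == Modulo(1)   # Euler's theorem checks out
print(phi)   # 4
```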
Thank you for making your work public and open source in this way. You are kind to offer it up for comments and to handle the feedback graciously, especially given my many different biases.
And thank you for your constructive criticism, which is always appreciated.
-- Dave
Let's hope you continue getting a lot of useful feedback from a variety of viewpoints. Kirby
participants (2): David MacQuigg, kirby urner