Hi Kirby,

Many thanks for the thoughtful review of my PythonOOP chapter. I especially like the suggestions to put more emphasis on the role of over-rides in *specializing* the behavior of subclasses, and to at least mention commonly-used CS terms like "polymorphism". I'll keep your suggestions for the next time I do any revisions on this chapter. If you or anyone else wants to make more extensive modifications, target a different audience, etc., I'll be glad to work with you. No reason we can't have multiple versions targeted to different groups.

There may be some misunderstanding as to my purpose in writing this chapter. It is not a CS1 introduction to OOP. I would recommend Zelle or Goldwasser for that. It was written before those texts, and specifically for engineers who already know some programming, understand very well the need for modular design (no need for motivation of OOP), and who have learned Python up through chapter 18 of Lutz and Ascher (no need to explain __notation__). It could be an alternative, or a supplement, to the OOP chapters in L&A. It could also be part of the CS2 class I am now proposing, for students who already know Java or C.

Some of my choices on what topics to emphasize may be different. These choices were based on my own experience, which may be different than others, and is certainly different from those who have a less pragmatic approach to languages. Thus, I have relegated operator over-riding to "advanced topics" because of my perception that it will be used less often than say, static methods, which I have included in my first large example. I use static methods frequently. They are now (as of version 2.4) more "pythonic" than when this chapter was written, so I feel I made the right choice.

If my Animals class hierarchy seems artificial, it may be because I tried to illustrate everything I considered important, and do it in the smallest example that would still be clear. The comparison of classes to modules is not motivated by any affinity to Perl. What little I know of Perl, I don't like. Rather, it is part of an answer to the question "Why do we need classes?" Modules are the closest thing to classes as a unit of encapsulation. With a simple function, you could generate instances of modules as easily as instantiating objects from classes. You could even change the language slightly, and allow multiple modules per file.

As for making OOP a "revelation", I think that is the wrong way to teach it, especially for non-CS students. I much prefer to treat it as just another way to encapsulate pieces of a program, a way that is very natural when you are modeling objects in the real world, and can be beneficial even when there is no parallel in the real world - the benefit flowing from the extra flexibility you have in encapsulation. That is something all engineers will understand. The "revelation" approach completely turned me off when I first read about OOP in the early 90's. As a skeptical engineer, I was left with the impression that OOP, polymorphism, etc., was a pile of hype, and there was nothing there I couldn't do in FORTRAN, BASIC, or C. It wasn't until I started using Python in 2002, that I saw the benefits. I like that Python doesn't mystify OOP, or push it the way Java does, but just makes it a natural thing to do.

Again, if anyone wants to modify this document for their own purposes, I'll be glad to help.

-- Dave

At 08:21 PM 1/17/2009 -0800, kirby urner wrote:
Hi David --
I've been looking at your PythonOOP.
Why use classes? All programming aside, I think it's a fairly strong grammatical model of how people think, basically in terms of noun.adjective (data attribute) and noun.verb() (callable method). All talk of computer languages aside, we're very noun-oriented, think in terms of "things" and these things either "are" or "have" attributes and "do" verby stuff.
OOP is about making the language talk about the problem domain, help us forget that under the hood nightmare of chips and registers, needing to allocate memory... all that stupid computer stuff that nobody cares about (smile).
Of course I like your nomenclature of a Cat inheriting from Animal as I'm always tying "hierarchy" back to "zoology" and "taxonomy" as those were original liberal arts tree structures (class hierarchies, kingdoms, domains, namespaces). We like "astral bodies" subclassed into stars, planets, planetoids (like Pluto), and moons. We like "Reptiles" including "Snakes" which of course includes "Pythons" (everything is a python in Python, i.e. an object with a __rib__ cage).
Having a Dog and a Monkey motivates why ancestor classes are important: generic shared stuff, like digestion, might be handled in a shared eat() method, each with its own self.stomach of course. With a younger set (pre-college), those parentheses connote lips, with args as oral intake. In the class of classes though... we give birth.
Per a recent PPUG, these days I'm thinking of using a collections.deque for my digestive tract maybe:
from collections import deque

class Animal:
    def __init__(self):
        self.stomach = deque()

    def eat(self, item):
        self.stomach.append(item)

    def poop(self):
        if len(self.stomach) > 0:
            return self.stomach.popleft()
>>> zebra = Animal()
>>> zebra.eat('straw')
>>> zebra.stomach
deque(['straw'])
>>> zebra.poop()
'straw'
>>> zebra.stomach
deque([])
Some will want to say digestive_tract instead of stomach.
Now you can develop your Dog and Cat, inheriting eat() and __init__ from Animal, yet each subclass providing its own __repr__ method. "Hello world, I'm a Cat at %s" % id(self). Yes, you could make the __repr__ invoke __class__.__name__ and share it -- show that later.... ?
from collections import deque

class Animal:
    def __init__(self):
        self.stomach = deque()

    def eat(self, item):
        self.stomach.append(item)

    def poop(self):
        if len(self.stomach) > 0:
            return self.stomach.popleft()

    def __repr__(self):
        return "I'm a %s at %s" % (self.__class__.__name__, id(self))
class Dog(Animal): pass
class Cat(Animal): pass
>>> thing1 = Dog()
>>> thing2 = Cat()
>>> thing1
I'm a Dog at 138223820
>>> thing2
I'm a Cat at 138223852
Students see how inheritance means specializing as you go down the tree (consistent with most introductory treatments).
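Off the top of my head (my sketch, not from your chapter), the kind of minimal override example I'd reach for, with talk() specialized in each subclass:

class Animal:
    def talk(self):
        return "generic animal noise"

class Dog(Animal):
    def talk(self):                 # override = specialize the inherited behavior
        return "Woof!"

class Cat(Animal):
    def talk(self):
        return "Meow!"

>>> [critter.talk() for critter in (Animal(), Dog(), Cat())]
['generic animal noise', 'Woof!', 'Meow!']

That last line is also "polymorphism" in one breath: same talk() message, different behavior depending on the subclass.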
Your spin seems tilted towards Cats, with not enough other subclasses of Animal to make inheritance seem all that necessary. Your ancestor is mostly for "herding cats" (counting instances), and isn't so much a "blueprint" for different subclasses of the Animal idea (aardvark versus eel).
More generally, I think using an ancestor class to do instance counting is a little on the "too difficult" side for a core introductory running example. It leads you to introduce an interception of __del__ as your first example of operator over-riding. This seems rather unpythonic, as the __del__ method is rarely intercepted, compared to say __set__ and __get__ (per Alex Martelli in this lecture @ Google: http://controlroom.blogspot.com/2009/01/oop-in-hour.html ).
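For the record, the pattern I'm reacting to is roughly this (my reconstruction from memory, not your actual code):

class Animal(object):
    count = 0                        # class-level tally of live instances

    def __init__(self):
        Animal.count += 1

    def __del__(self):               # decremented when an instance goes away...
        Animal.count -= 1            # ...whenever the garbage collector gets to it

>>> a, b = Animal(), Animal()
>>> Animal.count
2
>>> del a                            # CPython's refcounting makes this look prompt;
>>> Animal.count                     # the language doesn't actually guarantee it
1

It works, but students may come away thinking __del__ is bread-and-butter Python when it's really a corner case.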
I was also struck by how often you compare classes to modules (modules in Chapter 16 right?), suggesting a Perlish upbringing -- as did your suggestion to use '_' in place of 'self' for "maximal uncluttering" (paraphrase).
When Perl decided to make the leap to OO, it was "the module" that got blessed for this purpose yes? Are you a closet Perl Monger then?
I notice you tend to favor your private terminology and don't resort to the CS vocabulary as often, e.g. "polymorphic" and "polymorphism" are not included. If you're wanting your students to gain fluency with the surrounding literature, I think at least that one deserves more focus, in conjunction with inheritance.
I think a first class could be all data (like a C struct). Like, this example says a lot:
class Foo: bar = 1
>>> f = Foo()
>>> g = Foo()
>>> f.bar = 1
>>> f.bar = 2
>>> g.bar
1
>>> f.bar
2
>>> f.__dict__
{'bar': 2}
>>> g.__dict__
{}
>>> g.bar
1
However, once you start adding methods, I don't think postponing the appearance of __init__ for so long is a good idea. I would focus on a Dog and a Cat (or any other two animals), both with __init__ constructors and eat methods, then (as a next step) introduce the Animal superclass as a case of "refactoring" by inheritance -- a good time to mention the "is a" abstraction (a Dog "is an" Animal). Show how you're economizing on code by having an ancestor, make OO seem like a revelation (which it is/was).
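Roughly what I have in mind, as a before-and-after sketch (my names, not yours):

# Step 1: two standalone classes, with obvious duplication
class Dog:
    def __init__(self):
        self.stomach = []
    def eat(self, item):
        self.stomach.append(item)

class Cat:
    def __init__(self):
        self.stomach = []
    def eat(self, item):
        self.stomach.append(item)

# Step 2: refactor -- the shared code migrates up into an ancestor
class Animal:
    def __init__(self):
        self.stomach = []
    def eat(self, item):
        self.stomach.append(item)

class Dog(Animal):      # a Dog "is an" Animal
    pass

class Cat(Animal):      # a Cat "is an" Animal
    pass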
I don't think you're alone in finding __ribs__ somewhat intimidating, strange-looking (a hallmark of Python), but I think these should be tackled right away, perhaps with built-in data types first, i.e. that 2 .__add__(2) example. How about a function called ribs() that lists the special names in any object...
import re

ex = re.compile("__[a-z]+__")

def ribs(it):
    return re.findall(ex, " ".join(dir(it)))
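Then kick the tires (the exact inventory of __ribs__ varies by Python version, so I'll just test membership):

>>> '__add__' in ribs(2)             # even the integer 2 has an __add__ rib
True
>>> '__name__' in ribs(ribs)         # the function has ribs of its own
True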
I think of __init__ as being triggered by a "birth syntax" i.e. to call a class directly, as in Animal(), is to call for an instance of that class, and that invokes __init__ (__new__ first though).
class Foo:
    def __init__(self):
        print("I am born!")

    @staticmethod
    def __call__():
        print("What?")

>>> f = Foo()
I am born!
>>> f()
What?
>>> Foo.__call__()
What?
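And to show the "__new__ first though" part, another toy (new-style class assumed, names mine):

class Spam(object):
    def __new__(cls):
        print("__new__: the instance gets allocated")
        return object.__new__(cls)

    def __init__(self):
        print("__init__: the instance gets initialized")

>>> s = Spam()
__new__: the instance gets allocated
__init__: the instance gets initialized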
I like your point that intercepting __add__ is just like "intercepting" an Animal's 'talk' method in a subclass, i.e. interception is happening in both cases. It's just that some names are special (as you say) in being triggered by *syntax* rather than by their actual names -- although that form is also usable (as you point out using __add__) -- plus some special names *don't* have special triggering syntax, i.e. you just use the __rib__ directly, as in "if __name__ == '__main__':"
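A quick sketch of both faces of that, using a made-up Money class:

class Money:
    def __init__(self, cents):
        self.cents = cents
    def __add__(self, other):            # intercepted like any method override...
        return Money(self.cents + other.cents)

>>> a, b = Money(50), Money(25)
>>> (a + b).cents                        # ...but + is what normally triggers it
75
>>> a.__add__(b).cents                   # the __rib__ is still callable by name
75
>>> __name__                             # and some __ribs__ have no trigger syntax at all
'__main__'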
I like this section:
Background on Languages
- languages poster
- progression of languages 1 2
  - domain-specific (FORTRAN, COBOL)
  - general-purpose, high-level (C)
    - one step above assembler
    - problem was complexity, one big pile of data
  - object-oriented (C++)
    - data now contained within independent objects
  - human oriented (Java, Python)
    - garbage collection
    - dynamic typing
... and your focus on Mandelbrot / fractals.
Lots of substance.
Kirby
On Thu, Jan 15, 2009 at 3:09 PM, David MacQuigg <macquigg@ece.arizona.edu> wrote:
I'm putting together a list of topics for a proposed course entitled "Programming for Scientists and Engineers". See the link to CS2 under http://ece.arizona.edu/~edatools/index_classes.htm. This is intended as a follow-on to an introductory course in either Java or C, so the students will have some exposure to programming, but the Java students won't know machine-level programming, and the C students won't have any OOP. For the proposed course, we will use the example programs as a way to introduce these topics.
As you can see from the very few links to completed examples, this is just a start. Many of the links are only to discussions on this list, and I really appreciate the suggestions I have received so far. Also, it is far from a representative survey or optimal sample -- just a random selection of topics that I found interesting. Some of the topics may be out of reach for students at this level, but that is the challenge: figure out a way to present a complex or theoretically difficult topic in a way that is simple and intuitive and will whet the students' appetite for further study.
Additional suggestions are welcome.
-- Dave
On Tue, Jan 20, 2009 at 1:52 AM, David MacQuigg <macquigg@ece.arizona.edu> wrote:

<<>>

[ watching Obama motorcade on CRT to my left, typing to LCD on my laptop ]
There may be some misunderstanding as to my purpose in writing this chapter. It is not a CS1 introduction to OOP. I would recommend Zelle or Goldwasser for that. It was written before those texts, and specifically for engineers who already know some programming, understand very well the need for modular design (no need for motivation of OOP), and who have learned Python up through chapter 18 of Lutz and Ascher (no need to explain __notation__). It could be an alternative, or a supplement, to the OOP chapters in L&A. It could also be part of the CS2 class I am now proposing, for students who already know Java or C.
Probably I was confusing your "module" and "modular" (cite Perl below) in that you relate Objects to both, which of course makes sense, in terms of encapsulated name spaces. Classes, unlike mere modules, have progeny, ancestors, spawn multiple instances of themselves without duplicating more bits than necessary. You're clear about this distinction, which I liked.

Your role as the teacher most specializing (customizing) to these students is a good example of how curriculum writing itself has a modular aspect, and a subclassing (tiering) from more to less generalized. You've got a very specific mix in mind, a certain type of sophistication. I likewise have my different biases.

On a scale of 1-10, overriding __del__ in the process of doing subclass instance counting in a parent class is like an 8 or 9, especially where you bring __mro__ into it. So yes, I see your desire to include arcana. You're being respectful of your students' already high level of initiation.
Some of my choices on what topics to emphasize may be different. These choices were based on my own experience, which may be different than others, and is certainly different from those who have a less pragmatic approach to languages. Thus, I have relegated operator over-riding to "advanced topics" because of my perception that it will be used less often than say, static methods, which I have included in my first large example. I use static methods frequently. They are now (as of version 2.4) more "pythonic" than when this chapter was written, so I feel I made the right choice.
I'm inclined to see + and * in a continuum with [] and (), in that the former trigger __add__ and __mul__ whereas the latter trigger __getitem__ and __call__ (or __init__).

In other words, "operator over-riding" tends to blend in with all manner of syntax-invoked method triggering. The fact that + and * and such used to be segregated as "immutable operators" gets forgotten in the rough and tumble.

We don't keep quite the same unary / binary fixation either, in that a.__add__(b) is an act of consumption (a eats b, maybe returns c of some type), just as f(1) is an act of consumption (swallowing an argument). a * b is a "munch" operation, in that __mul__ (or __rmul__) is getting invoked.

In the old days, I tell my students, numbers were stupid, they couldn't do anything, didn't know anything. Operators descended from heaven, like space ships, and caused numbers to be added and multiplied.

But in Python, everything is a snake, a live object, including numbers like 1 and 2, and they have knowledge of addition and multiplication "in their bones" (__ribs__). When you go 1 * 2, that's not a space ship, bristling with static methods from on high, from the alien "knows operations" land, that's invoking an innate ability of the integer type.

It's a change in gestalt, especially if you start with the old one.
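To make that continuum visible in one throwaway class (every name below is mine, nothing from your chapter):

class Critter:
    def __add__(self, other):            # + syntax lands here
        return "swallowed via +"
    def __mul__(self, other):            # * syntax lands here
        return "munched via *"
    def __getitem__(self, key):          # [] syntax lands here
        return "indexed via [%r]" % key
    def __call__(self, arg):             # () syntax lands here
        return "called via (%r)" % arg

>>> c = Critter()
>>> c + c
'swallowed via +'
>>> c * c
'munched via *'
>>> c['tail']
"indexed via ['tail']"
>>> c(42)
'called via (42)'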
If my Animals class hierarchy seems artificial, it may be because I tried to illustrate everything I considered important, and do it in the smallest example that would still be clear. The comparison of classes to modules is not motivated by any affinity to Perl. What little I know of Perl, I don't like. Rather, it is part of an answer to the question "Why do we need classes?" Modules are the closest thing to classes as a unit of encapsulation. With a simple function, you could generate instances of modules as easily as instantiating objects from classes. You could even change the language slightly, and allow multiple modules per file.
I'm biased against modules because I start at the shell and don't care about saving any .py files for a while, have no interest in anything but the current namespace, like a white board. My simplest unit is the primitive object, like 1 or 'a', an integer or string. We do a dir() on both of these, show they "have guts" i.e. are endowed with methods. So a number like 2 is a primitive name space, in that it contains (encapsulates) a lot of behaviors.

Next up, data structures or collections, things like iterables, like lists, tuples. Each supports dot notation to methods, has an interior. With functions, you the user start getting more mutable stuff to play with. You can set attributes on function objects (a type of callable). So by the time we get to classes (types), we already have plenty of micro-ecologies as examples, all without ever saving a module if we like.

The .py format is for persistence, but it's like entering all those lines in the shell really fast, even grabbing your own command line arguments via sys.argv. Modules define name spaces, for sure. Use dot notation for access. Also, remember __init__.py is a kind of intercepting at the modular level, in that you're customizing the behavior of 'import'. In this sense, there's a class-like aspect to a directory of .py files (to import is to __init__). This is not a direct contradiction of any point you've made, just a hook to the package concept.
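Condensed into one quick session (a sketch, nothing saved to disk):

>>> '__add__' in dir(2)          # a bare integer already encapsulates behavior
True
>>> 'a'.upper()                  # strings carry their methods around too
'A'
>>> def f(): return 'hi'
...
>>> f.color = 'green'            # functions are objects: hang attributes right on them
>>> f.color
'green'
>>> import math                  # a module: another dot-accessible namespace
>>> math.sqrt(16)
4.0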
As for making OOP a "revelation", I think that is the wrong way to teach it, especially for non-CS students. I much prefer to treat it as just another way to encapsulate pieces of a program, a way that is very natural when you are modeling objects in the real world, and can be beneficial even when there is no parallel in the real world - the benefit flowing from the extra flexibility you have in encapsulation. That is something all engineers will understand. The "revelation" approach completely turned me off when I first read about OOP in the early 90's. As a skeptical engineer, I was left with the impression that OOP, polymorphism, etc., was a pile of hype, and there was nothing there I couldn't do in FORTRAN, BASIC, or C. It wasn't until I started using Python in 2002, that I saw the benefits. I like that Python doesn't mystify OOP, or push it the way Java does, but just makes it a natural thing to do.
Yes, that transition to OOP was upsetting as a paradigm change. I went through it on the xBase path, following dBase II (derived from some JPL language) through its winged migration: fork off to Clipper, FoxPro, buyout by Microsoft and Borland, FoxPro for Windows, Visual FoxPro. The full OO paradigm came across, unlike for the VB crowd, who didn't get that until later. This all happened in my little world *before* I ever wrote a line of Python (plus I got to Java first). APL was the first computer language I ever learned (loved it), then FORTRAN (yawn).

However, the real benefit of OO is it hooks to natural language, where students of human languages, like Sumerian, like Greek, already know about nouns, verbs and adjectives as parts of speech. OO is all about noun.verb() and noun.adjective. Then we have these little cartoons about sharing pointers (names) to objects and we're done: they're convinced there's a real grammar here, a logic, and dive in, happy to still be in the humanities, no talk of "computer hardware" need apply (though it's handy to have). This isn't "hyping" so much as "bridging".

In industry, it's about town-gown relations, where "town" is an administrative layer atop cube farmers, and "gown" is IT. With ORM, we're finally starting to get admin enrolled in seeing it from IT's point of view, whereas SQL by itself was just not friendly enough. OO makes computing more personable. To have busy administrators see 'self' as like "a little ego with a personal __dict__ clipboard" is not "dumbing it down"; it's taking advantage of well-worn patterns in grammar, leveraging what we've already got. It's efficient. Don't just say "is a", say "I am a". This isn't just cute, it's a conceptually smart way to think about a hospital, school or airport (objects all). But that's me to busy administrators, not you to enrolled electrical engineers. Different audiences = different patter, we both know that.
Again, if anyone wants to modify this document for their own purposes, I'll be glad to help.
-- Dave
Thank you for making your work public and open source in this way. You are kind to offer it up for comments and to handle the feedback graciously, especially given my many different biases.

Kirby

[ Barack Obama about to get sworn in, make his speech -- new president object ]
On Tue, Jan 20, 2009 at 8:45 AM, kirby urner <kirby.urner@gmail.com> wrote:
In the old days, I tell my students, numbers were stupid, they couldn't do anything, didn't know anything. Operators descended from heaven, like space ships, and caused numbers to be added and multiplied.
But in Python, everything is a snake, a live object, including numbers like 1 and 2, and they have knowledge of addition and multiplication "in their bones" (__ribs__). When you go 1 * 2, that's not a space ship, bristling with static methods from on high, from the alien "knows operations" land, that's invoking an innate ability of the integer type.
I made a fun cartoon out of the above: http://mybizmo.blogspot.com/2009/01/transitions-of-power.html

My more technical argument, about how OO changes our view of operators, is in this older paper, where my nouns are quaternions: http://www.4dsolutions.net/ocn/oopalgebra.html

Kirby
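PS: in the spirit of that paper, though this sketch is freehand and not lifted from it: a quaternion as a noun that knows how to multiply itself, no operator spaceship required:

class Quaternion:
    """Minimal quaternion w + x*i + y*j + z*k (sketch only)."""
    def __init__(self, w, x, y, z):
        self.w, self.x, self.y, self.z = w, x, y, z

    def __mul__(self, q):
        # Hamilton product -- the * syntax triggers this __rib__
        return Quaternion(
            self.w*q.w - self.x*q.x - self.y*q.y - self.z*q.z,
            self.w*q.x + self.x*q.w + self.y*q.z - self.z*q.y,
            self.w*q.y - self.x*q.z + self.y*q.w + self.z*q.x,
            self.w*q.z + self.x*q.y - self.y*q.x + self.z*q.w)

    def __repr__(self):
        return "Quaternion(%s, %s, %s, %s)" % (self.w, self.x, self.y, self.z)

>>> i = Quaternion(0, 1, 0, 0)
>>> j = Quaternion(0, 0, 1, 0)
>>> i * j                          # i * j == k, as Hamilton promised
Quaternion(0, 0, 0, 1)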