[Edu-sig] Topics for CS2

kirby urner kirby.urner at gmail.com
Tue Jan 20 17:45:26 CET 2009


On Tue, Jan 20, 2009 at 1:52 AM, David MacQuigg
<macquigg at ece.arizona.edu> wrote:

<<>>

[ watching Obama motorcade on CRT to my left, typing to LCD on my laptop ]

> There may be some misunderstanding as to my purpose in writing this chapter.  It is not a CS1 introduction to OOP.  I would recommend Zelle or Goldwasser for that.  It was written before those texts, and specifically for engineers who already know some programming, understand very well the need for modular design (no need for motivation of OOP), and who have learned Python up through chapter 18 of Lutz and Ascher (no need to explain __notation__).  It could be an alternative, or a supplement, to the OOP chapters in L&A.  It could also be part of the CS2 class I am now proposing, for students who already know Java or C.
>

Probably I was confusing your "module" and "modular" (re Perl, below)
in that you relate objects to both, which of course makes sense in
terms of encapsulated name spaces.  Classes, unlike mere modules, have
progeny and ancestors, and spawn multiple instances of themselves
without duplicating more bits than necessary.  You're clear about this
distinction, which I liked.

Your role as the teacher most specializing (customizing) to these
students is a good example of how curriculum writing itself has a
modular aspect, and a subclassing (tiering) from more to less
generalized.  You've got a very specific mix in mind, a certain type
of sophistication.  I likewise have my different biases.

On a scale of 1-10, overriding __del__ in the process of doing
subclass instance counting in a parent class is an 8 or 9,
especially where you bring __mro__ into it.  So yes, I see your desire
to include arcana.  You're being respectful of your students' already
high level of initiation.
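A minimal sketch of that counting pattern, with hypothetical Counted,
Dog and Cat classes (and noting that __del__ timing is an
implementation detail; CPython's reference counting fires it
immediately here):

```python
class Counted:
    """Parent class tallying live instances per concrete subclass."""
    _counts = {}

    def __init__(self):
        cls = type(self)
        Counted._counts[cls] = Counted._counts.get(cls, 0) + 1

    def __del__(self):
        # decrement when an instance goes away
        Counted._counts[type(self)] -= 1

    @classmethod
    def live(cls):
        return Counted._counts.get(cls, 0)

class Dog(Counted):
    pass

class Cat(Counted):
    pass

rex, fido = Dog(), Dog()
felix = Cat()
print(Dog.live(), Cat.live())   # 2 1
del rex                         # CPython's refcounting fires __del__ here
print(Dog.live())               # 1
```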

> Some of my choices on what topics to emphasize may be different.  These choices were based on my own experience, which may be different than others, and is certainly different from those who have a less pragmatic approach to languages.  Thus, I have relegated operator over-riding to "advanced topics" because of my perception that it will be used less often than say, static methods, which I have included in my first large example.  I use static methods frequently.  They are now (as of version 2.4) more "pythonic" than when this chapter was written, so I feel I made the right choice.
>

I'm inclined to see + and * in a continuum with [] and (), in that the
former trigger __add__ and __mul__ whereas the latter trigger
__getitem__ and __call__ (or __init__).

In other words, "operator over-riding" tends to blend in with all
manner of syntax-invoked method triggering.  The fact that * and +
and such used to be segregated as "immutable operators" gets
forgotten in the rough and tumble.

We don't keep quite the same unary / binary fixation either, in that
a.__add__(b) is an act of consumption (a eats b, maybe returning a c
of some type), just as f(1) is an act of consumption (swallowing an
argument).  a * b is a "munch" operation, in that __mul__ (or
__rmul__) is getting invoked.
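A toy illustration of that continuum, using a hypothetical Bag class
where +, *, [] and () all route to the corresponding special methods:

```python
class Bag:
    def __init__(self, items):
        self.items = list(items)

    def __add__(self, other):     # triggered by +
        return Bag(self.items + other.items)

    def __mul__(self, n):         # triggered by *
        return Bag(self.items * n)

    def __getitem__(self, i):     # triggered by []
        return self.items[i]

    def __call__(self, item):     # triggered by () on the instance
        self.items.append(item)
        return self

b = Bag([1, 2]) + Bag([3])
print(b.items)        # [1, 2, 3]
print((b * 2)[4])     # 2
b(99)                 # "calling" the bag munches another item
print(b.items)        # [1, 2, 3, 99]
```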

In the old days, I tell my students, numbers were stupid, they
couldn't do anything, didn't know anything.  Operators descended from
heaven, like space ships, and caused numbers to be added and
multiplied.

But in Python, everything is a snake, a live object, including numbers
like 1 and 2, and they have knowledge of addition and multiplication
"in their bones" (__ribs__).  When you go 1 * 2, that's not a space
ship, bristling with static methods from on high, from the alien
"knows operations" land, that's invoking an innate ability of the
integer type.
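The point shows up right at the shell: the operator is just sugar for
the integer's own method.

```python
# 1 + 2 asks the integer to do its own addition
print((1).__add__(2))            # 3
print((2).__mul__(3))            # 6
print(1 + 2 == (1).__add__(2))   # True
```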

It's a change in gestalt, especially if you start with the old one.

> If my Animals class hierarchy seems artificial, it may be because I tried to illustrate everything I considered important, and do it in the smallest example that would still be clear.  The comparison of classes to modules is not motivated by any affinity to Perl.  What little I know of Perl, I don't like.  Rather, it is part of an answer to the question "Why do we need classes?"  Modules are the closest thing to classes as a unit of encapsulation.  With a simple function, you could generate instances of modules as easily as instantiating objects from classes.  You could even change the language slightly, and allow multiple modules per file.
>

I'm biased against modules because I start at the shell and don't care
about saving any .py files for awhile, have no interest in anything
but the current namespace, like a white board.

My simplest unit is the primitive object, like 1 or 'a', an integer or
string.  We do a dir on both of these, show they "have guts" i.e. are
endowed with methods.

So a number like 2 is a primitive name space, in that it contains
(encapsulates) a lot of behaviors.
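For instance, at the shell:

```python
# primitive objects come endowed with methods; dir() shows the guts
print('__add__' in dir(2))    # True: 2 knows how to add
print('upper' in dir('a'))    # True: strings know how to uppercase
print('a'.upper())            # A
print((2).bit_length())       # 2 (binary 10 is two bits wide)
```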

Next up, data structures or collections, things like iterables, like
lists, tuples.  Each supports dot notation to methods, has an
interior.

With functions, you the user start getting more mutable stuff to play
with.  You can set attributes on function objects (a type of
callable).
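A quick sketch, with made-up function names:

```python
def double(x):
    return x * 2

# function objects are mutable: arbitrary attributes stick to them
double.author = "kirby"

def tracked(x):
    # the function keeps a tally on itself, as an attribute
    tracked.calls = getattr(tracked, "calls", 0) + 1
    return x * 2

tracked(3)
tracked(4)
print(double.author)   # kirby
print(tracked.calls)   # 2
```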

So by the time we get to classes (types), we already have plenty of
micro-ecologies as examples, all without ever saving a module if we
like.  The .py format is for persistence, but it's like entering all
those lines in the shell really fast, even grabbing your own command
line arguments via sys.argv.  Modules define name spaces, for sure.
Use dot notation for access.
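One way to make that concrete without saving any file: conjure a
module object in memory (the name 'scratch' is arbitrary):

```python
import types

# a module is a name space you reach into with dot notation
scratch = types.ModuleType("scratch")
scratch.x = 42
exec("def double(n): return 2 * n", scratch.__dict__)
print(scratch.double(scratch.x))   # 84
print(scratch.__name__)            # scratch
```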

Also, remember __init__.py is a kind of interception at the module
level, in that you're customizing the behavior of 'import'.  In this
sense, there's a class-like aspect to a directory of .py files (to
import is to __init__).  This is not a direct contradiction of any
point you've made, just a hook to the package concept.
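A throwaway demonstration, building a hypothetical package 'mypkg' on
the fly so its __init__.py visibly runs at import time:

```python
import os
import sys
import tempfile

# lay down mypkg/__init__.py in a scratch directory
tmp = tempfile.mkdtemp()
os.mkdir(os.path.join(tmp, "mypkg"))
with open(os.path.join(tmp, "mypkg", "__init__.py"), "w") as f:
    f.write("greeting = 'customized at import time'\n")

sys.path.insert(0, tmp)
import mypkg                  # executes mypkg/__init__.py

print(mypkg.greeting)         # customized at import time
```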

> As for making OOP a "revelation", I think that is the wrong way to teach it, especially for non-CS students.  I much prefer to treat it as just another way to encapsulate pieces of a program, a way that is very natural when you are modeling objects in the real world, and can be beneficial even when there is no parallel in the real world - the benefit flowing from the extra flexibility you have in encapsulation.  That is something all engineers will understand.  The "revelation" approach completely turned me off when I first read about OOP in the early 90's.  As a skeptical engineer, I was left with the impression that OOP, polymorphism, etc., was a pile of hype, and there was nothing there I couldn't do in FORTRAN, BASIC, or C.  It wasn't until I started using Python in 2002, that I saw the benefits.  I like that Python doesn't mystify OOP, or push it the way Java does, but just makes it a natural thing to do.
>

Yes, that transition to OOP was upsetting as a paradigm change.  I
went through it on the xBase path, following dBase II (derived from
some JPL language) through its winged migration, fork off to Clipper,
FoxPro, buyout by Microsoft and Borland, FoxPro for Windows, Visual
FoxPro.  The full OO paradigm came across, unlike for the VB crowd,
who didn't get that until later.  This all happened in my little world
*before* I ever wrote a line of Python (plus I got to Java first).
APL was the first computer language I ever learned (loved it), then
FORTRAN (yawn).

However, the real benefit of OO is that it hooks into natural
language, where students of human languages, like Sumerian, like
Greek, already know about nouns, verbs and adjectives as parts of
speech.  OO is all about noun.verb() and noun.adjective.  Then we
have these little cartoons about sharing pointers (names) to objects
and we're done: they're convinced there's a real grammar here, a
logic, and dive in, happy to still be in the humanities, no talk of
"computer hardware" need apply (though it's handy to have).  This
isn't "hyping" so much as "bridging".  In industry, it's about
town-gown relations, where "town" is an administrative layer atop
cube farmers, and "gown" is IT.  With ORM, we're finally starting to
get admin enrolled in seeing it from IT's point of view, whereas SQL
by itself was just not friendly enough.
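The noun.verb() / noun.adjective grammar, plus the pointer-sharing
cartoon, in a few lines (the Dog class and names here are of course
made up):

```python
class Dog:
    def __init__(self, name):
        self.name = name               # noun.adjective

    def bark(self):                    # noun.verb()
        return self.name + " says Woof!"

rover = Dog("Rover")
print(rover.name)      # Rover
print(rover.bark())    # Rover says Woof!

# sharing pointers (names) to objects:
spot = rover           # two names, one dog
spot.name = "Spot"
print(rover.name)      # Spot
```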

OO makes computing more personable.  To have busy administrators see
'self' as like "a little ego with a personal __dict__ clipboard" is
not "dumbing it down"; it's taking advantage of well-worn patterns
in grammar, leveraging what we've already got.  It's efficient.  Don't
just say "is a", say "I am a".  This isn't just cute, it's a
conceptually smart way to think about a hospital, school or airport
(objects all).  But that's me to busy administrators, not you to
enrolled electrical engineers.  Different audiences = different
patter; we both know that.
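That "personal __dict__ clipboard" picture, with a hypothetical
Airport class:

```python
class Airport:
    def __init__(self, code, city):
        self.code = code       # self scribbles on its own clipboard
        self.city = city

pdx = Airport("PDX", "Portland")
print(pdx.__dict__)    # {'code': 'PDX', 'city': 'Portland'}
```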

> Again, if anyone wants to modify this document for their own purposes, I'll be glad to help.
>
> -- Dave
>

Thank you for making your work public and open source in this way.
You are kind to offer it up for comments and to handle the feedback
graciously, especially given my many different biases.

Kirby


[ Barack Obama about to get sworn in, make his speech -- new president object ]


More information about the Edu-sig mailing list