Re: [Edu-sig] Accessibility to non CS types?
On 5/26/06, Ian Bicking <ianb@colorstudy.com> wrote:
kirby urner wrote:
Not many scientists and engineers learn C or sh any more. ABC's developers had the right idea.
Yes, but they learn C++ and Java and things like that. It's all part of the Algol family, and that family has maintained a strong hold on the mainstream of language syntax.
So you're saying Python is sufficiently accessible because we may still assume scientists and engineers are learning C++ and Java? The fact remains: many academics are throwing up their hands when encountering Python, because they can't make the leap from what they knew before. If my sources are to be believed that is: http://worldgame.blogspot.com/2006/05/coffee-shop-physics.html Kirby
On 5/26/06, kirby urner <kirby.urner@gmail.com> wrote:
So you're saying Python is sufficiently accessible because we may still assume scientists and engineers are learning C++ and Java?
The fact remains: many academics are throwing up their hands when encountering Python, because they can't make the leap from what they knew before.
If I may, I'd like to start with a brief personal history to introduce what I see as two barriers to learning Python for "scientists/academics".

I learned FORTRAN in 1977 in a College-level course (pre-University in Quebec). I then went to University to study Physics. During my undergraduate degree, I wrote perhaps 3 (short) computer programs. Still, FORTRAN was easily understood, as the stuff I did was mostly numerical work. *The concepts and language I had learned in Physics were easily adaptable in FORTRAN.*

While in graduate school, I bought a PC and learned some basic C programming, mostly for fun. Again, I didn't write many programs, and they weren't very long. It did prepare me for what was to come. *The transition from FORTRAN to C was fairly easy, as I could write FORTRAN-style programs in C ;-).*

As a Physics prof, I wanted to introduce visual tools to my students. After learning about Java Applets, I decided that they were an ideal tool. So, I set out to learn enough Java to write my own applets for physics demos. I wrote a grand total of perhaps 10 applets over the course of 3 years, in between teaching and doing research. This was around 1995.

The first barrier I encountered was the "dot" notation. Nowhere did I see it explained separately, as a notational convention shared by many languages. I thought it was something weird about Java that I would have to learn. Solution: introduce the dot notation totally separately from the other programming syntax. This is something I tried to do in one rur-ple lesson.

The second barrier I found was the insistence on categorizing relationships in terms of "is-a" and "has-a". The examples that I would find most natural to look at were often counterintuitive. I remember in particular seeing programming examples that used the relationships between a circle and an ellipse (or a rectangle and a square), where the relationship seemed to be going backwards.
Let me give a concrete example explaining inheritance for non-computer scientists.

===
class Father(object):
    def __init__(self):
        self.eye_colour = "blue"

    def greets(self):
        print "Hello there!"

class Mother(object):
    def __init__(self):
        self.eye_colour = "brown"

class Child(Mother, Father):
    pass

Sally = Child()
print Sally.eye_colour
Sally.greets()
===

The computer scientists in the (virtual) room are probably horrified. A Child is not a Mother, nor a Father, and you do not have the proper "is-a" relationship. Yet, I would claim that the above example would be very appropriate for the average person. Sally inherits brown eyes from her mother, not because brown is the dominant gene (as in biology) in this case, but because of the inheritance order. (Interchange Mother and Father in the Child class definition and Sally will have blue eyes.)

Words like "inheritance" have been borrowed and adapted by computer scientists for use in their own discipline. They have been borrowed because they were familiar words that conveyed (partly) the concept used in the scientific discipline. They have been adapted because the usual definition was not technically correct. I believe that teaching those concepts should go through the same process: start with the familiar (but slightly incorrect) use of the word, and refine later on.

This, btw, is how I used to teach about "work" in physics, so I have some idea about the possible use of this approach. If I ask my son to carry the grocery bags from the car to the kitchen (essentially at the same horizontal level), he will not believe me if I tell him that he did no work. Yet, he would have done no work (W=0) [against gravity], as every physics textbook would tell you.

==

As a final point about the personal history: after not having done any programming for about 8 years (and having never written any significant computer program), I decided to learn Python as a hobby, with the goal to eventually teach my kids.
By then, I had some idea about the dot notation, so this barrier was easily overcome. Python tutorials that I skimmed through did not insist on the importance of "is-a" and "has-a" relationships, so I didn't feel constrained in how I set up the relationships between the classes I wrote. I was free to explore (and make mistakes!) and it was fun! It still is.

André
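[Editor's note: for readers following along today, here is Andre's example recast in modern Python 3 syntax (a sketch, not from the original post), with Python's method resolution order (MRO) made explicit. The MRO is what decides which parent's attribute wins when both define it:]

```python
# Andre's inheritance example in Python 3, plus a peek at the MRO.

class Father:
    def __init__(self):
        self.eye_colour = "blue"

    def greets(self):
        return "Hello there!"

class Mother:
    def __init__(self):
        self.eye_colour = "brown"

class Child(Mother, Father):
    pass

sally = Child()
print(sally.eye_colour)   # brown: Mother's __init__ runs, as Mother comes first
print(sally.greets())     # Hello there!: greets() is found on Father
print([c.__name__ for c in Child.__mro__])
# ['Child', 'Mother', 'Father', 'object']
```

Swapping the bases to `class Child(Father, Mother)` reverses the lookup order, and Sally gets blue eyes, exactly as Andre describes.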
On 5/28/06, Andre Roberge <andre.roberge@gmail.com> wrote:
The first barrier I encountered was the "dot" notation. Nowhere did I see it explained separately, as a notational convention shared by many languages. I thought it was something weird about Java that I would have to learn.
Solution: introduce the dot notation totally separately from the other programming syntax. This is something I tried to do in one rur-ple lesson.
I very much agree. I even go so far as to phase dot notation into standard mathematics, as another way to express such concepts as mytriangle.A (returns a degree measure). Use dot notation even without reference to any specific computer language, but with reference to the OO paradigm (the "math as extensible type system" meme).
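[Editor's note: a minimal sketch of what Kirby's mytriangle.A might look like in Python. The Triangle class and its use of the law of cosines are assumptions for illustration, not code from the thread:]

```python
import math

class Triangle:
    """A triangle given by its three side lengths a, b, c,
    with sides opposite angles A, B, C respectively."""

    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c

    @property
    def A(self):
        # Law of cosines: a^2 = b^2 + c^2 - 2bc*cos(A)
        cos_A = (self.b**2 + self.c**2 - self.a**2) / (2 * self.b * self.c)
        return math.degrees(math.acos(cos_A))

mytriangle = Triangle(3, 4, 5)
print(round(mytriangle.A, 1))   # 36.9 (degrees), the angle opposite the side of length 3
```

The dot reads just like mathematical notation: "angle A of my triangle", with no reference to any particular language's machinery.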
The second barrier I found was the insistence on categorizing relationships in terms of "is-a" and "has-a". The examples that I would find most natural to look at were often counterintuitive. I remember in particular seeing programming examples that used the relationships between a circle and an ellipse (or a rectangle and a square), where the relationship seemed to be going backwards.
Let me give a concrete example explaining inheritance for non-computer scientists.

===
class Father(object):
    def __init__(self):
        self.eye_colour = "blue"

    def greets(self):
        print "Hello there!"

class Mother(object):
    def __init__(self):
        self.eye_colour = "brown"

class Child(Mother, Father):
    pass

Sally = Child()
print Sally.eye_colour
Sally.greets()
===

The computer scientists in the (virtual) room are probably horrified. A Child is not a Mother, nor a Father, and you do not have the proper "is-a" relationship. Yet, I would claim that the above example would be very appropriate for the average person. Sally inherits brown eyes from her mother, not because brown is the dominant gene (as in biology) in this case, but because of the inheritance order. (Interchange Mother and Father in the Child class definition and Sally will have blue eyes.)
Words like "inheritance" have been borrowed and adapted by computer scientists for use in their own discipline. They have been borrowed because they were familiar words that conveyed (partly) the concept used in the scientific discipline. They have been adapted because the usual definition was not technically correct. I believe that teaching those concepts should go through the same process: start with the familiar (but slightly incorrect) use of the word, and refine later on.
Yes. This is characteristic of many disciplines; they absorb from ordinary language, but pretty soon stop paying their debt, i.e. stop building on-ramps. Instead, they close off and fortify, so that those not "in the know" sound like fools to the ears of insiders. It's a kind of corruption really. My buddy Ed Applewhite had 'Layman' printed on his business card, kind of a warning that he was nobody's fool (yet he often pretended he couldn't quite follow).
This, btw, is how I used to teach about "work" in physics, so I have some idea about the possible use of this approach. If I ask my son to carry the grocery bags from the car to the kitchen (essentially at the same horizontal level), he will not believe me if I tell him that he did no work. Yet, he would have done no work (W=0) [against gravity], as every physics textbook would tell you.
Yes, "work" is especially problematic, as it only counts if exerted against the vectors you consider worthy. Irrelevant vectors, though resistant, don't bar the way to "getting the job done" (at least not directly), so you can go down fighting, working like crazy, yet the physicist will tell you you're doing no work whatsoever (like the burglar fumbling with the safe combo is doing no work, if the police come too soon).
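[Editor's note: a quick numeric illustration of the grocery-bag point, with all numbers assumed. Work done by a constant force is W = F·d = |F||d|cos(θ), and gravity is perpendicular to a horizontal displacement, so cos(90°) makes the work against gravity vanish:]

```python
import math

def work(force_newtons, displacement_m, angle_deg):
    """Work done by a constant force along a straight displacement,
    where angle_deg is the angle between force and displacement."""
    return force_newtons * displacement_m * math.cos(math.radians(angle_deg))

weight = 50.0     # N, roughly a 5 kg bag of groceries (assumed)
distance = 20.0   # m, car to kitchen (assumed)

# Gravity acts at 90 degrees to horizontal motion: no work against gravity.
print(work(weight, distance, 90))   # essentially 0.0 (up to floating-point noise)

# Lifting the same bag 1 m straight up: force parallel to displacement.
print(work(weight, 1.0, 0))         # 50.0 joules
```

Which is exactly the textbook verdict Andre's son refuses to believe.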
== As a final point about the personal history: after not having done any programming for about 8 years (and having never written any significant computer program), I decided to learn Python as a hobby, with the goal to eventually teach my kids. By then, I had some idea about the dot notation, so this barrier was easily overcome. Python tutorials that I skimmed through did not insist on the importance of "is-a" and "has-a" relationships, so I didn't feel constrained in how I set up the relationships between the classes I wrote. I was free to explore (and make mistakes!) and it was fun! It still is.
André
Useful autobio. I used to strongly encourage Arthur to switch into this mode from time to time. That's related to your earlier remarks about academic jargons: we're each a walking namespace (Pythonic concept), yet sometimes forget that fact (fish unaware of water). When you go into autobio, people get a better idea of how to factor out various biases you may never be free of in this life (because they define you as a person e.g. "fought in 'Nam for eight years").

Your discussion of how easily is-a versus has-a confounds is quite useful. And I don't think CS has earned the right to claim it has this all nailed down, simply because coming up with the most intuitive and simple designs requires knowledge of the specific domains involved. Physicists need to absorb OO and decide for themselves, not leaving it to CS types to decide how to organize all those bosons and leptons (although I'm sure CERN welcomes the various proposals -- HTTP was invented to help keep track of 'em all).

Kirby
Andre Roberge wrote:
... Let me give a concrete example explaining inheritance for non-computer scientists. === class Father(object): ... class Mother(object): ...
class Child(Mother, Father): ... === The computer scientists in the (virtual) room are probably horrified.
Yup. I am aghast. You are using the "built-from" relationship. For yourself this is fine, but pedagogically it leads to very bad habits. The reason I am appalled is that I have maintained code that was written this way, and the mess that results over years of extending from such a shaky foundation is scary.
I would claim the above example would be appropriate for the average person.
I claim this is because you are a scientist, not a mathematician. At Penn, Mathematics was an Art, not a Science (I have great sympathy for that classification). You are in the habit of lying to your students and, next course, telling them "you know, what you learned in the previous course was wrong." This is a reason for people leaving the sciences. Mathematicians tend to try to stick to truths, though they expand the universe they talk about each time. In part this distinction in approaches makes sense because mathematicians who make mistakes are corrected (and scorned) by other mathematicians, while physicists are corrected by reality (and refine their models). Computer Science is a funny field, reached from both mathematics and engineering, so the ideas you see there can come from either of these sides (and cause lovely fights in faculty meetings). I consider CompSci an architecture-like discipline, with mathematics holding the position in CompSci that physics has in architecture.
Words like "inheritance" have been borrowed and adapted by computer scientists for use in their own discipline.
As physicists did with words like "spin".... Natural languages seldom have precise definitions, and Computer Science is not close to the worst offender in this respect.
To follow up on Kirby's personal history suggestion: I was a math-loving kid by fourth grade, and came to computers by accident in the summer after 9th grade (in 1966), doing machine language on a vacuum tube computer (LGP-30). The following summer I spent full-time in an NSF-sponsored class at Penn where we learned a language a week for the summer (two programs per language). The experience gave me a good feel for what a programming language is, both in general and specifically.

I decided around 1972 (after auditing a class or two from Knuth) that I could be either a mediocre mathematician or a very good computer programmer. Twenty years ago I went back to school to attempt a PhD as a teaching union card in CompSci, and got a "drop-out masters." For my research I was trying to do Query Optimization in Object-Oriented Databases, and glanced briefly at Python (the type system did not fit my research needs). After working in compilers and databases after school, I picked up Python as a way of pursuing a long-running personal project, and had the most fun I've had in a new language in 30 years.

--Scott David Daniels Scott.Daniels@Acm.Org
On 5/28/06, Scott David Daniels <Scott.Daniels@acm.org> wrote:
Andre Roberge wrote:
... Let me give a concrete example explaining inheritance for non-computer scientists. === class Father(object): ... class Mother(object): ...
class Child(Mother, Father): ... === The computer scientists in the (virtual) room are probably horrified.
Yup. I am aghast. You are using the "built-from" relationship. For yourself this is fine, but pedagogically it leads to very bad habits.
Agreed ... when it comes to teaching computer science to future computer scientists. I was reacting to some comments (including some I read in a different thread) about how scientists were not able to grasp Python and OOP (to paraphrase the argument). Python is being used in bioinformatics - and some biologists are introduced to computer programming for the first time through their interest in molecular biology, DNA structure and the like. I would argue that using an example like the one above would help lower the barrier to learning, which is important when people think they can't do it.

The reason I am appalled is that I have maintained code that was written
this way, and the mess that results over years of extending from such a shaky foundation is scary.
I would claim the above example would be appropriate for the average person.
I claim this is because you are a scientist, not a mathematician. At Penn, Mathematics was an Art, not a Science (I have great sympathy for that classification). You are in the habit of lying to your students and, next course, telling them "you know, what you learned in the previous course was wrong."
Actually, I'm a theoretical physicist, and as such I am closer in my approach to mathematicians than to experimental physicists. When I was teaching, I tried to be careful to mention every time a simplification was made, and to point out that a "deeper truth" would be learned about later. I don't mean to sound defensive: I was truly doing this - but without giving it much more thought than that. However, I think your characterisation of the average physicist's teaching is quite accurate!

This is a reason for people leaving the
sciences.
Proof? :-)

Mathematicians tend to try to stick to truths, though they
expand the universe they talk about each time. In part this distinction in approaches makes sense because mathematicians who make mistakes are corrected (and scorned) by other mathematicians, while physicists are corrected by reality (and refine their models).
Well put. [snip]
Words like "inheritance" have been borrowed and adapted by computer scientists for use in their own discipline.
As physicists did with words like "spin".... Natural languages seldom have precise definitions, and Computer Science is not close to the worst offender in this respect.
And I never meant to imply it was. In fact, I provided a similar example for physics ("work"). To me, the most important thing is to get students (non-computer scientists) interested in learning about programming, a bit like you might get someone interested in learning French by taking them to a night club in Montreal: they might pick up some inappropriate sentence constructions (that would appall your average French linguist) which will need to be corrected later ... but chances are they will want to learn French a lot more than if they had been put in a class taught by your average French linguist.

André
Computer Science is a funny field, reached from both mathematics and engineering, so the ideas you see there can come from either of these sides (and cause lovely fights in faculty meetings).
Plus I'm this philo guy, stoked on Wittgenstein, coming to CS through software engineering (Bucciarelli, MIT, is also an LW head I'm told), and providing perhaps unwelcome (at least to established CSers) new underpinnings, in the form of "language games".[1] Philosophy is perennially showing up at the door with new base classes, or more likely with claims that these classes were inherent all along ("nothing really new since Plato"), but are now in need of maintenance (updates), perhaps an expensive overhaul. Sophists maybe got a bad name for such door-to-door tactics, but, per Pirsig's 'Zen...', the fact is, many did quality work, and what we call "rationality" no doubt owes plenty to these pre-Socratics.[2]

This move (coming to CS through philosophy) allows me to compete with the more ordinary logical notations, but with OO, claiming OO is far ahead of the Bertrand Russell era logicians and their propositional descendants when it comes to fulfilling the Leibnizian dream of machine-executable pattern languages. The only reason OO hasn't made more inroads in K12 by now is that mathematicians wall out what they can't find a way to claim credit for (I'm talking about an inferior brand of credit-hungry math head, not top-of-the-line math people like Conway and Guy).

Back to inheritance: it's simply not the case that we only inherit because of an is-a relationship. We're always pouring in mixins, or in Java, interfaces, which merely bolt on some common API, such as a dial pad or other familiar widget set (a "dashboard" or "cockpit" of controls and instruments, bells and whistles). One could argue that composition would be the better approach but, in Java anyway, we're interested in the gatekeeping and error-trapping functions of type checking. We use it for quality control. Unless you (an instance of some class) support interfaces A, B and C, no way may you pass yourself off as type X. Failure to support some interface would be like a Wild West sheriff who can't handle a gun.
Sure, pistol packing is more a mixin in this design, but it's also part of a sheriff's basic identity. Without that interface, you're just a wannabe. In Python, we don't have interfaces per se (thanks in part to multiple inheritance), and duck typing means we postpone a lot of the critical gatekeeping until runtime. Here you are, your first day on the job as sheriff, in your Wild West saloon (off duty let's say). Someone comes through those swinging door thingys and says "Draw!" and you can't, because you don't have a gun API. An exception is raised.

Python programmers get used to catching such exceptions while testing and won't release code that breaks in this way. Java programmers rely on their compiler to catch a lot of such problems even before runtime. We hope they also do a lot of runtime testing, because a green light from the compiler doesn't mean your customer will be satisfied (gee, wouldn't that have been nice?). So there's a different feel to the debugging process in the two languages, as a result of this difference in emphasis. Bruce Eckel talks a lot about this difference.[3]

Kirby

[1] http://mail.python.org/pipermail/edu-sig/2001-March/001030.html (autobio)
[2] http://en.wikipedia.org/wiki/Zen_and_the_Art_of_Motorcycle_Maintenance (pirsig)
[3] http://www.mindview.net/WebLog/log-0025 (eckel)
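[Editor's note: a minimal Python 3 sketch of Kirby's saloon scenario (class and method names invented for illustration). Duck typing means the missing "gun API" only surfaces at runtime as an AttributeError, rather than at compile time as a Java interface check would:]

```python
# Duck typing: the "gun API" is just a draw() method; nobody checks
# for it until someone actually calls it.

class Sheriff:
    def __init__(self, name):
        self.name = name
    # Note: no draw() method -- this sheriff lacks the gun interface.

def challenge(gunslinger):
    """Shout 'Draw!' and see what happens."""
    try:
        return gunslinger.draw()
    except AttributeError:
        # The failure only shows up here, at runtime.
        return "%s can't draw -- no gun API!" % gunslinger.name

print(challenge(Sheriff("Tex")))   # Tex can't draw -- no gun API!
```

This is the exception Kirby describes: Python programmers catch it in testing; a Java compiler would have refused to let Tex call himself a sheriff in the first place.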
participants (3)
- Andre Roberge
- kirby urner
- Scott David Daniels