tough-to-explain Python

Simon Forman sajmikins at gmail.com
Thu Jul 9 01:05:57 EDT 2009


(I wanted to reply to a few messages in one post so I quoted them all
below. Let me know if this is bad etiquette.)

On Jul 8, 8:23 am, kj <no.em... at please.post> wrote:
> In <5f0a2722-45eb-468c-b6b2-b7bb80ae5... at q11g2000yqi.googlegroups.com> Simon Forman <sajmik... at gmail.com> writes:
>
> >Frankly, I'm of the impression that it's a mistake not to start
> >teaching programming with /the bit/ and work your way up from there.
> >I'm not kidding. I wrote a (draft) article about this: "Computer
> >Curriculum"http://docs.google.com/View?id=dgwr777r_31g4572gp4
> >I really think the only good way to teach computers and programming is
> >to start with a bit, and build up from there. "Ontology recapitulates
> >phylogeny"
>
> I happen to be very receptive to this point of view.  I had the
> benefit of that sort of training (one of the first computer courses
> I took started, believe it or not, with Turing machines, through
> coding in machine language, and compiler theory, and all the way
> up to dabbling with Unix!), and I suspect that the reason it is
> sometimes difficult for me to explain even relatively simple-looking
> things to others is that I have this background that I unconsciously,
> and incorrectly, take for granted in others...  There is this

Yes!  Once the concepts become so familiar that you call them
"intuitive" it seems to be very difficult to remember what they were
like before.

Something like "a = b" becomes "obvious" only after you've
internalized the preceding concepts.
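(A tiny Python example of my own to illustrate the point: "a = b"
doesn't copy anything, it just binds another name to the same object,
which is exactly the kind of thing that stays invisible until you've
got the underlying model.)

    # assignment binds a name to an object; it does not copy the object
    b = [1, 2, 3]
    a = b            # both names now refer to the SAME list
    a.append(4)
    print(b)         # prints [1, 2, 3, 4]: the change shows through either name
    print(a is b)    # True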


> persistent idea "out there" that programming is a very accessible
> skill, like cooking or gardening, anyone can do it, and even profit
> from it, monetarily or otherwise, etc., and to some extent I am

Programming is not like any other human activity.  I've been reading
some of Prof. Dijkstra's EWDs in the last few days.  In one [1] he
says, "automatic computers embody not only one radical novelty but two
of them", to wit: First, the huge scales that must be understood,
"from a bit to a few hundred megabytes, from a microsecond to a half
an hour of computing"; and second, "that the automatic computer is our
first large-scale digital device" which our until-now overwhelmingly
analog experience does not prepare us to deal with well.

He talks about how "when all is said and done, the only thing
computers can do for us is to manipulate symbols and produce results
of such manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

Dijkstra[1]: "It is true that the student that has never manipulated
uninterpreted formulae quickly realizes that he is confronted with
something totally unlike anything he has ever seen before. But
fortunately, the rules of manipulation are in this case so few and
simple that very soon thereafter he makes the exciting discovery that
he is beginning to master the use of a tool that, in all its
simplicity, gives him a power that far surpasses his wildest
dreams." [1]


> actively contributing to this perception by teaching this course
> to non-programmers (experimental biologists to be more precise),

Experimental biologists?  Well that's probably harmless.  Mostly
harmless.

> but maybe this idea is not entirely true...  Maybe, to get past
> the most amateurish level, one has to, one way or another, come
> face-to-face with bits, compilers, algorithms, and all the rest
> that real computer scientists learn about in their formal training...
>
> kj

If you're never exposed to that constellation of concepts that
underpins "mechanical symbol manipulation" you are adrift in a sea
("c", ha ha) of abstractions.

However, if you /are/ exposed to the "so few and simple" rules of
manipulation the gates (no pun intended) to the kingdom are thrown
wide.


On Jul 8, 9:10 am, Steven D'Aprano <st... at REMOVE-THIS-
cybersource.com.au> wrote:
> On Wed, 08 Jul 2009 12:23:50 +0000, kj wrote:
> > I happen to be very receptive to this point of view.
> [...]
> > There is this persistent idea "out there" that
> > programming is a very accessible skill, like cooking or gardening,
> > anyone can do it, and even profit from it, monetarily or otherwise,
> > etc., and to some extent I am actively contributing to this perception
> > by teaching this course to non-programmers (experimental biologists to
> > be more precise), but maybe this idea is not entirely true...
>
> There is some evidence that 30-60% of people simply cannot learn to
> program, no matter how you teach them:
>
> http://www.codinghorror.com/blog/archives/000635.html
> http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

Thank you! That's exactly the paper that prompted me to write the
article I mentioned. (Now I don't have to go find the link myself.
Win!)

I don't buy it: I believe strongly that any normal person can learn to
program, to manipulate symbols to create formulae that guide the
machine in its uninterpreted symbol manipulation.

I find it significant that in the paper [2] they say, "Formal logical
proofs, and therefore programs – formal logical proofs that particular
computations are possible, expressed in a formal system called a
programming language – are utterly meaningless. To write a computer
program you have to come to terms with this, to accept that whatever
you might want the program to mean, the machine will blindly follow
its meaningless rules and come to some meaningless conclusion. In the
test the consistent group showed a pre-acceptance of this fact: they
are capable of seeing mathematical calculation problems in terms of
rules, and can follow those rules wheresoever they may lead. The
inconsistent group, on the other hand, looks for meaning where it is
not."

In other words, the people who don't understand computers don't
understand computers.

I think that "first hump" people can become "second hump" people but
that it requires teaching them the foundations first, not confronting
them with such incredible novelties as "a = b" and saying in effect,
"here you go buddy, sink or swim."

Quoting Dijkstra again [1]: "Before we part, I would like to invite
you to consider the following way of doing justice to computing's
radical novelty in an introductory programming course.

"On the one hand, we teach what looks like the predicate calculus, but
we do it very differently from the philosophers. In order to train the
novice programmer in the manipulation of uninterpreted formulae, we
teach it more as boolean algebra, familiarizing the student with all
algebraic properties of the logical connectives. To further sever the
links to intuition, we rename the values {true, false} of the boolean
domain as {black, white}.

"On the other hand, we teach a simple, clean, imperative programming
language, with a skip and a multiple assignment as basic statements,
with a block structure for local variables, the semicolon as operator
for statement composition, a nice alternative construct, a nice
repetition and, if so desired, a procedure call. To this we add a
minimum of data types, say booleans, integers, characters and strings.
The essential thing is that, for whatever we introduce, the
corresponding semantics is defined by the proof rules that go with
it."

Imagine my surprise: he's saying (with immensely greater brilliance
and eloquence) much what I said in my little article. The major
difference from what he's outlined is that I think the students should
implement the imperative programming language themselves in Forth, but
the gist is the same.
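(To make "uninterpreted" concrete in Python terms, here is a toy
brute-force check, entirely my own illustration and not Dijkstra's, of
one algebraic law of the connectives over the relabeled domain
{black, white}. The symbols mean nothing; only the rules matter.)

    from itertools import product

    DOMAIN = ("black", "white")

    def neg(p):
        return "white" if p == "black" else "black"

    def conj(p, q):
        # "black" plays the role of true here; purely a convention
        return "black" if p == "black" and q == "black" else "white"

    def disj(p, q):
        return "black" if p == "black" or q == "black" else "white"

    # De Morgan: neg(p conj q) == (neg p) disj (neg q), for all p, q
    assert all(neg(conj(p, q)) == disj(neg(p), neg(q))
               for p, q in product(DOMAIN, repeat=2))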


> I'm sympathetic to the idea, but not entirely convinced. Perhaps the
> problem isn't with the students, but with the teachers, and the
> languages:
>
> http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf
>
> (My money is that it's a little of both.)

Hmm, that paper contains some good insights IMO, but I think they're
still missing the big picture, so to speak.

Really I suspect it's a case of "Programming languages considered
harmful."

The core abstractions of [mechanical] computation are just not that
complicated.  You can teach them to anybody in about half an hour,
drunk. I have.

After that, if they're interested, there is a smooth easy path to
"higher" abstractions: parsing, compiling, tree traversal and
transformation.  (It is said that possession is 9/10s of the law; in
the same vein I would claim parsing is 9/10s of computer programming.)

I am beginning to suspect that concrete, static languages (in the
sense of "standard" language specifications) are part of the
problem.  Everyone gets so caught up in programming via languages that
you get, well, people trying to teach "Computer Programming" as if it
were only necessary to grok a language, rather than grokking /symbol
manipulation/ itself.

(Did you read that last paragraph and think, "Well how the heck else
are you supposed to program a computer if not in a computer
language?"?  If so, well, that is kind of my point.)


> > Maybe, to
> > get past the most amateurish level, one has to, one way or another, come
> > face-to-face with bits, compilers, algorithms, and all the rest that
> > real computer scientists learn about in their formal training...
>
> The "No True Scotsman" fallacy.
>
> There's nothing amateurish about building software applications that
> work, with well-designed interfaces and a minimum of bugs, even if you've
> never heard of Turing Machines.
>
> --
> Steven

I beg to differ. I recall a conversation with a co-worker who had
"learned" to program using PHP.  Another co-worker and I were trying
to convince him that there was a good reason to differentiate between
hash tables and arrays. He didn't even know that they were different
"things".

I remember telling him, "between the metal and the desktop there is
nothing but layers of abstraction.  We use different names precisely
because of different behaviours."
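(For the record, a minimal Python sketch of the behavioural difference
we were trying to get across: a list is indexed by position, a dict is
keyed by arbitrary hashable values, and the costs of the operations
differ accordingly.)

    arr = ["red", "green", "blue"]
    table = {"r": "red", "g": "green", "b": "blue"}

    print(arr[1])         # 'green': lookup by integer position
    print(table["g"])     # 'green': lookup by key; position is irrelevant

    arr.insert(0, "black")    # shifts every later element: O(n)
    table["k"] = "black"      # independent of size: O(1) on average

    # arr["g"]            # TypeError: list indices must be integers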

He made "well-designed interfaces", but "amateurish" is about the
nicest thing I would have called him.

As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
you derive your "uninterpreted formulae" /correctly/.  Deriving
provably correct "programs" should be what computer science and
computer education are all about (not "java vocational training" as
Alan Kay once decried.)

Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
is known as "poor man's induction", i.e. he believes in his design as
long as "it works", i.e. until faced with evidence to the contrary.
(He will then "fix the design".) The scientific designer, however,
believes in his design because he understands why it will work under
all circumstances. The transition from pragmatic to scientific design
would indeed be a drastic change within the computer industry."

"Obviously no errors" is the goal to strive for, and I am comfortable
calling anyone an amateur who prefers "no obvious errors."  (Actually
that's a little harsh on the amateurs: "amateur" comes from the Latin
"amare", to love; an amateur is one who does something for the love of
it.)


On Jul 8, 9:27 am, kj <no.em... at please.post> wrote:
> In <5f0a2722-45eb-468c-b6b2-b7bb80ae5... at q11g2000yqi.googlegroups.com> Simon Forman <sajmik... at gmail.com> writes:
>
> >I'm not kidding. I wrote a (draft) article about this: "Computer
> >Curriculum"http://docs.google.com/View?id=dgwr777r_31g4572gp4
>
> Very cool.
>
> kj

Hey, thank you!


On Jul 8, 9:47 am, Paul Moore <p.f.mo... at gmail.com> wrote:
> 2009/7/8 kj <no.em... at please.post>:
>
> > There is this
> > persistent idea "out there" that programming is a very accessible
<snip>
> > that real computer scientists learn about in their formal training...
>
> Look at it another way. Experimental biologists don't want to program,
> they want to use computers to do experimental biology. It's a tool,
> and they (quite reasonably) don't *care* about robustness,
> portability, etc. Or even about programming, to be honest.

I'd say it's just the opposite: to "use computers to do experimental
biology" they want to instruct that machine to manipulate their
(meaningful to them but meaningless to it) symbols in useful ways.
This is nothing more or less than programming.

The fact that they need to learn all sorts of details of a programming
language to do that is NOT because they can't grok programming. It's
because computer scientists have put too many layers of abstraction on
top of the "pure" symbol manipulation and then forgotten what they
have done.

I have a very nice book "Introduction to Programming and Problem
Solving with Pascal" that I picked up for $0.50 at a used bookstore
not long ago.  It says, right on page 201, in the chapter on "Running,
Debugging, and Testing Programs":

"One of the nice features of programming in a high-level language like
Pascal is that it can be done with almost a total lack of
understanding of what a computer is and how it actually operates.
[...] There is no reason why someone who wants to write a computer
program should have to understand the electronic circuitry of a
computer, any more than someone learning to drive a car should have to
understand how the internal combustion engine works."

I think that's exactly wrong. What you're doing with computers doesn't
change from the bit to the high-level language.  It's all symbol
manipulation according to the same set of rules, all the way up.  The
elaboration becomes more involved as you go up, but the process does
not change qualitatively.


> In the context of the original question, it's entirely reasonable (in
> my view) to tell this audience "if the code does something weird you
> don't understand, either ignore it and find another way or dig into
> the manuals and experiment if you care". They'd very quickly find a =
> a + b as a less confusing alternative to a += b. (As has been pointed
> out earlier, to some extent a += b is quite an advanced construct -
> after all, it's essentially an optimisation of a = a + b).

On that point I completely agree. The language is getting in the way
of the programming.
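(Incidentally, a minimal example, mine, of the sort of surprise that
makes "a += b" tough to explain in Python: for a mutable object like a
list, "a += b" mutates it in place while "a = a + b" rebinds the name
to a new object, so any other name for the original can tell the
difference.)

    a = [1, 2]
    alias = a

    a += [3]           # in-place extend: the alias sees the change
    print(alias)       # [1, 2, 3]

    a = a + [4]        # builds a new list and rebinds a: alias is unchanged
    print(alias)       # [1, 2, 3]
    print(a)           # [1, 2, 3, 4]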

> Biologists don't expect me to understand their discipline before I can
> plant seeds in my garden, after all. (And when I do plant seeds, I
> usually get far more surprising results than I could get from a += b
> :-))
>
> Paul.

The discipline of programming is different from biology. It's
incredibly simple yet profound if it's taught as what it is, i.e.
automatic symbol manipulation.

No scientist is a stranger to logic and reasoned argument.  They
shouldn't be strangers to telling their mechanical brains what to
"reason" about.  Learning to program should be /easy/ for someone who
basically already gets it.

Wow, long post.

(Oh, and, er, it's ONTOGENY not Ontology that recapitulates phylogeny.
Heh. My bad.)

[1] "On the cruelty of really teaching computing science"
http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html


[2] "The camel has two humps"
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf


[3] "Can computing science save the computer industry?"
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD09xx/EWD920.html


