[Edu-sig] CyberSmiths Progress II - On to QBasic?

Dennis E. Hamilton infonuovo@email.com
Fri, 24 Mar 2000 13:32:27 -0800

I had so much fun relating my 35-year-old son Doug's initial experiences
with Python and me as a tutor, that I owe you a follow-up.

There are four things I want to add by way of progress report and memoir:

A.	What happened in our subsequent attempts at Python.

B.	What Doug has since done on his own.

C.	What I have to say about that.

D.	What I Care About


[The CyberSmiths are what my son Doug has called the two of us, appealing to
the notion of a blacksmith's son apprenticing to his father.  I am touched
by that and not about to quibble with the appropriateness of the term.]

	As the result of my previous report on our experiences, I received some
great leads to tutorials for beginners.  I downloaded and printed a couple
of the on-line versions and made them available to Doug.  He also stumbled
through "Learning Python," but he started to get some benefit out of
"Introduction to Hacking" and more practice.

	I got him to try an edit-run process using the MS-DOS edit utility that
comes with Win98, and python as a console application under 4DOS.

	We tried some more extensive examples and the following occurred:

	1.	He began to see more possibilities with interactive applications (e.g.,
ask questions, do things based on the answers, etc.).
		He noticed what was convenient and inconvenient, and appeared ready to
discover how to make little programs more useful by having them repeat
rather than having to be rerun each time.

	2.	He had a big insight about the difference between writing and filing a
program and running/interpreting the program.  That is, he began to see the
separation between description or creation and then active use.  I could
tell this was a pretty tenuous thing, but it is of course pretty fundamental
to what will come later.

	3.	Before he tired in our final session he was stopped at an interesting
place.  It was the use of f(), the notation we are now all so accustomed to
that we don't see anything weird about it.  Doug has been working himself
through college algebra using computer-based training courses of one kind and
another -- it is one of the things he has found valuable about computers and
about the Internet.  But he wasn't ready for f and f().
	He stopped at that point.  I was excited to think that we might, in a
session or two when we next got together, take the difference between
mention and use (and the different ways that comes up in using computers) to
a level where he would get why this kind of unfamiliar notation and
preoccupation is so important in how computers operate.
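For readers following along, the f-versus-f() distinction in item 3 (and the "repeat rather than rerun" idea in item 1) can be sketched in a few lines of Python. The greet function below is my own illustrative example, not anything from our sessions:

```python
# "greet" names the function itself; "greet()" asks Python to run it.

def greet():
    # Build and return a greeting string.
    return "Hello, Doug!"

# Mentioning the function: no parentheses, so nothing runs yet.
mentioned = greet          # just another name for the function object
print(type(mentioned))     # a function object, not a string

# Using the function: the parentheses trigger the call.
result = greet()
print(result)              # now we get the string back

# A loop lets a little program repeat instead of being rerun each time.
for name in ["Doug", "Dennis", "Jake"]:
    print("Hello, " + name + "!")
```

The mention/use distinction shows up everywhere later on (passing functions as arguments, for instance), which is why it seemed worth pausing over.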


Well, yesterday I had an on-line messaging session with Doug and learned
that he'd obtained a book that introduces programming via QBasic and then
branches out from there.  You got it, QBasic.  I actually have a copy of
that software lying around, but not one that is designed for Win32 long file
names and Fat32 drives.

	Talk about a shattering experience!

	Well, I get to take my own advice about being tolerant and patient.

	So I checked around on where I might find QBasic and gave him some tips on
using it.  Actually, it and the MS-DOS text editor are both CUA applications
that have pretty nice Help and other information.  A nice thing about QBasic
is that there is a separate immediate-mode window, at the bottom of the
screen, and then an edit window in most of the remaining screen.  I mention
that only because it may help Doug distinguish something about
edit-run-save-load approaches versus do-it-and-forget-it mode.  I don't know
what QBasic does if you launch it with a filename in the command line -- I
suspect it does not do what the Python processor does under the same
conditions.  These are more things for me to observe around how Doug is
teaching himself about computing and programming.

	I even sent him a Zipped package of the program that I had been saving just
in case I might ever want to install it again.

	I think the most telling aspect of this journey is best expressed in Doug's
own words, from an E-mail I received this morning:

	"Thanks for the email on QBASIC.  I found it kind of complicated so I
looked in the book I have to see where they thought QBASIC is in Windows and
it happens that there is a version on the Windows 98 setup disc so I
installed it using there instructions.  The book I have is written in a very
simple way which I prefer cause I'm so new to this and I don't want to give
up on programming.  In fact I'm getting really excited about it thanks to
this book."  [Doug and I have the same habit of interchanging words like
their and there, two and too and to.]

	So I will go look at my Windows 98 install disks and see if the QBasic
there is more Windows and Fat32-friendly than the version 1.1 I have been
holding onto.

	From this I see I mentioned things in my E-mail that were deeper than his
understanding, so he found another way to satisfy his itch.  The gratifying
part is that he isn't giving up.  I actually recommend the old books on
basic (how to do tiny interactive games, other little tasks that are
self-contained and easily understandable outside of the context of computer
programming and software) for people who want something to practice with.  I
can't wait to see what Doug produces in QBasic.  There may then be an
opportunity to learn more about Python by comparative example.

	It is difficult for me to realize that I speak at a level that requires an
understanding the people I am speaking to don't have.  For what I have to
offer to be accessible to someone else, it is going to take a great deal more
discipline to put myself in the other's shoes and to accept their experience
of the matter.  So I am going to have to be more generous and patient in
having my son train me in what I have long forgotten -- what it was like to
not know anything about computers and from there develop mastery of software
and computer science.


I am not distressed about Doug using QBasic, or any other form of Basic,
though I think ones closer to the original Dartmouth Basic are ideal for
this kind of teething.  I don't think it will be valuable for him to start
using Peek and Poke and In and Out commands for direct input-output, but I
think this can be a useful way to deal with some
fundamentals.  And then it will become time to learn something else.  I
think learning more than one computer language is an important part of
learning what programming languages are about, so I am not fearful that this
is an invalid experience for Doug.  (I recommend that people develop a
mastery level of at least three noticeably-different computer languages. I
support and argue for the continued use of a fictitious machine in the Art
of Computer Programming because it assures that people will be exposed to at
least two computer languages that way.)

	From time to time I run into a translation of a famous statement.  It goes
like this:

		6.54 My propositions are elucidatory in this way: he who understands me
		finally recognizes them as senseless, when he has climbed out
		through them, on them, over them.  (He must so to speak throw away
		the ladder, after he has climbed up on it.)

I am reminded of Kierkegaard, who spoke on how we get to a particular level in
our lives using particular means, but those are not the means that will take
us to the next level.  I watch my 10-year-old grand-nephew Jake struggle with
mastery of arithmetic and wonder if he will be practiced enough when he is
asked to suspend his approach to it and take a fresh look via algebra and
trigonometry.  Yet I counted on my fingers under the desk just as he does,
and there did come a time when I did that no longer.  I still write my
carry-amounts at the top of addition columns though.  And then what happens
when knowledge expressed in algebraic terms is suspended and recreated using
the calculus?  And then shall we suspend that and learn about real functions
and ultimately about computational/constructive objects and their
limitations?  And rediscover arithmetic through the eyes of Peano, Frege,
and Russell?  A journey up many ladders as part of a promising and perhaps
unending developmental progression.  It is hard to remember what it was like
when I didn't have my latest ladder.  It is even more difficult to
anticipate what ascending the next ladder will require me to give up, even
if only for a time!

	In his Nobel lecture, Richard Feynman pays homage to a notation that he
once developed and that gave him purchase on an important area in physics.
But he had since retired that particular notation because it did not support
the models he needed to explore for the areas that opened up afterwards.
Einstein was also a master of notations for bringing tractability to
difficult problems.

	As a self-taught student of symbolic logic, I could work through early
chapters of Principia Mathematica and other books, but I was completely
stopped on modus ponens, the fundamental principle of inference.
[Basically, modus ponens is the principle that given that *a*, and given
that *a implies b*, one can infer that *b*.]  It was clear to me, from
writings of Quine and others, that there was some difficulty with modus
ponens, but I couldn't even tell what modus ponens was, or how it worked,
let alone appreciate the difficulty.  I had that definition and other
versions of it in front of me, and I couldn't get it.  I am amazed how much
of symbolic logic I could get some grasp of without grokking modus ponens.
This went on for about 5 years, from when I was 19 to around 24.  I also had
a copy of a treatise on "Elementary Formal Systems" and I could do all right
in the first (introductory) chapter, but I got stopped around modus ponens
pretty early.  It so happened that a coworker of mine, a concert pianist
by training, knew the author of the book.  So I meet Raymond Smullyan at his
home in Manhattan, and my friend introduces me as a coworker who is very
bright and knowledgeable and has Smullyan's book.  I am embarrassed and not
feeling so bright and knowledgeable, knowing that I haven't mastered modus
ponens or gotten past the bare introduction of "Elementary Formal Systems."
The conversation is awkward, and Raymond shows me his study and that he
corresponds with people I am familiar with by name who are working in the
foundations of computer science.  I finally get over my dumbfoundedness
enough to tell professor Smullyan about my difficulty with modus ponens.  He
goes to the chalkboard in his study and draws the truth table for the
implication proposition of logic.  He carefully and simply takes me through
the principle.  And I see it.  It is perfectly, beautifully clear in that
moment and forever after.  I have never forgotten it.  And that was a
*really* big ladder to have climbed.  It took maybe 5 minutes.  Probably the
biggest part was being unwilling to continue in ignorance and asking for
help when this marvelous opportunity became available.   It was right up
there with having the nerve to ask my wife out, though it took me much
longer to have that much mastery of myself.
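The five-minute chalkboard lesson can be reconstructed in a few lines. This little table-check is my own sketch of the reasoning, not Smullyan's presentation:

```python
# The truth table for implication, and why modus ponens follows from it.

def implies(a, b):
    # "a implies b" is false only in one case: a true, b false.
    return (not a) or b

# Print the full truth table for implication.
for a in (True, False):
    for b in (True, False):
        print(a, b, implies(a, b))

# Modus ponens: in every row where a is true AND (a implies b) is true,
# b is also true -- no row of the table violates the inference.
for a in (True, False):
    for b in (True, False):
        if a and implies(a, b):
            assert b
```

Seeing that the inference simply falls out of the one false row of the table is, I think, what made it land so completely.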

	One of the most profound moments in my early career as a young software
developer came in the Summer of 1961 when I happened across issues of the
Communications of the ACM in the stacks of the Seattle Public Library.  I
opened the Report on the Algorithmic Language ALGOL 60 and was enthralled to
see a programming language in what seemed to be perfect description in 14
pages of typeset text and examples.  (I could completely get it even though
I hadn't mastered modus ponens yet.)  We now know it wasn't all that
perfect, or complete, but the effort to produce it and exploit it was the
launchpad for an amazing explosion and development of a discipline that we
dared to call computer science.  Treatment of formal languages is perhaps
not so singularly central these days, but it was an invaluable ladder.

	The Algol 60 report was prefaced with the following statement (in the
original German):

	What we cannot speak about we must pass over in silence.

		-- Ludwig Wittgenstein: Tractatus Logico-Philosophicus

I have no idea what the authors of the Algol 60 report had in mind that led
them to choose this quotation.  It was many years later (and long after I
had a copy of the full text, unread) before I noticed that this often-quoted
final line of Wittgenstein's "Tractatus" was preceded by the paragraph 6.54
quoted above.


I have worked with computers almost my entire life.   And now my cellular
phone has more computing power than the first computer I actively
programmed, and my desktop system has more computing power than existed in
the entire world shortly before I started.  I have been fascinated by the
elusive qualities of computers that have them be so powerful while so
obviously limited.  That started when high-school classmates and I struggled
with ancient issues of Scientific American that proposed how we could build
a tiny machine of our own, called Simple Simon, that exhibited everything
there is about computing.  It was all an enchanting mystery and, at first,
completely impenetrable.

	I love exploring computation and the power of computing (and even more, the
magnificent limitations) for its own sake.  But I don't think I could have
sustained a 40+ year career around software development if that was all
there was to it.  There is also something elusively fascinating about our
relationship to computers and how difficult it is to comprehend them and
have the software we provide for them be genuinely useful to people.  I
think most of us think there is something valuable here, and we want to
convey that value to others, contributing something useful and lasting.

	Yet so many computer software projects fail, so many successful ones (that
ship) are found to have been misguided in some way.  It pains me, not only
in my own experience but in watching young developers, as obsessive as I
was, be frustrated by their contributions ultimately not being fruitful.

	I have recently been exploring the proposition that "software sucks."  And
looking for ways to make software and computers even more useful and
supportive in what people are up to.  I can think of no better perspective
on learning about programming than this recent statement on software:

	"Learning to use software should be as easy as learning
	the way around a new office.  A little benign exploration,
	a couple of interesting side trips, a fortuitous meeting
	in the hallway, this is how we get oriented in real life.
	We should expect nothing less from our software.  The
	user should be reassured at every step and generously
	rewarded for his curiosity.  In today's information age,
	creativity, excitement, and a sense of adventure are more
	important, ultimately, than correctness.  There are no
	mistakes, only opportunities to learn."

		-- Alan Cooper, "About Face: The Essentials of
			User Interface Design" (1995), p. 481

This is my personal manifesto for CP4E.  I want learning about software and
programming to be that kind of experience for my son Doug, my grand-nephew
Jake, and anyone else who chooses to obtain some mastery of these
marvelously simple artifacts on which we are willing to depend for so much.

-- Dennis