Teaching python (programming) to children
David Andreas Alderud
aaldv97 at student.remove-this-part.vxu.se
Wed Nov 7 19:28:16 CET 2001
>This is really interesting, because my experience is exactly the
>opposite. I have no doubt that you learned some very bad habits when
>you learned BASIC, but it is not clear to me how you got from that
>fact to the conclusion that it was the fact that BASIC is not typed
>that ruined you.
I guess this depends on where one gets schooled; where I come from, the
differences between datatypes and their uses are really important.
One thing that I hate to remind myself of is the use of optimal types for a
specific architecture. When I started with BASIC (a MacroAssembler-based
BASIC) in the late 80's I neglected the importance of choosing the datatypes
with care, and it's a handicap that still plagues me.
Not knowing the importance is not the same as not feeling the effects of bad
habits.
>I believe that assembler is bad for you in much the same way that strongly
>typed languages are bad for you. Unless you really have to tickle the
>hardware, in which case how many bits there are in something is part
>of your problem domain, any representation of your problem that forces
>you to keep track of such things is doing damage to you as a problem
>solver. If you have to talk to the hardware, by all means use a
>strongly typed language. Otherwise, stay away.
Yes, I totally agree. On the other hand, it's hard to make maximum use of a
language/compiler without knowledge of compiler design, and to really
understand compiler design one has to know hardware.
I'm probably damaged by the fact that I started with bottom-up learning;
maybe top-down is really preferable. Either way, Ada95 is the middle way,
and I believe the middle is the best way to reach a wide set of
specialisations.
>We had a good representative of this kind of bad thinking yesterday.
>Somebody wanted help with his isnumeric function. Lines and lines
>and lines of code all trying to test to see if a particular string
>was possibly a number when all he needed was:
>>> def isnumeric(s):
...     try:
...         float(s)
...         return 1
...     except ValueError:
...         return 0
>which I mailed him. Poor soul. Somebody ruined him by teaching him
>that computers operate on types. Now he has to fix himself. I'd recommend
>a steady diet of Smalltalk until he no longer remembered what types were,
>nay, until the idea of types positively annoyed him, but that might be
>hard to arrange.
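[Editorially, the quoted isnumeric can be written out as a complete, runnable sketch; the float()-based test is an assumption, but it is the usual way to let Python's own parser decide what counts as a number rather than enumerating string forms:]

```python
def isnumeric(s):
    # Let float() attempt the parse and catch the failure,
    # instead of writing lines and lines of string tests.
    try:
        float(s)
        return 1
    except ValueError:
        return 0

print(isnumeric("3.14"))   # 1
print(isnumeric("-2e5"))   # 1
print(isnumeric("hello"))  # 0
```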
Smalltalk is an amazing language, but since I belong in the university world
the number of choices one can make within CS are so diverse that a language
that teaches the middle way is to be preferred.
Maybe Python is to be preferred if all the students will ever do is write
programs in that kind of language.
>The other day Tim Peters wrote:
>In a world where 3/4 == 0, how long could any programmer
>survive believing that integer division and multiplication are
>"inverses"? The antecedent is dead on arrival.
One of my reasons for rejecting Python as a language for teaching is the
fact that 3/4 is in fact 0, though I believe it will be corrected in 2.2, am
I right?
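[For later readers: PEP 238 did change this. Python 2.2 added `//` for floor division and `from __future__ import division` to make `/` true division; in Python 3 true division is the default. A minimal sketch in modern Python:]

```python
# In modern Python, / is true division and // is floor division.
print(3 / 4)         # 0.75 -- in Python 2 this printed 0
print(3 // 4)        # 0    -- the old integer behaviour, now spelled //
print((3 // 4) * 4)  # 0    -- floor division and multiplication are
                     #        still not inverses
```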
The other reason is that, as Python is a non-declarative language, a
beginner might make a mistake like:
>>> x = 1.0
>>> X = some_calculation()   # typo: capital X, so x is never updated
>>> if x == 1.0:
...     do_some_stuff()
... else:
...     do_something_else()
I hope you can see the dangers in this; with Ada it is avoided, since the
person learns how to use datatypes and declarations correctly.
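[The danger above can be made concrete with a runnable sketch; the names are illustrative, but the mechanism is real: the mistyped capital `X` silently binds a new variable, so the stale `x` takes the wrong branch:]

```python
x = 1.0
X = x + 1.0   # typo: a new name X is silently created; x is untouched

if x == 1.0:
    branch = "do some stuff"      # this branch runs -- the update was lost
else:
    branch = "do something else"

print(branch)
```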
>This is very true. But it demonstrates that the man has been hanging out
>with the bits and the types for too long. It has done interesting things
>to his skull. Precise, competent, and excellent professional things,
>for somebody that has to care about exactly where the bits go, but it
>blinds him to the plea 'Awwww. but I wanted multiplication and division
>to be _inverses_!' And that is a poignant cry from somebody whose
>problem domain _doesn't_ include bits, bytes, and how you order them
>in a computer.
>This leads me to believe that it is likely that we have different
>standards for sloppy code. Do you mean 'written with little or no
>concern as to where the bits are?'. That's a different sort of
>sloppiness than the sort that usually bothers me, unless of course I
>am writing a device driver or a compiler or something that _cares_.
True, very true. As you said, it might be environmental damage from my
background.
>From my point of view, _any_ version of isnumeric that doesn't use
>try and except is sloppy, no matter how much precision went into
>determining all the possibly string inputs and what to do with each
>case. (Unless you are Tim Peters, and rewriting format, for instance.)
>It is the sloppiness of bringing the wrong part of your mind to your
>job. Shooting flies with anti-aircraft guns. And strongly typed
>languages encourage you to think on too low a level of abstraction
>for most programming problems.
Yes, but then, aren't we supposed to teach programmers how computers
work? Or am I just a relic of the ancient world of bits and shifts?
>I am curious: my experience with C++ has made me conclude that the
>only people who are any good at programming C++ were also good at
>programming assembler when they learned C++. (Or until they learned
>assembler, they remained foul C++ programmers. I _do_ know one person
>who was a rotten C++ programmer until he took an assembler course, at
>which point he got a lot better.) Does your experience differ so
>much? Are there many good Ada programmers who know little or no
>assembler? If so, I would think that this is evidence that there is
>something other than the strong typing that makes Ada a good first
>programming language. C++ certainly isn't a good first language, and
>neither is BASIC, but the flaws in both those languages seem to be far
>deeper than how strongly typed they are.
Personally I believe that it isn't possible to write good code in C++,
because the language is void, IMNSHO.
My experience with assembly programmers is that they write so-so C++ code,
but C++ programmers make very bad assembly programmers.
The only so-so thing about Ada is the complaint from hardcore OO
developers that Ada lacks multiple inheritance; I consider the lack of
multiple inheritance a feature, and Java lacks it too, since use of MI is
(considered) bad practice. While Ada lacks MI, it does support the
construction of multiple-inheritance-style type hierarchies through the use
of generics and type composition.
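[Since this is the Python list: the same "MI without MI" idea is often spelled as composition in Python. A minimal sketch with made-up class names, where one class reuses two others by holding them rather than inheriting from both:]

```python
class Engine:
    def start(self):
        return "engine started"

class Radio:
    def play(self):
        return "radio playing"

class Car:
    # Composition: Car holds an Engine and a Radio instead of
    # inheriting from both, and forwards the calls it wants to expose.
    def __init__(self):
        self.engine = Engine()
        self.radio = Radio()

    def start(self):
        return self.engine.start()

    def play(self):
        return self.radio.play()

car = Car()
print(car.start())  # engine started
print(car.play())   # radio playing
```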
The strongest features for me in Ada are the clean syntax and the packages
methodology; generic packages are a very good language construct which I
miss in every single language that I come across.
The fact that most Ada developers don't know assembly language is a good
sign, though many do know it, as they have been working for the military
for a long time.
>Are they all in Lund? I know people who would like to get an Ada job,
>which they aren't finding here in Göteborg.
Before the .COM crisis WM-DATA used to vacuum-clean the market; a friend of
mine used to work as an Ada programmer in Göteborg for WM-DATA. It's really
hard to find any kind of job in the tech industry these days, but if they
don't mind working with hardware (embedded devices) it shouldn't cause any
problems. They should contact the Swedish military, who always want new
talented Ada developers.