[Tutor] Still confused about Python references/objects

Tim Peters tim.one@home.com
Sun, 1 Apr 2001 19:51:18 -0400


[Sheila King]
> Well, that's why I was wondering about whether Python is really a
> good first language. It seems to me, it would be much more difficult
> to get used to strongly typed languages and references/values/pointers
> after using Python first. Dunno.

I'd say this depends on what you consider to be the *goals* of a first
language.  Python, like ABC before it, thinks newcomers should be able to do
"interesting" things their first day.  If newbies aren't to spend their
entire time wrestling with the tool instead of with the problems they set out
to solve, their first language should hide as much artificial complexity as
possible.  Things like worrying about how to allocate memory, or the
differences between objects and pointers to objects and pointers to pointers
to etc, are artifacts of using low-level ("close to the hardware") languages,
not inherent characteristics of interesting problem domains.

It so happens I learned assembly language first, as close to the hardware as
things get.  And I recommend that everyone learn assembler first too --
provided they intend to make a career of writing compilers, which I intended
at the time <wink>.  For everyone else, the higher level the better, lest
they give up in frustration the first week.

It's quite possible they never need to learn another language, in which case
Python is a fine place to stop as well as to start.  If they need to learn C
or C++ or Java or ... too, the maze of new rules and requirements will drive
them mad, *until* they learn something about how computers work internally.
Without learning the latter, the *purpose* of all those low-level
restrictions will never make good sense to them.  They may learn them by
rote, but they'll never achieve true skill before understanding the
machine-level purposes behind having 16-bit ints *and* 32-bit ints *and*
confusing pointers with arrays *and* confusing characters with little
integers *and* etc etc.  That stuff is all driven by what's easiest for the
*hardware* to do.  In Python's view, you simply don't need to worry about all
that at the start (or, if you're lucky, ever!).
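To make the contrast concrete, here's a minimal sketch (in any reasonably modern Python; back in 2001 plain ints silently promoted to longs, same net effect) of two of those hardware-driven worries simply not existing:

```python
# No 16-bit vs 32-bit int worries: Python ints grow as needed,
# so arithmetic never silently overflows.
big = 2 ** 100
assert big * big == 2 ** 200

# And characters are not "little integers" here: in C, 'a' + 1
# is 98; in Python it's a TypeError, caught and reported.
try:
    'a' + 1
    result = "allowed"
except TypeError:
    result = "refused"
assert result == "refused"
```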

BTW, Python is more strongly typed than C or C++:  Python *never* lets you
get away with a type-unsafe operation.  I expect you have in mind that Python
isn't *statically* typed (Python delays type checks until runtime).  Static
typing can also be hard to get used to, in large part because it's yet
another little sub-language with its own arcane rules to trip over.  If the
purpose of a first language is to teach people that programming languages are
a veritable minefield of inconsistent little gotchas, ya, Python is a rotten
first language <wink>.
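A quick sketch of the strong-vs-static distinction (the `add` function here is just a made-up illustration): Python attaches no declared types to names, but every operation still checks its operands' types when it actually runs -- nothing quietly reinterprets bits the way C will:

```python
def add(x, y):
    # No type declarations anywhere: the check happens at runtime,
    # each time '+' executes, but it *always* happens.
    return x + y

assert add(3, 4) == 7            # ints add
assert add("3", "4") == "34"     # strings concatenate

try:
    add("3", 4)                  # type-unsafe: refused at runtime
    caught = False
except TypeError:
    caught = True
assert caught
```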

let-'em-taste-success-today-and-tomorrow-is-time-enough-for-failure-ly
    y'rs  - tim