What is different with Python?

Terry Hancock hancock at anansispaceworks.com
Wed Jun 15 13:27:15 EDT 2005

On Tuesday 14 June 2005 02:12 pm, Andrew Dalke wrote:
> Teaching kids is different than teaching adults.  The
> latter can often take bigger steps and start from a
> sound understanding of logical and intuitive thought.
> "Simple" for an adult is different than for a child.

Of course, since children are vastly better at learning than
adults, perhaps adults are stupid to do this. ;-)

Quantum mechanics notwithstanding, I'm not sure there
is a "bottom", most-reduced level of understanding.  It's
certainly not clear that such a level is relevant to programming.

It is invariably true that a deeper understanding of the
technology you use will improve your ability to use it.
So, I have no doubt that knowing C (and the bits-and-bytes
approach to programming) will improve your performance
as a programmer if you started with Python.  Just as learning
assembler surely made me a better C programmer.

But I could write quite nice programs (nice enough for my
needs at the time) in BASIC, and I certainly can in Python.

> Sometimes, you just don't care if your algorithm is ideal, or
> if "it will slow to a crawl when you deliver it to the customer".

When did the "customer" get into this conversation? I mean,
you're a neophyte who just learned how to program, and you're
already flogging your services on the unsuspecting masses?

In my experience, people first learn to program for their own
needs, and it's only a long time later that somebody decides
they want to become a professional.  And maybe, when you're
a pro, knowledge of machine details is really important.

But an awful lot of people just want to putz around with the
computer (or more accurately just want to solve their own
problems using it).  Surely, they don't need to know anything
about quantum transitions, transistors, or malloc to do that.

In fact, I find such stuff introduces a lot of noise in my
thinking with Python.  A while back I wanted to write a
program that would take a title like "My fun trip to Europe,
and a thousand and one things I saw there"  and make a
mnemonic file-name less than 32 characters long out of it,
that obeyed simple naming conventions.  I wanted it to spit
something out like "trip_europe_1001_things", the way a
human might name such a file.  Well, first of all, with all
the clutter involved in processing strings, I would never
have tried this in C --- it would've taken me a month!  But
my real point follows ...
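For a sense of what such a squeezer might look like, here's a minimal
sketch under my own assumptions --- the stopword "hit list" and the
word-boundary truncation are guesses, since the actual program was
never posted:

```python
import re

# Hypothetical stopword hit list -- the real one was built from the
# word-frequency counts described below.
STOPWORDS = set("""a an and the to of in on at my i it is was
me saw there""".split())

def squeeze_title(title, max_len=32):
    """Squeeze a free-form title into a short mnemonic filename."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOPWORDS]
    name = ""
    for w in kept:
        candidate = (name + "_" + w) if name else w
        if len(candidate) > max_len:
            break               # stop at a whole-word boundary
        name = candidate
    return name
```

This squeezes the example title down to "fun_trip_europe_thousand_one";
getting "1001" out of "a thousand and one" would take extra
number-word handling, which is part of why there's no rigid spec
for a job like this.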

I used several different methods to go about squeezing down
a title to get rid of "less important" words.

For one, I wanted to know what the most common words were,
so I could make a hit list of words to delete from titles. I did this
using three or four titles from Project Gutenberg, and the Python
interpreter.  It took maybe fifteen minutes, and went something
like this:

s = (open('cia_factbook', 'r').read()
     + open('pride_prej', 'r').read()
     + open('alice', 'r').read())
words = s.split()
unique_words = {}
for word in words:
    unique_words[word] = unique_words.get(word, 0) + 1
word_freqs = unique_words.items()
# most common first (cmp(b, a) gives a descending sort)
word_freqs.sort(lambda a, b: cmp(b[1], a[1]))
for word, freq in word_freqs[:100]:
    print "%5d:  %s" % (freq, word)

This is, of course, totally brutal on my computer's memory allocation,
because the source data is quite large.  So, my C programming
instincts would've encouraged me to do all kinds of optimizing. But
why bother?  Like I said, it took 15 minutes.  Done.  And that includes
the programming time.  I took a completely naive approach and
it worked, so why should I bother making it hard for myself?

Sure, in the profession of writing commercial, off-the-shelf word counting
software to sell to "the customer", I should be truly ashamed, but who
cares?  I didn't even save this program to disk, I've just tried to rewrite
it from memory here.

The idea that one must learn assembler to learn C and in turn
to learn Python strikes me as elitist or protectionist --- just another
way to turn people away from programming.

You should learn that stuff when it becomes obvious that you need it,
not from the outset.  And for many people, it simply may never be
needed.  Python is actually remarkably good at solving things in a
nearly optimal way.

This program also had another interesting property -- it couldn't have
a rigid specification.  There's no way to write one; it has to succeed
intuitively, by producing output that is "mnemonic".   But there's no
way to unit test for that. ;-)   So no amount of "deep understanding of
what the machine is doing" would've really helped that much.

I think there's an awful lot of programming out there that is like that ---
the point is to solve a problem with appropriate *ideas*,
not to find the most efficient *methods*.  Often, it's not how
efficiently I can do a thing that interests me, but whether it can be
done at all.

I don't think total reductionism is particularly useful for this.  My
thermodynamics professor once argued that, never having seen any,
but knowing only the laws of physics and thermodynamics,
physicists would never have predicted the existence of *liquids*,
let alone oceans, nucleic acids, and life forms.  The formation of those
things is still far beyond us on first principles, and surely even if we
do come to an understanding of such things, the particular biology
of, say, flowering plants will be almost completely unaffected by such
knowledge.

Likewise, the creation and study of more complex ideas in software
is unlikely to benefit much from assembler-level understanding of
the machines on which it runs.  It's simply getting too close to the
problem --- assembly language problems have become so well
understood, that they are becoming a domain for automated solution,
which is what high-level programming is all about.

Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks  http://www.anansispaceworks.com
