[Tutor] Python Interview Questions..

Alan Gauld alan.gauld at btinternet.com
Tue May 31 02:12:24 CEST 2011


"Steven D'Aprano" <steve at pearwood.info> wrote

>>>>> Java just isn't a hard enough language to separate great 
>>>>> programmers
>>>>> from plodders (neither is Python, for that matter) because 
>>>>> pointers
>>>>> and memory allocation are taken care of automagically.
>>>>
>>>> I fundamentally disagree with his stand on this.
>>>>
>>> Not sure what you're saying here Alan -- are you saying you 
>>> consider Java
>>> "hard enough language to separate great programmers from plodders"
>>
>> Yes, I'm saying the language just isn't that significant.
>
> Sorry Alan, you confuse me. Do you mean Java isn't that 
> *insignificant*?

I appear to have confused several folks :-)

I also have gone back and read Joel's article again and
although I still disagree with his insistence on pointers (and
recursion) as critical items I don't disagree with the general
drift of his article. I do think he over-rates pointers and recursion
though. Some of the greatest programmers I've worked with
come from a COBOL background with neither pointers nor
recursion in sight. But they could do stuff with file I/O that
would make your hair curl! And they knew every trick in the
book for processing data including tweaking database execution
plans and raw data file access tricks that most DBAs have
never dreamed of.

But the point I'm making is that being a great programmer is
about the ability, as Joel says, to think at multiple levels of
abstraction, but the machine memory level doesn't need to
be one of them.  (Recursion I'm prepared to allow since it's
a more generic skill and applicable to whole classes of
problems that are almost intractable without it - even if you
do have to unravel it later for performance or scalability
reasons.)
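To make that last point concrete, here's a minimal sketch (the names
and the nested-list structure are just illustrative, not from any
particular codebase): a naturally recursive walk over nested data,
then the same walk "unravelled" into a loop with an explicit stack,
as you might do later for scalability.

```python
def count_leaves(node):
    """Recursively count the non-list leaves in arbitrarily nested lists."""
    if not isinstance(node, list):
        return 1
    return sum(count_leaves(child) for child in node)

def count_leaves_iterative(node):
    """The same count, with the recursion replaced by an explicit work stack."""
    stack = [node]
    total = 0
    while stack:
        current = stack.pop()
        if isinstance(current, list):
            stack.extend(current)   # defer the children instead of recursing
        else:
            total += 1
    return total

tree = [1, [2, [3, 4]], [5]]
print(count_leaves(tree))            # 5
print(count_leaves_iterative(tree))  # 5
```

The recursive version states the problem directly; the iterative one
trades that clarity for freedom from the call-stack depth limit.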

And for that reason I have no issues with Java being used
as a teaching language any more than I have COBOL or
Fortran or BASIC. I don't like any of them for my personal
use but I've used all of them in anger (except Fortran) and
none of them offer any fundamental obstacle to me building
any algorithm I want; some just make it a little easier, that's
all. So they can use any language they like to teach stuff,
so long as they are teaching the right stuff. And that is
where many CS classes are failing - and, I think, what
Joel is really bemoaning - they don't actually teach CS
they teach "programming" in a particular language
(whichever it is).

There was a time when every programmer needed to be
aware of the resource usage of every bit of code, but those
days have long gone unless you are working on very small
embedded devices. The simple fact is that modern OS's, tools,
hardware and networks make those kinds of optimisations
premature at best and suboptimal at worst (many optimising
compilers can out-optimise most programmers given
straightforward code, but give them "optimised" code
and the end result is worse, not better!). In the few
cases where you need to optimise at machine level you can take
your time and learn as you go, or recruit an old-hand who
remembers that kind of thing (in the same way the old
hands have to recruit the new-blood to grasp web concepts
and declarative languages etc.)

Meanwhile, programmers are being asked to produce code
that is flexible and configurable more than efficient. It will
need to be highly maintainable because it will change
many times in its life (No room for "Mel" here) and it must
be done at minimum cost (software engineering not computer
science). A great programmer nowadays has to deliver on
a completely different set of demands than a great
programmer in the 70's or 80's. The goalposts have moved
and so must the standards by which we judge greatness.

There are common skills that are still needed. I'm just not
convinced that manipulating physical pointers is one of those
common skills (maintaining references OTOH is still valid
regardless of whether the reference is a memory pointer!),
or that it is the best way to teach those skills that are still
valid. Or even that they are the only way to teach multi-level
abstraction - how did my COBOL colleagues learn?
And is chasing an IBM ABEND any different from debugging
a segv?
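For what I mean by maintaining references without a memory pointer in
sight, a tiny Python sketch (purely illustrative): two names bound to
one object behave like two pointers to a single record, with no
address arithmetic anywhere.

```python
a = [1, 2, 3]
b = a              # b now references the same list object as a
b.append(4)        # mutate through one name...
print(a)           # [1, 2, 3, 4] -- ...and the change shows through the other
print(a is b)      # True: both names refer to the same object
```

Reasoning about that aliasing is the same skill as reasoning about
shared pointers, minus the segfaults.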

-- 
Alan Gauld
Author of the Learn to Program web site
http://www.alan-g.me.uk/



