Popular conceit about learning programming languages
Dennis Lee Bieber
wlfraed at ix.netcom.com
Mon Nov 25 01:01:07 EST 2002
Bengt Richter fed this fish to the penguins on Sunday 24 November 2002
04:30 pm:
> I've coded recursive algorithms using 1970's fortran, where all
> storage was statically allocated as far as the language and its
> implementation at the time was concerned. Fortran didn't give me the
> recursive concept to work with (nor to learn, if I had been ignorant
> of the concept). It didn't stop me, but no reasonably workable
> programming language can *stop* you, in theory at least. Drudgery is a
> real and practical obstacle though ;-)
As I recall, that was one of the assignments in my advanced FORTRAN
class (and in those days, we didn't even have FORTRAN 77 to work
with!). Essentially the conversion of recursive algorithms to iterative
structures.
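Not the original FORTRAN assignment, of course, but the general transformation is easy to sketch in Python: replace the implicit call stack with an explicit stack of pending work (the tree-sum function and the nested-list "tree" here are just illustrative).

```python
def tree_sum_recursive(node):
    # node is either an int or a (possibly nested) list of nodes
    if isinstance(node, int):
        return node
    return sum(tree_sum_recursive(child) for child in node)

def tree_sum_iterative(node):
    # The conversion: keep your own stack of pending nodes instead
    # of relying on the language's call stack -- the same trick one
    # used in a FORTRAN whose storage was all statically allocated.
    total, stack = 0, [node]
    while stack:
        current = stack.pop()
        if isinstance(current, int):
            total += current
        else:
            stack.extend(current)
    return total
```

Both give the same answer for, say, `[1, [2, 3], [[4], 5]]`; the iterative version just makes the bookkeeping visible.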
> I think you are reinforcing my point that the abstractions are the
> important things. Python is certainly not an exclusive source of
> useful abstractions to think with, but it makes them unusually
> accessible and fun to use concretely, and that makes it more likely
> that a programming newbie will learn them early on.
>
Agree that Python (and a few other languages) do offer much more
"high-level" concepts -- have you noticed most of those languages are
also traditionally implemented as interpreters vs true native
compilers? I think where we differ is that I feel one needs a grounding
in computer science theory to realize how to take language constructs
and extrapolate to the abstract underpinnings -- the stuff one gets
exposed to in courses on "data structures", "algorithms",
"language design" (not compiler design)... Classes that are independent
of any given language (for fun, try implementing a "hashed-head,
multiple linked list" algorithm using an early 70s BASIC that only
allowed for FOUR open data files at a time <G> -- in 25 years I've only
seen ONE place where that structure was used in a way that was visible
to a user: The AmigaOS filesystem).
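For what it's worth, here is a minimal Python sketch of my reading of that structure -- an array of hash heads, each anchoring a linked chain of colliding records (the class and method names are mine; the AmigaOS directory blocks differ in detail):

```python
class Node:
    """A record living on one of the hash-chain lists."""
    def __init__(self, key, value):
        self.key, self.value, self.next = key, value, None

class HashedHeadList:
    # An array of list heads, indexed by a hash of the key; each
    # head anchors a singly linked chain of records that hashed
    # to the same bucket.
    def __init__(self, nbuckets=8):
        self.heads = [None] * nbuckets

    def insert(self, key, value):
        i = hash(key) % len(self.heads)
        node = Node(key, value)
        node.next = self.heads[i]      # push onto the chain head
        self.heads[i] = node

    def find(self, key):
        node = self.heads[hash(key) % len(self.heads)]
        while node is not None:        # walk the chain
            if node.key == key:
                return node.value
            node = node.next
        return None
```

Trivial in Python; the fun in that old BASIC was faking the pointers with file records and the four-file limit.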
> you are presumably referring to getinterval, which operates on arrays
No, at least I don't remember that operator by name.
> or strings? That's defined to share with the original, AFAIK. Are you
> saying the program you fixed made the wrong assumption, or are you
> saying PostScript should make copies? That's a change, if that's so
> now, I think. Is there another operator name that does that (I
> wouldn't think they'd want to break backward compatibility with old
> level 1 printers)?
>
The wordprocessor preamble files would have worked perfectly with a
PostScript printer. What I had could be considered very rudimentary
preludes to ghostscript; these programs supposedly interpreted
PostScript and generated bitmaps to be sent to the Amiga printer
drivers (which only handled ASCII text OR a graphic image which they
could scale to the printer page -- the wordprocessor native graphic
output was at screen resolution, 72DPI, which was terrible on anything
more modern than an MX80 printer). The problem is that the people who
implemented these (SaxonScript was the second one I had) did /not/
follow the full PostScript specifications -- they took simplifications,
like copying substrings to new memory rather than tracking pointers
into the original string -- with the result that the sneaky stuff done
in some wordprocessor preamble files would fail. I had to modify the
preambles to work with the problems in the interpreters.
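The share-versus-copy distinction those interpreters got wrong is easy to demonstrate in Python -- an analogy only, not PostScript semantics: a memoryview behaves like a substring sharing the original's storage, while a plain slice is the independent copy the buggy interpreters made.

```python
original = bytearray(b"Hello, PostScript")

shared = memoryview(original)[7:17]  # shares the original's storage
copied = bytes(original[7:17])       # an independent copy

# Mutate the underlying bytes (same length, so no resize needed):
original[7:17] = b"GHOSTSCRIP"

assert bytes(shared) == b"GHOSTSCRIP"  # the view sees the change
assert copied == b"PostScript"         # the copy does not
```

A preamble relying on the shared behavior breaks silently on an interpreter that copies -- which is exactly what I had to patch around.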
> That's the pleasant part of having broad experience. But if you
> totally ignore the constraints of the language you are about to be
> forced by circumstance to use, you may get into expensive work against
> the grain (not to mention your gag reflex ;-).
>
<heh> Are you describing my assignment of two years ago? The design
was valid, the implementation /did/ work in the end... But VMS ASTs are
/not/ a good substitute for a POSIX threading implementation...
> If you then reconsider, and think using the subset of abstractions
> easily implementable or already available in the language to be used,
> then I would call this pragmatically restricted thinking "thinking in"
> the particular language. Of course languages overlap, so some common
> denominator stuff may not suggest that thinking with it is "in" any
> particular language.
>
Well, at one level (I've always considered myself one step below that,
based on the way my former employer split job titles) you don't know
what language will be used to implement a design (our "chief system
engineers" knew very little, if anything, about coding -- one of them
was using something on par with late '60s BASIC!) Granted, they usually
obtained advice from the peons, but the actual system design had no
language-specific concessions. We peons then had to take this approved
design and develop the software design for it -- and may have had
little, if any, choice of language.
--
> ============================================================== <
> wlfraed at ix.netcom.com | Wulfraed Dennis Lee Bieber KD6MOG <
> wulfraed at dm.net | Bestiaria Support Staff <
> ============================================================== <
> Bestiaria Home Page: http://www.beastie.dm.net/ <
> Home Page: http://www.dm.net/~wulfraed/ <