Why aren't we all speaking LISP now?

Ken Latta klatta at conflict2000.com
Thu May 10 23:32:17 CEST 2001



"Evan Jones" <EvanJ at eyron.com> wrote in message
news:mailman.989414833.19676.python-list at python.org...
> What sort of teaching did the rest of you that took computer
> science courses get?
I started in high school with punch cards on a Univac 1108 running
Fortran II (this was 1966). Two years later I went to IIT (where I'd
taken the Fortran class); halfway into the semester they started to
decommission the Univac machine and replace it with an IBM 360/30.
To add insult to injury, they cut off academic use of the Univac
several weeks before the end of the semester so they could do final
runs of the business applications on the Univac before the switchover
to the IBM happened.

At IIT I got to work for a wild psych professor who built the
equivalent of microcomputers to control lab equipment by buying
individual logic circuit boards from a military surplus shop (yes,
we're talking a single flip-flop circuit on a card!). Reprogramming
was accomplished via a "sandwich board" which let you "store"
alternative wiring patterns, and since the wiring implemented your
algorithm.... Of course I also got familiar with paper tape readers
back then; duplicating a program segment and then cutting and pasting
(uh, cellophane-taping) those segments into strips of paper tape was
my first code reuse.

After that I went to the University of Michigan, which ran a home-brew
operating system, MTS (Michigan Terminal System), one of the excellent
timesharing systems that grew up in university environments (the MTS
consortium survived a few decades). So I got the big jump up to
DECwriters and LA26 typing terminals for entry and Fortran IV for
programming (followed closely by Waterloo Fortran IV, WATFOR).
Michigan was part of a statewide network called Merit which linked the
big three state universities (U of M, Michigan State and Wayne State)
and many others. (Footnote: we're talking mid-70s, a statewide
packet-switched network with local dial-in numbers over half the
state, built on PDP 11s and LSI 11s; a little PDP assembler is a good
thing, no?)

I soon discovered APL in several variants, including a transliterated
one at Michigan where you had to type $rh to represent the "rho"
character (which was less grief than getting the "true" character set
a few years later by prying the BIOS chip off an IBM PC monochrome
adapter to replace it with one from an APL software vendor)....

But just as I was getting into the differences between "legitimate"
languages, I discovered that there was lots of "useful work" being
done in the scripting and macro languages tucked away in stat
programs, tape drive controllers, utility software, etc. And there
were also "environments" like the computer conferencing systems (the
NSF sponsored the Microspan project, which engendered the mainframe
Confer system and the Picospan conferencing system that is still the
heart of "The Well").

So batch programs, config programs, editor programs.... Then I got
into printing and discovered that there was an "environment" that
combined programming, mathematics, device control and editing: TeX.
We had mainframe TeX and the early PC variants, and there were these
engineering types running Unix boxes with TeX and *LOTS* of other
tools. So for a few years (mid-80s) it looked like these different
environments might converge. I even had a Xerox Star that had one of
the PC-emulation cards in it. Oh yes, Smalltalk was seductive, but I
couldn't afford it on home or even work machines. But then these PCs
got out of control. Now we had some serious macro languages (many
based on Lisp concepts) running CAD programs, stat programs, data
extraction programs and a new activity called "networking."

Of course academically you were exposed to Pascal, since that became
the "standard" language for a few years before C took over. Indeed,
the early Macintosh toolbox routines were even called with Pascal
calling conventions instead of C's. Long before Windows, the micros
became a parallel universe of their own. 6502 assembler was what you
needed to make your Atari or Commodore do its thing, but all the
device-specific routines (like ANTIC graphics on the Atari) were
non-standard. But there was another glimmer on the horizon: I had a
copy of VisiCalc for the Atari 800. It too let you "program" the
functions in your spreadsheet.

And during the 6502 era lots of assembler tricks were learned to make
device control possible. Some of that source code is as opaque to me
today as ancient Greek. (Let's not even talk about looking at APL
code from the 80s.) Of course we had to relearn tricks to support
those monster 16-bit processors. Zenith was a Michigan-based company
at that time, so we bought a lot of Zenith PCs, which were really
*dual processor*: they had a Z80 running CP/M in addition to the 8088
running MS-DOS 1.1 (OK, 2.0 came before I ever booted 1.1). For a
while the combination of WordStar and dbII was enough to cobble a
business environment together, whether on the CP/M or MS-DOS side of
the box.

All in all, 35 years of computing have gone by fairly fast. But so
many of the problems we faced years ago remain. No magic bullets
(though I've read lots of advertising for them). Lots of "no
programming needed" systems that never caught on. But even an "old
hand" like me was suckered in by the April Fools Parrot
announcement....






