[Edu-sig] As We May Think: What will we automate?

kirby urner kirby.urner at gmail.com
Tue Mar 24 18:31:38 CET 2009


On Mon, Mar 23, 2009 at 3:11 PM, Edward Cherlin <echerlin at gmail.com> wrote:
> On Mon, Mar 23, 2009 at 2:38 AM, kirby urner <kirby.urner at gmail.com> wrote:
>> On Mon, Mar 23, 2009 at 12:42 AM, Edward Cherlin <echerlin at gmail.com> wrote:
>
>> OK, but I think we agree APL is a professional language that gets
>> plenty of respect from the professional community.
>
> APL is on the boundary. It certainly gets no respect from C/C++
> programmers, and almost none in academia. Professional APL programmers
> certainly exist, and are almost universally treated as "APL bigots".
>

Interesting.  When I showed up at Princeton in 1976, they had APL
terminals scattered around campus, including in the dorms (e.g.
Princeton Inn, where I lived freshman year, roommates with Dr.
Sonnenfeld, now a "professor of lightning" at NM Tech, formerly a disk
drive engineer for Maxtor, who came from a relevant background in
electron tunneling microscopy -- similar 'needle meets platen'
concerns).

Anyway, APL was clearly held in high esteem by the computer science
gurus of that day, Iverson being something of a cult figure at IBM,
as you indicate.

I read a lot of 360/370 manuals in my free time, always glad to
indulge my curiosity in a well-endowed school -- what I was there for,
after all.

I programmed Battleship (the call-out-the-coordinates game I used to
play with my sister in Italy), then read the APL code over the phone
to my friend Glenn Baker at Brown (who later became a TV
director/producer and heads the Cuban program at the Center for
Defense Information) -- probably one of the lower baud rates on record
("paren paren, box, arrow, x, paren, paren"...).  But then APL does so
much with so little -- a Dymaxion computer language for sure (smile).

The guy was a genius, no question.  J is just as weird.

>>>> <rant>
>>>>
>>>> This is what I most disagreed with in Keith's remarks.  I don't think
>>>> all those centuries upon centuries of encoding our thinking in lexical
>>>> constructs, grammars, is suddenly on the verge of evaporating into
>>>> some drag and drop cartoon-like haze.
>
> +1
>
>>>> Rushing to shield children from the supposedly harmful effects of
>>>> lexical coding, while on the other hand stressing reading books and
>>>> imagining while so doing (i.e. not needing those pictures "as a
>>>> crutch"), is to work at cross purposes in a self defeating pattern.
>
> Our idea is not shielding children permanently, but only in a certain
> introductory phase. It is an advantage at the very beginning that you
> cannot make a syntax error in Turtle Art. Then after you understand
> program structure, variables, and such, we can introduce syntax. It's
> what Edsger Dijkstra called "separation of concerns", which is
> completely in opposition to "throwing the baby out with the
> bathwater".
>
>>> I assume that this rant is directed at my Turtle Art proposal.
> [snip]
>
>> Not really directed at Turtle Art proposal no.
>>
>> I think it's more I get this feeling when in the Squeak and Smalltalk
>> world, that there's a backlash against lexical coding as that means
>> typing (as in typewriter, keyboard) and little children aren't so good
>> at that, ergo need like affirmative action where we use eye candy
>> approaches, "friendly" peripherals, and let them conceptualize these
>> "like programs" in a dream-like Disney-esque environment.
>
> Typing is a consideration. Syntax errors are a consideration. I don't
> care about eye candy or Disney. Do you have a beef with Affirmative
> Action? I have seen it done right and done wrong. I'm against doing it
> wrong, but not against it itself.

I think I'm trying for "right now realism" with students and wanting
to minimize any "pie in the sky" aspects.

When I was smaller, I heard adults always talking about the future
we'd inherit.  It had a passing-the-buck flavor; not that I didn't
want to inherit a future, mind you (I did, still do), but there was
always this wistful "could be doing" angle, whereas what was really
happening was dismal and dreary (lots of carpet bombing for backdrop
is how I grew up -- people freaking out everywhere you looked,
including in the Oval Office -- much better now).

So right now, programming at an adult level, when not using a "game
show" system like Alice, is very stark and austere.  Syntax matters.
It's still somewhat hard to do.

A lot of these kids, coming from hours of TV with its fast cuts and
high bandwidth, experience school as a screeching, horrific slowness:
adults slam on the brakes and do this very low-bandwidth thing at the
front of the room called "teaching", which is so *not* like Bill Nye
the Science Guy, who runs ya through 10K bits worth of science in like
5 milliseconds, whereas this not-on-TV teacher is still saying "the".

I don't want my students to think I'm trying to slow them down too
much -- I encourage them to watch our geometry YouTubes, roll their
own -- but when it comes to computer programming, I don't want to
indulge unrealistic fantasies of instant gratification, i.e. that
after 10 minutes at the keyboard you'll be pumping out a movie like
'Shrek' or 'Cars' ("how about 10 hours?" they ask me).

And yes, we're getting closer, if you're happy to work with canned
characters you don't have to design yourself, and preset motions, in
which case a Python API of sim.dothis(args) and sim.dothat(args) makes
plenty of sense.  I was able to program this movie in a few minutes,
just using a few keystrokes:

http://worldgame.blogspot.com/2009/02/regarding-objectifying.html
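
Something like this, to give the flavor -- a minimal sketch, where
the Actor class and the method names are all invented for
illustration, not an actual package:

    # Hypothetical high-level movie API: canned characters, preset
    # motions, a few keystrokes' worth of script.

    class Actor:
        """A canned character you don't have to design yourself."""
        def __init__(self, name):
            self.name = name
            self.script = []          # accumulated actions, in order

        def walk(self, steps):
            self.script.append("%s walks %d steps" % (self.name, steps))
            return self               # allow chaining: dothis().dothat()

        def spin(self, turns=1):
            self.script.append("%s spins %d time(s)" % (self.name, turns))
            return self

    def render(*actors):
        """Pretend renderer: print the combined shooting script."""
        for actor in actors:
            for action in actor.script:
                print(action)

    ogre = Actor("Ogre")              # preset character
    ogre.walk(10).spin(2)             # sim.dothis(args), sim.dothat(args)
    render(ogre)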

Putting it another way, you could say my bias is to have zero interest
in what we might be doing five years from now.  I'm interested in
education reforms we might institute within the next 24-48 hours.
Rolling a projector into the room and starting to show Python,
starting to teach "boring math" with these new tools, is within range
of many a Portland classroom, not in five years, but tomorrow (except
it's spring break, so yawn, not interesting).

Here's an excerpt from my blog from a few years back, just to give
more of the outlook:

"""
An important point to emphasize with a new generation of
computer-savvy student, wowed by eye candy, is that even if the
graphics are glitzy, the source code behind them is still text-based.
In other words, the programmer's world still looks more like free
verse poetry than Saturday morning cartoons.

One way to show this is to start with a VRML graphic, such as the
5-frequency icosasphere in the right window above, let students
interact with it, using interface controls (Cortona's in this shot),
then switch to the left window, where a scrolled session demonstrates
a corresponding text-based API. What bridges left and right is a lot
of scripting language, in this case Python. These scripts take user
parameters and generate the underlying .wrl file (world file), which
the VRML browser then processes and displays.

Likewise, even where the effective use of such scripts is to valve
electrons on a motherboard, in order to make state changes to a hard
drive, the engineers who designed this multiple layering of hardware
and software used a lot of human-readable text in the process. We're
still firmly anchored to what we most need: comprehensible readings
(even if highly technical sometimes).

The moral of the story is, contrary to outward appearances, the
culture is not moving away from reading, i.e. the process of
eye-balling text to extract meaning. The right brain is getting more
of a workout, what with all the visual cortex stimulation, but so is
the left. So the old balance between left and right is still
recognizable.
"""

[ http://worldgame.blogspot.com/2005/01/text-drives-graphics.html ]
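
The scripting step described in that excerpt might look something like
the sketch below -- user parameters in, a .wrl world file out.  The
Sphere node is a stand-in for the real icosasphere generator, just to
show the text-behind-the-graphics point:

    # Minimal sketch: take user parameters, generate the underlying
    # .wrl file, which a VRML browser then processes and displays.

    VRML_TEMPLATE = """#VRML V2.0 utf8
    Shape {
      appearance Appearance {
        material Material { diffuseColor %(r)s %(g)s %(b)s }
      }
      geometry Sphere { radius %(radius)s }
    }
    """

    def write_world(filename, radius=1.0, color=(0.2, 0.5, 0.9)):
        """Write a tiny VRML world from user parameters."""
        r, g, b = color
        with open(filename, "w") as f:
            f.write(VRML_TEMPLATE % dict(r=r, g=g, b=b, radius=radius))

    write_world("world.wrl", radius=5.0)   # then view it in Cortona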

>
>> I'm skeptical of this tendency to over-romanticize the "child prodigy
>> programmer" as if we should be spending big GNP on overcoming any need
>> for a keyboard,
>
> Non sequitur. I'm interested in how early we can introduce programming
> concepts to all children. First grade? Preschool, using iconic tiles?
> But two-year-olds can learn to read and type, as demonstrated by Omar
> Khayyam [sic] Moore in the 1960s. It doesn't have to be an either/or.
>

Somewhat a non sequitur I agree, but I do get these hand-wringing
parents with some geekazoid for a kid, wondering if they should be
sending him to "a special school" where presumably they have LCDs
bolted to the bottom of bunk beds, computer camps where you can use
your mouse to order a breakfast of champions.

Like this 8-year-old at Winterhaven (our geek Hogwarts): he had his
Thinkpad partitioned for five operating systems, and had figured out
how to host telnet and install PuTTY on an NT box at the school,
patching in over the subnet.  3rd grader.

My attitude is kids this age deserve a normal childhood playing with
friends, developing their own personal coordination skills.  All in
good time in other words.  A child that self motivated will find ways
to keep honing those skills, with encouragement, but now that we have
the Internet, there's really no need for that "special skool".

You can develop that Google Appengine from the comfort of your own
bedroom, in collaboration with friends.

>> so Johnny Neutron can just "program in his mind's
>> eye".  Similar to my rant against chip implants, trying to bypass eyes
>> and fingers even when these are working OK.  Even though this
>> technology is still somewhat far off (in terms of the Hollywood
>> versions) I think it sends the wrong message to even hype them as
>> desirable, as if we'd all be happier if this were Wall*e world
>> (everyone on their backs with a big gulp, thinking in Smalltalk).
>
> Wow, that's some real rubbish you're rubbishing, ya.

Yeah, but you'd be amazed how many people want $10 million so they can
come up with the "interface of tomorrow" in which we cleverly bypass
all that biological equipment that represents an astronomical R&D
budget "because we can" (or think we can).

Prosthetics are one thing, but suggesting we all cut off our arms to
be outfitted with something made by Mattel -- why do this consciously?
(i.e. let's bring those troops home).

I think DARPA should send anything "borg like" to the shredder, i.e.
that "soldier of tomorrow" scifi has just ruined a lot of engineering
shops, turned CTOs into numbskulls.  Put a big sign over the door:  No
Borg!  Forget about it!  Then maybe the grant proposals would improve?

>
>> You're not in that camp, or at least not with this project, is what I
>> hear you saying.  But that brings up the question of what age
>> range you're aiming at and do they know how to type.
>
> 2+, eventually. Maybe. We can teach them.
>
>> So again, I'm not a detractor w/r to Turtle Art, hope you won't think
>> of me that way.  If you're getting across tree structures, that's all
>> to the better I'm thinking, because of the DOM and XML more generally.
>>  XML is a tree structure, in terms of nodes having children and not
>> having more than one direct parent.
>
> XML is, to me, just LISP with named parentheses and nice pretty-printers.
>
>> In the old days, when we'd teach grammar, including Latin grammar
>> (because it's hard, consistent, and detailed -- was the theory, this
>> is before my time), students would visualize parsing, these parts of
>> speech.
>
> I did lots of diagramming sentences as tree structures on paper and blackboard.
>

We also like networks, just as much as trees.

A polyhedron is a network (topologically speaking, the way we teach
'em).  We might start with Euler and those Königsberg bridges, then
try drawing those houses with one pen stroke, not overtracing any line
(remember that one? -- can't do a double-house).
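
That one-stroke house puzzle is pure network math, by the way --
Euler's rule says a connected figure is traceable without lifting the
pen iff it has zero or two odd-degree corners.  A quick sketch (vertex
names invented, Python since that's our language here):

    # The classic house: square ABCD, both diagonals, roof peak E.
    from collections import defaultdict

    house = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"),
             ("A", "C"), ("B", "D"), ("C", "E"), ("D", "E")]

    def odd_corners(edges):
        degree = defaultdict(int)
        for v, w in edges:
            degree[v] += 1
            degree[w] += 1
        return [v for v in degree if degree[v] % 2 == 1]

    odd = odd_corners(house)
    print(odd)                   # two odd corners: A and B
    print(len(odd) in (0, 2))    # True -> one pen stroke works; the
                                 # double-house has more odd corners,
                                 # so it can't be done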

>> << SNIP >>
>>
>>>> I sense countervailing temperaments at work.  Some people just feel
>>>> better when there's this sense of a unified substrate, a primitive
>>>> simple beginning.  Others are less monotheistic in their proclivities
>>>> and don't mind a more heterogeneous "toolbox" model.  You get down to
>>>> a bunch of very different things, working in complement.  But there
>>>> you have it:  "everything is a thing" in basic English (a useful
>>>> design pattern, even though ridiculously misleading).
>>>
>>> There are quite other motives at work. One set is that some people are
>>> willing to take on a greater memory load than others.
>>>
>>
>> I share your interest in using memory efficiently.  It's just that I
>> like where Python "begins" which is "in the middle" with hash table,
>> array / list, class itself (a kind of data struct w/ behavior),
>> already built, ready to rumba.  Problem I have with Scheme, as a
>> starter language, is you either need to import, or code from scratch,
>> these kinds of data structures that aren't primitive, but can easily
>> be built (again and again, seemingly endlessly).
>
> Other LISPs have arrays and suchlike.
>

Yeah, I'd like to watch a few DVDs on LISP, with good symbol
animations taking me through it, like on ShowMeDo.  I might code along
on my laptop, or just do email, watching from the corner of my eye.
I've done some LISP, sure, but only for the hard fun of it.

At Princeton, I was mostly too busy reading Wittgenstein, and logical
languages that aren't designed to machine execute.

>> I just get impatient with low level languages I think.  I've usually
>> had the luxury of high level objects, even as primitives.  Call me
>> spoiled in that way.
>
> There is a place for low-level languages like pure LISP and FORTH.
> They make great language-building toolkits, if you need different
> abstractions than the ones provided, and don't want to carry around
> the baggage of Smalltalk when you don't need it all.
>

Absolutely there's a place.  In talking about my own "impatience" I'm
not trying to make a virtue out of holes in my skill set.  I'm just
recognizing my own mortality and limitations.  I can't square dance
either, at least not well.

>> When it comes to teaching mathematics with Python, the high level
>> works in my favor.  I don't necessarily want to spend much time
>> talking about chip internals, registers.  This isn't computer science
>> or electrical engineering.  We want to spend most our time thinking
>> about Polyhedra, Vectors etc.
>
> As in NumPy and SciPy, yes. But there are languages for chip people,
> including AHPL, an APL dialect that compiles to wiring lists. Iverson
> invented APL for describing computers, and co-wrote the paper giving a
> complete, theoretically executable definition of the 360 architecture.
> One of the first software emulators ever.
>

Yes, amazing what APL can express.

>> Yes, these are expressed as Python objects, but no, we don't feel
>> we're learning to become "professional programmers" as our goal here
>> is to get clear on these Polyhedra and Vectors.  That might be for
>> theater work, plus something about airplanes.
>
> +1
>
>>>>> "I invented Object-Oriented Programming, and C++ is not what I had in
>>>>> mind."--Alan Kay
>
>> Yes, M has a niche market in health care (where it started, where it
>> dies), but from what I've seen up close, the health care sector is
>> paying a high price for keeping MUMPS alive well past its prime.
>> There's maybe a difference between a "dead language" (e.g. FORTRAN,
>> still the basis for lotsa libraries) and a "zombie language" (e.g. M).
>
> You may prefer OpenMRS from Partners in Health.
>

I'm thinking the legal medical record is maybe headed for
"schemaless" document storage such as CouchDB (Erlang) and/or Tokyo
Cabinet and/or... whereas the research registries such as I deal with
(clinical research records != legal medical records) will stay in SQL
engines.  Registries are far less amorphous, whereas the "generic
medical record" is hellacious in any RDBMS approach -- so much could
go wrong, you'd need a table for everything.
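
To make the contrast concrete -- a sketch with invented field names,
using only the standard library (CouchDB and Tokyo Cabinet would each
have their own client APIs):

    import json, sqlite3

    # "Generic medical record" style: schemaless document, any shape,
    # stored as-is -- no table-for-everything.
    visit = {"patient": "p-001",
             "notes": "post-op check",
             "vitals": {"bp": "120/80", "pulse": 72},
             "attachments": ["xray-3.png"]}    # fields vary per record
    doc = json.dumps(visit)

    # Research registry style: narrow, well-understood columns, so a
    # plain SQL engine is the right fit.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE registry (patient TEXT, measure TEXT, value REAL)")
    db.execute("INSERT INTO registry VALUES ('p-001', 'pulse', 72)")
    print(db.execute("SELECT * FROM registry").fetchall())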

Reducing health care costs is all about government funding of
FOSS-based solutions that remain FOSS, i.e. today hospitals are
treated as cash cows by closed source vendors who hog all the MUMPS to
themselves (just makes 'em mad cows).  If hospitals got together the
way we do around FOSS, with stimulus from Uncle Sam or other
government, they could start making some real headway.  The main
barrier is the public not understanding that FOSS sometimes means the
best engineering money can't buy, and that "open source" doesn't mean
"less security" -- a very common misconception, whereas the opposite
is much closer to true (if it's not open source, don't trust it with
your life).

My proposal for CRRs is to just use a state-of-the-art MVC framework
like Django or Rails -- why not?  They've already got web page
designers with CSS skills.  The IT department knows SQL.  All they
need to do is migrate the data and kiss MUMPS goodbye (easier said
than done, I realize, but it's a 24-48 hour kind of thing to get
started, so of interest at least).
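
For flavor, here's roughly what one registry table might look like as
a Django model (model and field names invented for illustration; the
ORM then emits the CREATE TABLE the IT department already knows how to
read):

    from django.db import models

    class RegistryEntry(models.Model):
        patient_id = models.CharField(max_length=20)  # de-identified key
        measure = models.CharField(max_length=50)     # e.g. "pulse"
        value = models.FloatField()
        recorded = models.DateField()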

>> From interviews I've done, and from source code I've eyeballed, I'd
>> say M seems a crazy-making language, not at all a pleasure to use, and
>> in the hospital where I've seen it at work, it actually breaks in
>> serious ways, the equivalent of a primary key getting corrupted.  The
>> product built around it is unwieldy and deficient.
>
> The original point of MUMPS was that doctors could program medical
> informatics themselves. Apparently they are now discouraged from
> learning the language and doing so.

What happened is some doctors got started down this road, then
professional coders took over as the code base got big and user needs
multiplied, i.e. some of the best doctor-begun startups went
commercial, at which point the docs got a pat on the back and computer
scientists came in to run the place.

We're back to square one in some ways, as a next generation of skilled
heart surgeon eyeballs Python and thinks "hey, this ain't so hard"
(SQL isn't either).  But then they think: "wait, I'm a doctor, trained
to cure sick people -- I can appreciate and understand this stuff, but
I think I'll let the hospital IT department work with outcomes
research groups on actual implementation.  I'll be welcome at meetings
though, will give plenty of input, and isn't it cool that I really
understand what they're talking about!"

>
>> Along similar lines, I get frustrated with how SQL isn't really that
>> hard, in a simple workaday sense, where you just want some simple
>> joins, maybe based on views.  Yet Microsoft Access layers it with eye
>> candy while making the SQL itself rather thorny, hard to eyeball.
>
> QBE, a 2D visual system, is mathematically stronger than SQL.
>

I'll have to take a look.  If it's a 24 hour type of thing...

>> That's sort of the same thing I worry about with trying to give kids
>> what appear to be training wheels on training wheels on training
>> wheels -- these Dr. Seuss contraptions all designed to make up for the
>> fact that maybe they're too young to be worrying about programming?
>
> Programming is one of the most powerful ideas around for making sense
> of the world around you. Everything in human life is governed by some
> complicated combination of rules and customs. Programming makes better
> sense out of most of the rules, and some of the customs. When you
> reach the limits of programming, you know something important about
> what remains.
>
>> When you're still in a little body, the priority is maybe to get
>> outside doing some real activities, not virtual ones.  A lot of that
>> empathetic conceptualization you'll want to draw on later, isn't
>> developed just sitting in front of a computer all day.
>
> Not an either/or. Some time outside, some time in.
>

Right.  Picture it: when you're hiking around in the Columbia Gorge
you might have a dodeca-cam on your back, streaming to the hard drive
in your backpack.  Back at the school, computers will stitch together
those seamless Google Street View type panoramas, ready for the school
intranet and web pages.  We could do this within the next 48 hours,
equipment-wise.  The dodeca-cam company is just a few blocks from
here.  But oh yeah, it's spring break and I'm supposed to leave for
Chicago in a few hours.

>> Let's just say I'm leery of "contraptions" which insulate the "end
>> user" from writing any code, and yet under the surface write reams and
>> reams of the stuff, which at some point everyone ceases to fully
>> understand...  that's the opposite of "open source" in a way, as
>> you're really just hiding the internals, keeping "end users" from
>> really seeing behind the scenes...
>
> You can do that. We in Turtle Art and Smalltalk don't.
>

In Visual FoxPro, you don't see any reams of code, i.e. the Form with
the widgets exposes all these pockets where you can put your code, but
the Form itself stays a black box with a well-defined API.  You could
think of it as C code.  That's pretty much how it works in Python as
well.  You don't eyeball the actual code for the actual widget,
written in C++.  The widget has an API, is an object, and you use it
as such (syntactically it's an object, Python sharing Smalltalk's
"everything is an object" way of thinking).

A lot of FoxPro people besides me made the leap to Python awhile back
-- there's Dabo, for example.  I think a lot of it has to do with the
Xbase "dot prompt", i.e. we've always worked in a shell, had that
immediate APL-like interactivity.  Python is friendly in that way.
BASIC and Visual Basic are far less friendly.  It's hard to use
Microsoft Access in "shell mode" from a "prompt".  This slows the
learning curve.

I don't think Python would have anything close to the visibility it
has today, if it hadn't been for IDLE being included.  That was the
"main battery" for so many of us.

>> Bottom line, I guess I worry about using electronics to pander to the
>> imagination in ways that undervalue developing physical coordination
>> skills.  I see how television has done this.  The younger they are,
>> the more I want to add to the motor component (gross and fine motor
>> skills).
>>
>> Again, this isn't directed against any of your projects.  I'm just
>> investigating my own biases for the record, keying off the fact that
>> MUMPS is oppressive, and that too many of my coworkers have atrophied
>> and/or undeveloped SQL and/or database skills because the so-called
>> "computer professionals" have taken a "spoil the end user" approach,
>
> There is nothing in software that I detest more than developers
> telling me what I want. I propose to create a roadmap that will allow
> children to choose where to go without misleading them. Not the
> Microsoft "Where do you want to go today?" bunk.
>

My paradigm "client" is a student bored by math class, thinking math
will never be for her, and yet there's no opportunity to try the
"object oriented" approach, even though computers with IDLE are just
down the hall.  Here's a future designing engineer getting turned off,
with a possible solution ridiculously proximal, yet not implemented.
This same kid, looking back and realizing what was kept from her,
might think "criminal malpractice".
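
The kind of five-minute IDLE session I have in mind: "boring math"
as objects you can poke at interactively (a bare-bones Vector, just
for flavor):

    class Vector:
        def __init__(self, x, y, z):
            self.xyz = (x, y, z)

        def __add__(self, other):
            return Vector(*[a + b for a, b in zip(self.xyz, other.xyz)])

        def length(self):
            return sum(c * c for c in self.xyz) ** 0.5

        def __repr__(self):
            return "Vector%s" % (self.xyz,)

    v = Vector(3, 4, 0)
    print(v.length())        # 5.0 -- Pythagoras without the worksheet
    print(v + Vector(1, 1, 1))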

Now of course it's true that it'll be possible to stay bored even with
the switch to using a real computer language in math class.  However,
having done quite a few experiments in the field, I can say this is a
definite upgrade in any case.  We'll lose far fewer potential recruits
if we take this route.  I'm happy to work closely with teachers
willing to do this today, with the kids we have now.  If you're
talking "in five years" I think that's a cop out, at least in Portland
it is.  Not meaning you personally -- talking about school
administrators here.

>> which I regard as antithetical to FOSS.
>
> +1
>
>> Also, on a more positive note, I think cartoon-like approaches
>> designed to be instructive around particular concepts, are quite
>> suitable for *adults* as well as kids.  In other words, once you *can*
>> do lexical programming and *do* have lots of physical coordination,
>> *then* it's a good time to get a next level of training, with surreal
>> cartoons (like manga) your imaginary friend (good way to learn some
>> computer science).
>
> And I think that visual programming is good from the beginning. We shall see.
>

Not holding my breath though.  It's time to stop waiting for any "end
of the rainbow".  Now is the time, not then.

>> It's a lot about *sequence* (the way topics get ordered).  Some of the
>> stuff we aim at children we might do better adapting for adults.  Let's
>> have more Wii in senior centers, along with more FOSS (as we learned
>> at the last OSCON, using FOSS to bridge grandchildren and grandparents
>> is a good way to go).**
>
> We will have to test these various approaches, and adopt what works.
>

Yes.  And we've been doing that.  Fortunately, it's all running in
parallel and doesn't bottleneck in one person.  Workarounds are the
name of the game, as for every person willing to test something new,
you have a lot more who just want to go through those motions (the
same ones as yesterday), no matter how ill-advised.

Kirby

> --
> Silent Thunder (默雷/धर्ममेघशब्दगर्ज/دھرممیگھشبدگر ج) is my name
> And Children are my nation.
> The Cosmos is my dwelling place, The Truth my destination.
> http://earthtreasury.net/ (Edward Mokurai Cherlin)
>

