RE: [Edu-sig] Interactive learning: Twenty years later
-----Original Message-----
From: Jason Cunliffe [mailto:jason.cunliffe@verizon.net]
Sent: Sat 6/28/2003 3:14 PM
To: Arthur; edu-sig@python.org
Cc:
Subject: Re: [Edu-sig] Interactive learning: Twenty years later
Another is the rise of the mouse as a computer device. People had the peculiar idea that one could deal with the world of learning purely by pointing.
hmm.. I don't think the idea of mice was that one could 'do everything by pointing'..
=============================
Of course not -- I'm sure you remember the initial development back in the 60s -- NASA/Ames? -- that created the mouse for one hand and the chord-player for the other. We were supposed to learn one-handed typing with a set of 5 or 6 double-detent keys so one hand would always be on the keyboard -- never happened! Somebody re-invents the chord player about every 10 years or so -- you can find a half-dozen for sale over the Internet for those with two-hand difficulties. The mouse never was intended to be paired with the standard keyboard.

Dick S.
Seabrook, Richard wrote:
-----Original Message-----
From: Jason Cunliffe [mailto:jason.cunliffe@verizon.net]
Sent: Sat 6/28/2003 3:14 PM
To: Arthur; edu-sig@python.org
Cc:
Subject: Re: [Edu-sig] Interactive learning: Twenty years later
Another is the rise of the mouse as a computer device. People had the peculiar idea that one could deal with the world of learning purely by pointing.
hmm.. I don't think the idea of mice was that one could 'do everything by pointing'..
=============================
Of course not -- I'm sure you remember the initial development back in the 60s -- NASA/Ames? -- that created the mouse for one hand and the chord-player for the other. We were supposed to learn one-handed typing with a set of 5 or 6 double-detent keys so one hand would always be on the keyboard -- never happened! Somebody re-invents the chord player about every 10 years or so -- you can find a half-dozen for sale over the Internet for those with two-hand difficulties. The mouse never was intended to be paired with the standard keyboard. Dick S.
I wouldn't say that. My IBM Thinkpad's Trackpoint is a very good combination that allows me to use a mouse (well, pointer) and keyboard without having to leave the home row or move my thumbs more than a quarter inch from the spacebar to click. I wish I could find a standard desktop keyboard with one (Lexmark used to make them, but they're no longer available as far as I know).

Bill
On Saturday 28 June 2003 10:21 pm, Bill Bradley wrote:
Seabrook, Richard wrote:
back in the 60s -- NASA/Ames? -- that created the mouse for one hand and the chord-player for the other. We were supposed to learn one-handed typing with a set of 5 or 6 double-detent keys so one hand would always be on the keyboard -- never happened!
I think the question about the "chord player" (never heard it called that, but then I'm probably looking at a derivative concept) is whether it is harder to learn to type using it or to learn how to type on a standard keyboard one-handed, which I do quite a lot (anybody who does a lot of CAD drawing will wind up doing this, and I also do it a lot while programming).

It's largely a question of availability -- AFAIK, the chord systems are not consistent, so you'd spend time learning each one, and they are not so widely available that the user can expect to find one on a computer they want to use (i.e. it's custom). You have to be really motivated to customize your computer to a non-standard input device. And I'm not sure the improvement over a standard keyboard is enough to motivate most people.

Of course, the caveat is that I've never tried one. Maybe if I tried one at a fair or something I'd get sold on it and want to buy one.
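The basic idea behind chording keyboards comes up a few times in this thread, so here's a toy sketch of how one works: each *combination* of simultaneously pressed keys maps to a character, which is how five or six keys can cover a whole alphabet. The 5-key names and chord assignments below are invented for illustration -- real devices (Engelbart's keyset, the Twiddler, etc.) each define their own mapping, which is exactly the inconsistency complained about above.

```python
# Toy chording-keyboard decoder. A "chord" is the set of keys held down
# at once; each distinct set maps to one character. With 5 keys there
# are 2**5 - 1 = 31 possible chords, enough for a basic alphabet.
CHORDS = {
    frozenset(["k1"]): "a",
    frozenset(["k2"]): "b",
    frozenset(["k1", "k2"]): "c",
    frozenset(["k1", "k3"]): "d",
    frozenset(["k1", "k2", "k3", "k4", "k5"]): " ",  # all keys = space
}

def decode_chord(pressed_keys):
    """Return the character for a set of simultaneously pressed keys."""
    return CHORDS.get(frozenset(pressed_keys), "?")

print(decode_chord(["k1", "k2"]))  # -> c
```

The training cost mentioned later in the thread is visible even here: unlike a QWERTY keyboard, nothing about the chord hints at the character it produces, so the whole table has to be memorized.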
I wouldn't say that. My IBM Thinkpad's Trackpoint is a very good combination that allows me to use a mouse (well, pointer) and keyboard without having to leave the home row or move my thumbs more than a quarter inch from the spacebar to click.
Yeah, the trackpoint concept is nice. I used to find it a little disconcerting, but I sometimes find myself preferring to do certain tasks on the laptop because of it.

I always snicker a little when people talk about "home keys" and such -- I guess you must've learned to touch type in a class? I learned it by typing on computer keyboards *a lot*. So I can actually type with either or both hands (faster with two, though), and if I'm in a very graphics-intensive task, I'll type with the left and mouse with the right. I go to two hands when typing email, though. You might say I don't do it "right", but I can indeed type without seeing the keyboard, so I guess it's still "touch typing". I'm not sure I should be proud of this. Actually, lately I've had to experiment with wrist-only mouse motion while holding a baby and typing left-handed even for email and stuff. Sheesh. ;-) Of course, now she's starting to "type" too, so I have to be extra careful.

I'm looking forward to trying out an actual tablet -- I remember wanting one of these before even mice were commercially available, and they're actually pretty affordable nowadays.

Another thing that really fascinates me is the 6-axis input devices like the "Space Ball" or "Space Puck" that came out a few years back. These are a really cool concept. I was actually trying to design something like this (hard) before I found them commercially available! Basically it's an object which you hold and can move through all 6 axes (XYZ/PYR) of translation and rotation. It'd be great for the "gripping hand" in a VR environment, or with a variable end-effector trigger as a robotic teleoperation control (really it would then have 7 axes). The downside is, I think they start at about $800 and I haven't got one. I was able to find enough information to write a driver for Linux though (and one may already exist). Someday maybe I'll try to find them again, if they're still available. :-P

VR gloves by comparison still seem very gimmicky and cumbersome to me. It's hard for me to imagine using them on a daily basis.

For me, the point of these fancy input devices is *not* to replace the role the keyboard plays in existing computer interactions so much as it is to make it possible to do new stuff with the computer that the text interface just can't handle. For a programmer, I'm a pretty visually oriented person, and these things are really interesting to me.

Cheers,
Terry

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks
http://www.anansispaceworks.com
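To make the "6 axes (XYZ/PYR)" idea concrete, here is a hedged sketch of what a driver for such a device mostly does: unpack one device report into six named axis values. The 12-byte packet layout (six signed 16-bit big-endian integers for X/Y/Z translation and pitch/yaw/roll) is purely an assumption for illustration -- real spaceball-class devices each define their own serial or USB protocol.

```python
import struct

# Hypothetical report format: six signed 16-bit big-endian values,
# one per axis, in a fixed order. This layout is invented for the
# example; consult the actual device protocol for a real driver.
AXES = ("tx", "ty", "tz", "pitch", "yaw", "roll")

def decode_report(packet: bytes) -> dict:
    """Unpack a 12-byte report into the six named axis values."""
    values = struct.unpack(">6h", packet)  # 6 x int16, big-endian
    return dict(zip(AXES, values))

# Simulate one report and decode it.
sample = struct.pack(">6h", 100, -50, 0, 10, -10, 5)
print(decode_report(sample))
```

The interesting part of a real driver is everything around this: reading reports off the serial port or USB endpoint, and mapping the six axes onto whatever the application treats as translation and rotation.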
Terry Hancock wrote: ...
I think the question about the "chord player" (never heard it called that, but then I'm probably looking at a derivative concept) is whether it is harder to learn to type using it or to learn how to type on a standard keyboard one-handed, which I do quite a lot (anybody who does a lot of CAD drawing will wind up doing this, and I also do it a lot while programming).
... There has been an enormous amount of research within the HCI literature into the question of multi-modal/multi-channel HCI design. Just to see *why* it's such a huge win to have more channels for communication, take Dragon Dictate and use it to dictate commands within AutoCAD. You find yourself becoming more productive, as you leave your pen tablet right where you're drawing, your other hand is basically free, and your eyes never *need* to leave the particular spot on which you're concentrating.

A chording keyboard tries to increase the communications bandwidth between user and computer by adding yet more functionality to the (already overworked) hand "channels". It's possible to train up to greater bandwidth with such a system, but it does require considerable training, and until then the channel has reduced bandwidth. Eye tracking, voice dictation, and the like leverage as-yet-unused channels, so they don't impact the existing channels (generally), and so tend to provide greater facility immediately.

BTW, the Navy's "put that there" research was fascinating, combining eye tracking, voice recognition and logic to figure out the user's current and previous conversational focus (another bandwidth-increasing mechanism, "context compression"). It was for very minimalist projects (command-and-control interfaces) but it did give you some idea of what could be done with enough channels open for communication.
I'm looking forward to trying out an actual tablet -- I remember wanting one of these before even mice were commercially available, and they're actually pretty affordable nowadays.
Recommended. I've been using them for years now; mice seem ridiculous when I'm forced to use them. Using sense-memory of where a particular scrollbar/button/menu is on the screen, so that you don't have to look at where you're pointing, can dramatically speed up your computing experience (and it doesn't interrupt your cognitive processes with a slow feedback loop for targeting the control).
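The sense-memory point rests on a real technical distinction worth spelling out: a tablet is an *absolute* pointing device (the same pen position always lands on the same pixel, so your hand can learn where things are), while a mouse is *relative* (it only applies deltas to wherever the pointer happens to be). A minimal sketch of the two mappings, with invented device and screen resolutions:

```python
# Absolute mapping (tablet-style): device coordinates map to fixed
# screen positions, so muscle memory of a pen position is muscle
# memory of a screen location. Resolutions here are illustrative.
def tablet_to_screen(dev_x, dev_y, dev_w=10000, dev_h=10000,
                     screen_w=1920, screen_h=1080):
    """Same pen position -> same pixel, every time."""
    return (dev_x * screen_w // dev_w, dev_y * screen_h // dev_h)

# Relative mapping (mouse-style): the result depends on where the
# pointer already was, so there is no fixed hand-to-screen mapping
# for muscle memory to latch onto.
def mouse_move(pointer, dx, dy):
    """New position = old position + delta."""
    return (pointer[0] + dx, pointer[1] + dy)

print(tablet_to_screen(5000, 5000))   # center of tablet -> center of screen
print(mouse_move((100, 100), 5, -3))  # depends on the (100, 100) start
```

With the relative mapping you need the visual feedback loop (look, move, correct) that the paragraph above describes; with the absolute one you can point "blind".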
Another thing that really fascinates me is the 6-axis input devices like the "Space Ball" or "Space Puck" that came out a few years back. These are a really cool concept. I was actually trying to design something like this (hard) before I found them commercially available! Basically it's an object which you hold and can move through all 6 axes (XYZ/PYR) of translation and rotation. It'd be great for the "gripping hand" in a VR environment, or with a variable end-effector trigger as a robotic teleoperation control (really it would then have 7 axes). The downside is, I think they start at about $800 and I haven't got one. I was able to find enough information to write a driver for Linux though (and one may already exist). Someday maybe I'll try to find them again, if they're still available. :-P
They're okay, but having worked in VR for 3 years, and studied it for quite a bit before that, the only time I ever found them *useful* (as distinct from cool) was in driving a VR camera around a VR play's set (with specialised software just for that task, and all other tasks mapped to a keyboard-driven interface so they wouldn't require the pointer). For modelling it's just so pointless to take your hand off the tablet or keyboard just to reach for the (honestly rather clumsy) thing.

Generalised VR and modelling apps are doing a lot more than simple 3D manipulation with the "pointer", and the spaceball implementations just aren't particularly good for regular mousing about in my experience (too joystick-like). I can imagine ways you could set up a console such that you could use sense-memory to get to the spaceball, but it just doesn't seem that critically important given the rather well-thought-out axis manipulator mechanisms you see in Maya (or 3DSMax, though they are clumsier in Max4), and the relatively minimal benefit the spaceball (in its current, rather crude, forms) has for precision modelling. Specialised applications seem to be where it'll stay until there's a compelling application developed.

Enjoy,
Mike

_______________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://members.rogers.com/mcfletch/
participants (4)
- Bill Bradley
- Mike C. Fletcher
- Seabrook, Richard
- Terry Hancock