Just ran into Kevin Altis...
Kevin is the main force behind PythonCard, which adapts the wxPython API to make it even more sane. His product features an OpenGL window, but it's just a pass-through to the wx OpenGL window; I don't think PythonCard has a bunch of OpenGL-specific classes. But these could always be added to a custom version (PythonCard for Geometers, say).

I ran into Kevin at Zupan's, the local food market, just now. I was buying provisions for a BBQ (about to happen). Kevin reminded me I haven't submitted any talk to OSCON this year, and now the deadline has passed. I'll probably be a background figure in that event this year, given we're both still in Portland.

Kirby
Kirby-

Thanks for the (indirect) reminder about this. Portland does sound like a happening place. :-)
http://www.unconventionalideas.com/walkability.html
http://www.unconventionalideas.com/eastbank.html

To run the version in Debian to see what PythonCard-ers have been up to recently, I just now had to upgrade from wx 2.4 to wx 2.6, and that also fixed my flicker problem. :-) So, proof of concept fourteen, with text morphs and no flicker under wx GTK 2.6:
http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/proof_of_concept_01...

Anyway, I am confronted with a series of choice points and paths as I look at how to build a Self-like inspector (which needs GUI widgets including a text editor):
* 3D right now?
* How to put up interactive widgets (custom vs. wx, tk, pyGame, GLUT, other)?
* Whether to keep the programming widgets in the same world window as the application or have them in another window (and/or world) or even another VM interacting over the network?
Likely there is some deeply Pythonic (glue-ish and platform agnostic) way to approach these decisions I don't see yet.

Still, I feel confronted by the biggest fork in the road -- focusing on the Squeak/Self notion of a consistent world of objects to present to novices and experts alike, manipulated in a consistent way (if you drink the Koolaid :-), versus an eclectic, throw-whatever-paradigm-and-tool-of-the-moment-at-them approach that gets a specific task done (requiring a lot of glue sniffing :-). [***] A fundamental divide, and discussed here before. Clearly, I think the Pythonic approach is more in the latter, glue-oriented way. But that perhaps requires certain assumptions about the order things are learned in or presented in, to manage complexity and information overload.

Anyway, lots of decisions... So, a good time to revisit how other projects like PythonCard are doing things. If I could wrap a wxWidget in a Morph, I might get a lot of mileage out of that, and PythonCard undoubtedly has an approach to dragging widgets that might be useful (let alone if much of PythonCard could be adapted in a prototype way, since PythonCard is prototypish anyway, even if I initially used just, say, their menu editor). I mentioned long ago the notion of PySqueak and PythonCard perhaps merging somehow; perhaps it's time to look more deeply at the paradigms behind various approaches and their potential initial compatibility.

Anyway, still mainly just exploring possibilities at this point.

--Paul Fernhout

*** Sadly, glue sniffing is apparently a serious problem among school kids; see for example the 20% statistic here:
http://emeritus.blogspot.com/2006/04/inhalant-abuse.html
Maybe exciting Python-based constructivist virtual worlds (and a better real world emerging out of them?) might help a bit with it.

kirby urner wrote:
Kevin is the main force behind PythonCard, which adapts the wxPython API to make it even more sane.
His product features an OpenGL window, but it's just a pass-through to the wx OpenGL window; I don't think PythonCard has a bunch of OpenGL-specific classes. But these could always be added to a custom version (PythonCard for Geometers, say).
On 5/14/06, Paul D. Fernhout <pdfernhout@kurtz-fernhout.com> wrote:
Anyway, I am confronted with a series of choice points and paths as I look at how to build a Self-like inspector (which needs GUI widgets including a text editor):
* 3D right now?
* How to put up interactive widgets (custom vs. wx, tk, pyGame, GLUT, other)?
* Whether to keep the programming widgets in the same world window as the application or have them in another window (and/or world) or even another VM interacting over the network?
Likely there is some deeply Pythonic (glue-ish and platform agnostic) way to approach these decisions I don't see yet.
All good R&D Paul. In my book, these are some of the deep issues, and finding the right combination of libraries is not an easy job. Regardless of the ultimate fate of your particular project, you're adding to your repertoire plus carving out the basis of some kind of graphical framework others might use, and that's a positive thing.

Incidentally, I just bought Civilization IV for my daughter [1], so we could give my wife's new computer a workout in the newly rented workspace [2], and she (my daughter) noticed right away that it contains Python bindings of some kind. I haven't researched this yet, but I do find it exciting that a high powered game engine would make Python a part of its commercial front end, even mentioning Python on the box.

From the on-line docs:

"""
The Python scripting language is fully integrated throughout the game and offers experienced modders a chance to really strut their stuff! People with some programming skills will be able to do things to alter the game in interesting and extraordinary ways. For instance, all of the game interface screens are exposed to Python, so modders will be able to change the information that's displayed, as well as how it's positioned on the screen. We also use Python to create and generate all of the random map scripts that are included in the game. So, players will now have the ability to add scripted events to the game like automatically generating units when a tile is reached, having specific situations trigger automatic war, or get this, bringing back Civil Wars caused by unrest, Civ II style!
"""
[ http://www.2kgames.com/civ4/blog_03.htm ]

We can use Python to bring back automatic civil wars caused by unrest, woo hoo. How about automatic peace? Maybe it's time to cue the Quakers, get 'em more into snake charming. Russ? [3]

Kevin is a big gamer by the way, but mostly Mac OS X based.

Kirby

[1] http://www.2kgames.com/civ4/home.htm
[2] http://controlroom.blogspot.com/2006/05/personal-workspace.html
[3] Russ is a Python programmer I know from OSCON and Quaker-P. But he calls himself an "angry economist" so is he part of the solution or part of the problem? http://angry-economist.russnelson.com/
On 14/05/06, kirby urner <kirby.urner@gmail.com> wrote:
Maybe it's time to cue the Quakers, get 'em more into snake charming. Russ? [3]
Concerning FPS games, there is "Battlefield 2" (from Electronic Arts), which uses Python. First "news" a year ago:

*"""In an interview with Lars Gustavsson of DICE <http://www.fz.se/artiklar/article.php?id=786>, it was mentioned that Battlefield 2's modding tools are going to be delivered with the game, and that the tools are the same ones used to develop the game. The modding language in use is Python, and will support all aspects of the language."""*
http://developers.slashdot.org/article.pl?sid=05/02/11/1439230

and now ... :)

*"""Unlock Special Weapons (Version 1.0.0 only)*
Locate the file "unlocks.py" located in <Drive Letter>Program Files\EA GAMES\Battlefield 2\python\bf2\stats (or wherever you installed Battlefield 2, in the folders \python\bf2\stats), and open "unlocks.py" with Notepad. ... """
http://www.gamespot.com/pc/action/battlefield2/hints.html

francois

Kevin is a big gamer by the way, but mostly Mac OS X based.
Kirby
[1] http://www.2kgames.com/civ4/home.htm
[2] http://controlroom.blogspot.com/2006/05/personal-workspace.html
[3] Russ is a Python programmer I know from OSCON and Quaker-P. But he calls himself an "angry economist" so is he part of the solution or part of the problem? http://angry-economist.russnelson.com/
On 5/14/06, francois schnell <francois.schnell@gmail.com> wrote:
On 14/05/06, kirby urner <kirby.urner@gmail.com> wrote:
Maybe it's time to cue the Quakers, get 'em more into snake charming. Russ? [3]
Concerning FPS games, there is "Battlefield 2" (from Electronic Arts), which uses Python. First "news" a year ago:
Great link, thank you! And that Unreal Tournament engine, were those Python bindings native then (allusion to some earlier post)? FPS = first person shooter. SM's CivIV is TPB = third person builder (all those little first persons running around tend to shoot a lot though, especially after discovering gunpowder, woo hoo). Or you could say TPB = FPG (first person god), and god *could* be a shooter. But "god versus humans" in a killing capacity tends to be no fun. They're so good at killing themselves as it is. The challenge is getting them to do anything constructive, hence the popularity of TPBs. Kirby
On 14/05/06, kirby urner <kirby.urner@gmail.com> wrote:
But "god versus humans" in a killing capacity tends to be no fun. They're so good at killing themselves as it is. The challenge is getting them to do anything constructive, hence the popularity of TPBs.
As our ancestors were hunters, maybe that's why FPS games are also quite popular with kids; even before video games there were plastic guns and bows (a kind of "hunting" gene remaining somewhere? an inherited behavior, or just a media/TV influence?).

Personally I prefer to see kids practicing their old "hunting skills" in virtual worlds rather than becoming real hunters killing real animals just for fun. Also, if they want to "survive" longer they'll have to analyse and take advantage of the "geometry" of the map they're in: you just don't run and shout in an open space like a maniac (at least in "realistic" FPS).

All that said, FPS is not my cup of tea, apart from "Unreal Tournament 2004", which I bought because there was a Linux logo on the box :). With the Linux NVIDIA drivers I could at least impress my friend with 3D games on Linux. The funny thing about UT2004 is that the most popular way of playing it is the "Onslaught" mode, which is collaborative and tactical at the same time: you need to work as a team to control vital nodes up to the main enemy core; if the team doesn't collaborate and just shoots everywhere, the mission is over quickly. To collaborate you use the microphone (or the chat if you don't have a microphone), and *that* was the unexpected *social part*: I've learned a lot about life in Russia, Japan, the UK, etc. while exercising an old "hunting" gene with humans aged 7 to 77 :)
http://en.wikipedia.org/wiki/Unreal_Tournament_2004

Concerning TPBs, I think they are really interesting since you can learn about history, economics, and strategy while playing. A shame they don't fit the console market well, with its limited pad and TV screen.

francois

Kirby
On 5/14/06, francois schnell <francois.schnell@gmail.com> wrote:
Personally I prefer to see kids practicing their old "hunting skills" in virtual worlds rather than becoming real hunters killing real animals just for fun.
I'm for that if the hunting is purely for sport and not survival. Let 'em rent 'Deer Hunter' for the X-box or something. Maybe put a real shoulder kick in the rifle somehow? Holodeck stuff.

Oh, and let players opt to be the deer instead of the hunter. And at that point (when switching animals), many players would ask for stronger predator avatars (gentle deer were never that hard to kill; they were for rank beginners even in the Stone Age). How 'bout giving us something to really scare those humans silly. ETs?
All that said, FPS is not my cup of tea, apart from "Unreal Tournament 2004", which I bought because there was a Linux logo on the box :). With the Linux NVIDIA drivers I could at least impress my friend with 3D games on Linux.
'Half-Life 2' for me. It's just so good at pushing to a fascist extreme in the opening minutes -- reminds everyone of why we fight (except some among us secretly yearn to play quisling for the ETs -- as revealed by on-target personality tests).
Concerning TPBs, I think they are really interesting since you can learn about history, economics, and strategy while playing. A shame they don't fit the console market well, with its limited pad and TV screen.
francois
Good point. But then, I worry about such limited APIs in general, for humans I mean. Sure, a car is X-Box simple to operate and driving is fun, sometimes x-tremely so. But if that's *all* you know, you haven't really exploited some of the deeper powers of your nervous system. Like hey, you're human this time, so why not milk that for what it's worth? Like try a real keyboard sometime, combined with reading some challenging writing, like William Shakespeare or Calvino or someone, or maybe learn Chinese. Kirby
kirby urner wrote:
All good R&D Paul. In my book, these are some of the deep issues, and finding the right combination of libraries is not an easy job. Regardless of the ultimate fate of your particular project, you're adding to your repertoire plus carving out the basis of some kind of graphical framework others might use, and that's a positive thing.
Thanks for the kind words.

Also, I am forced to wrestle with the issue of what my specific subgoals are here (thinking of a PySqueak which is easy for beginners to use and which experts want to use as well). They sort of are, roughly speaking:
* to have Self prototypes in Python as an option (check, though inefficient :-),
* to have a Self GUI in Python (brooding over),
* to port our 3D educational software to Python and this system (half done, the translating-to-Python part, but I need to settle on widgets),
* to make Python development with running systems as interactive and fun to use as that of a modern Smalltalk (just scratching at the network debugging approach outlined earlier, but I also already have the reloader window I use with Jython), and
* 3D simulated worlds in Python (of less importance to me now, but of potential long term interest in terms of simulation).

I've been trying out the latest version of Croquet some more as I ponder this 3D and widget issue. I think I am getting past the Wow factor -- though it still remains pretty neat. Of course, as always, I was impressed more by watching the Alan Kay presentation videos, like the one of drawing a fish in 2D and having software make it 3D and swim away, than by using it myself.
http://video.google.com/videoplay?docid=-9055536763288165825

Ignoring some walkbacks and exceptions when starting some of the demos, par for the course in my experience Squeaking, what I am left with is the same feeling I get when I watch pretty good 3D modeling of people -- which is sometimes a bit disconcerting (in a bad way). :-)

To explain: when you see cartoons of people, or mechanically rendered sprites of people with no serious attempts at skin texture or hair luster, you think, wow this is cool, and it is fun to look at. In my mind, that is something like using 2D apps or the command line. It does cool things its own way, and you can respect that. You get the immediacy people here talk about, and having no (or few) expectations about what it should do based on reality, nothing feels wrong (although it may still be confusing). But when you look at images of people rendered in 3D that are pretty good (say 99%), they often look like corpses, because they are good enough to set in motion a whole bunch of reality-based expectations we have for how other people should look, while not meeting them. Slightly drab hair? Maybe they are sick. Skin that looks like rubber? A sign of death. For more on this, see: "Realistic Human Graphics Look Creepy"
http://games.slashdot.org/article.pl?sid=04/06/10/1327236
http://www.slate.com/id/2102086 [article discussed]

So, by analogy, when you make fairly realistic 3D environments, but not 100% realistic ones, you may be worse off for trying, because you raise expectations you cannot fulfill, and the differences make people unconsciously feel uneasy, creeped out, or frustrated. To be clear, I'm trying to make an analogy here; Croquet as a 1.0 beta demo does not attempt realistic human avatars, just a silly bunny rabbit avatar which looks cute. My point is about the 3D world aspect versus the real world. So, in this case, with Croquet and 3D, perhaps more realism is actually a worse experience. And perhaps that unconsciously drives some of Alan Kay's interest in better 3D rendering? Yes, in theory you can get really good, but until you get there (when? 2020? 2040?), the results are sometimes almost more disappointing the closer you get to realism. I think Croquet may be facing that in the 3D realm.
It activates expectations, like moving in 3D, but then you are confronted with how awkward it is to use the mouse to virtually "walk". You see a cube of blocks that invites you to take one, but when you put it back, there is no gravity to make it drop back into place, so it just oddly floats there. A slip of the mouse and the rabbit falls off the green carpet, to where? And it takes a bit of slogging to get back to anywhere interesting. I can see portals, but I can't shortcut-click to them; I need to proceed slowly to them, and if I could click (I suppose there is some trick, perhaps), then I would break the experience.

Obviously, specific popular 3D games solve many of these problems within the game in a fun "suspension of disbelief" way, and can be entertaining for twenty hours or so, and even leave you wanting more, e.g. a recent favorite of ours: "Beyond Good and Evil"
http://www.gamespot.com/ps2/adventure/beyondgoodevil/
but they do not seek to be the comprehensive life-time-of-use experiences that Croquet is implicitly claiming to be for right now (or at least, soon). So, I wonder if better 3D graphics will really solve the problem in the near future. Short of Sutherland's "The Ultimate Display" (same as the Star Trek Holodeck), 3D worlds can be pretty dissatisfying or unfulfilling by themselves (ignoring their use as settings for games) if for no other reason (even if 100% visually realistic) than lack of force feedback or smell or air movement. Now, I'll say they can be useful, just like 2D whiteboards are, or conference calls are, and they can be fun (like many people find 3D games fun), but, ultimately, aside from any addictive game-like aspects ("Evercrack"), the 3D aspect by itself isn't a long term win (and it seems like the implicit notion of the Croquet work is that Croquet will be where we all spend most of our computer time). And the closer Croquet or similar systems get to reality, in some sense the more disappointing they are (up until we have the Holodeck or direct neural stimulation or something like that). Better to be in reality, I think. And stick with 2D or 3D graphics geared more towards efficient creation and collaboration than semi-realistic virtual creation and collaboration.

There's another thing that bothers me with the 3D world approach, and that is that it tends to run counter to the notion of moving fluidly between levels of abstraction, where you can see multiple representations of the same data. Our Garden Simulator (circa 1997)
http://www.gardenwithinsight.com/
[See the second picture from the top on the right]
tried to lead kids (or adults) from working in a relatively non-abstract garden setting onwards to a level of abstraction using a numerical and graphical browser to visualize current state and change it, and then from that on to using graphs to visualize state over time, and so moving to higher levels of abstraction at each step. You could do that in a 3D Croquet world, say, by having a 3D garden and a sheet of virtual graph paper, but it seems to me the 3D world is getting in the way a bit somehow, when you could just as well have a flat 2D window on your screen with a graph next to a 3D window of the garden. Granted, having a separate window for a 2D thing violates the paradigm Croquet is working towards, where multiple people can see what each other are looking at, so perhaps I need to understand more whether that is really compelling and useful.
But, on the other hand, even knowing where an avatar is pointing can't tell you what someone is thinking about. One big value of computers is that they can provide alternative ways to do things and look at things beyond what is possible in the "real world". [One reason command line diehards like the command line so much is that it makes some things easier to do with a mini-language than with a 2D graphical interface, like changing a bunch of files at once.] So, while the 3D Croquet interface does not demand giving up various levels of abstraction, it would seem to me it sort of encourages it. But that is not an opinion from using it much in practice, so perhaps in practice it may be different.

Now this isn't to dismiss Croquet as a valuable experiment or as innovation. It's a good thing to have tried. And to keep trying with. And Python as a platform may well benefit from having some of this technology. It's just to consider it in the light of lessons learned and where to go next (for me or for Python).

But then you are left with the question -- how should people of varying abilities best interact with the computer, in general and for various tasks? And, while I can see the value in a Croquet or ActiveWorlds 3D approach in some situations (I can see the potential value of being able to see where others are looking in the virtual world when collaborating, for example), I think there remains a lot of value in other interface paradigms, including both plain 2D widgets and also using a 3D window surrounded by 2D controls. (Obviously, whether OpenGL might render all that 2D stuff as well as the 3D stuff is just an implementation point in some sense.)

So, in looking at the strengths and weaknesses of the Squeak/Croquet hybrid, derived from Self's Morphic and 2D collaborative model (and other things), I am again back to thinking that supporting collaboration in a 2D space (and ideally using prototypes) is potentially still a very useful thing, and a valuable first step. And so are any other approaches towards providing the option of doing development with Python in a way more like development under Smalltalk. And when six years ago I first brought up Squeak here, it was before Croquet, and in the context of these basic things, plus some 2D learning applications built on top of them (more eToy-ish and HyperCard-ish and Virtual-Robot-programming-ish).

Anyway, having used Smalltalk for years (and ZetaLisp+Flavors before that), I am still confronted on a daily basis with the knowledge that it is productive and fun and empowering to develop with the level of interactivity Smalltalk (or the Symbolics) provides as an environment, and no Python IDE I have tried can deliver that (parts yes, the whole thing, no). Yet, I am also left with the reality that Python has gained acceptance because it is closer to C and BASIC in appearance, and because of its ability to play nice with other systems (especially ones written in C, or now, Java), as well as other factors, including that the license works out well for today's economic structures. So, on the one hand, I know what is possible, but on the other, I find myself often not able to make use of those systems in practice, which is what drives my interest in making such additions to Python more than anything.
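A minimal sketch of the "reloader window" idea mentioned above, in plain Python; the function and the "patapata" prefix are hypothetical, not the actual Jython reloader:

    import importlib
    import sys

    def reload_modules(prefix):
        # Re-import every already-loaded module whose name starts with
        # the given prefix, so source edits take effect in the running
        # session without restarting it.
        for name in sorted(sys.modules):
            module = sys.modules.get(name)
            if module is not None and name.startswith(prefix):
                importlib.reload(module)

    # e.g. reload_modules("patapata")   # hypothetical package name

Of course, existing instances keep pointing at their old classes after a reload, which is part of why this falls short of Smalltalk-style liveness.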
Incidentally, I just bought Civilization IV for my daughter [1], so we could give my wife's new computer a workout in the newly rented workspace [2], and she (my daughter) noticed right away that it contains Python bindings of some kind. I haven't researched this yet, but I do find it exciting that a high powered game engine would make Python a part of its commercial front end, even mentioning Python on the box.
Your point is exactly what gives me pause about the Squeak / Croquet approach, and why Python has had more widespread commercial success. Scripting such things is just left out as an encouraged possibility with Squeak / Croquet (anything is possible with Squeak of course; it's a matter of focus and clutter). Squeak and Croquet are, as has been said here, sort of anti-collaborative. Which is ironic, since the whole point of Croquet is to be collaborative. [That was brought up years ago on this list.] Or perhaps, collaborative only on Croquet's terms? I liked your(?) proposal a while back that what we need are standards for communicating between virtual world interfaces, so they could be implemented in various ways. Still, as Alan Kay has said, "any standard of more than three lines is ambiguous", and so you really need at least reference implementations, which Croquet is.

Yet, having said that, I am also thinking that given Python's success at gluing things together, and how all GUI toolkits are somewhat closed systems, see for example:
"Why Gtk/Qt/WxWidgets... are bad"
http://www.pygame.org/wiki/gui
I don't think it is unreasonable to argue for making another widget set (perhaps heavily copied from an existing one, like the pure Python on SDL OcempGUI)
http://www.pygame.org/projects/9/125/
if there is a compelling and overwhelming reason to, in terms of creating a good learning or developing environment in Python supporting a different paradigm of development ("modeless" versus "edit/run", and "prototype-oriented" instead of "class-oriented"). That system can still be used to call existing (non-GUI) libraries through Python. I'm not specifically making that argument right now, just saying, I think one could make it.

For example, my very first Prototype proof of concept used TK widgets which were inherited from by prototypes. But I found, when I copied one, that the widget was not copied, since internally the two prototypes both pointed to the same proxied TK widget. That's the sort of problem a pure Python solution might work around better (where a copied prototype would be a different GUI object). Not to say one might not be able to clone the TK widget, or a wx widget, just that it is more machinery, and perhaps some other unknown difficulties may exist in retrieving state from live widgets implemented in C/C++. But it is still hard to look at all the nice widgets in, say, the latest wx demo and think there is any point to writing yet another widget set... With all that entails...

Incidentally, I spent some time with the latest Debian versions of PythonCard and found a few issues from the standpoint of my using it. One is the paradigm -- mostly edit and run. Yes, you can change widget information while it is running, but that does not seem to be how it is set up. (HyperCard had a run/edit model too.) Also, under GNU/Linux at least, you could not drag a button or list after you placed it -- they absorbed the clicks. Other components could be dragged. So, an inconsistency. Not sure if this is easily fixable with live wx components. In any case, that paradigm mismatch issue is potentially a large one.

--Paul Fernhout
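To make the TK-widget copying problem above concrete, here is one rough workaround sketch: keep the constructor arguments on a wrapper so a copy can rebuild its own Tk widget instead of sharing the original's proxied one. The class and names here are hypothetical illustration, not the proof-of-concept code:

    import tkinter as tk   # "Tkinter" on the Python 2 of that era

    class ButtonPrototype:
        def __init__(self, master, **options):
            # Remember enough to rebuild the widget later.
            self._master = master
            self._options = dict(options)
            self.widget = tk.Button(master, **options)

        def copy(self):
            # Build a fresh Tk widget rather than sharing the proxy; state
            # changed later via configure() would still have to be read back
            # with cget() and carried over by hand.
            return ButtonPrototype(self._master, **self._options)

    root = tk.Tk()
    original = ButtonPrototype(root, text="original")
    duplicate = original.copy()          # a separate widget, not a shared proxy
    duplicate.widget.configure(text="copy")
    original.widget.pack()
    duplicate.widget.pack()
    # root.mainloop()   # uncomment to show the window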
Hi Paul --

You bring up a lot of important issues, I think the main one being: how real do we really need it to be? And leading up to that: what parameters are we talking about, when measuring the degree of realism?

I think an important language game to hold in mind as relevant is that of chess. The strategies don't change just because the pieces get more detailed and animated. P-K4 is still P-K4, even if the King's lip is seen to quiver (hey, that looks like Johnny Depp!). So in some sense all that texturing is extraneous in chess. And yet artisans lovingly carve chess sets out of every material (or build them in OpenGL), using a vast assortment of designs. Why? Because there's an artistic dimension orthogonal to the ostensive goal of the game (to checkmate your opponent). I get lost in this orthogonal dimension playing multi-user Quake or Doom. I'll just start admiring the artwork, smiling at all the skillfully rendered skulls, never mind the loser pumping me full of bullets just now.

They say chess is a war game, and this spectrum, from spare iconography to vividly detailed scenes, is played out in the military sphere big time. Battlemaps, perhaps laid out as tiles, perhaps hexagonal, contrast with a more on-the-ground first person presence, sighting down the barrel of a gun or whatever. Computer games will often emulate both aesthetics, as does ordinary fiction. We're accustomed to a mix of first and third person.

When it comes to highly detailed renderings, keep in mind why computer games always seem tackier, less realistic, than state of the art computer-generated movies: the latter have all the CPU time they need for each and every frame, whereas the former have to keep a human player entertained in real time, a whole different ball game. Sustaining a high enough frame rate is what applications like Croquet and Civ IV have to manage (many per second), whereas frames of 'Shrek' might each take an hour in some rendering farm. It's like the difference between having hours to rehearse, versus spontaneous improv. Which isn't to say the skill sets aren't overlapping. Practiced actors are also better at improv on average.

Anyway, I think the degree of realism needed, and in what dimensions, is very application-specific. That's why it's dangerous for the OS to weigh in too heavily with a graphical view. I like the stronger dichotomy between the GUI and kernel in Linux. Windows users get confused by this, thinking GUI and OS must be synonymous. For the same reason, I like it that Python-the-language isn't too vested in any one GUI solution. It's a lexical controller of an abstract model, via some API, not a view. I think we should keep it that way. We're not trying to come up with the one set of graphical motifs that "defines Python" -- because Python is ultimately lexical, not graphical. Python's "look and feel" has to do with syntax, not the shape or shade of the widgets. Python isn't Java, and no, we don't need our own version of Swing.

So the degree and type of realism is application specific, is not defined by the OS, is not defined by Python. Python, like the OS, is fairly agnostic about the user's world, its look and feel. I don't imagine oil executives want cartoony avatars on screen, but having all present gaze upon a shared view of some new drilling area, complete with high def representations of rigs, pipelines (current and projected), equipment and crews, makes the meeting more effective and productive. Oil execs are more like generals poring over a battle map.
Their vista is more game-like than ultra real, although shifting to overhead photography a la Google Earth is always an option.

Kirby

PS: I'm posting from my borrowed Edubuntu box for a change. Dave Fabik (appears in my blogs) lent me this Buffalo ethernet converter, which goes from wireless to a regular ethernet card. It's browser configured. This way, you don't need to mess with some PCI wireless card or other gizmo needing a Linux-specific driver, provided your eth0 NIC is already operational.
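A back-of-the-envelope illustration of the per-frame budget contrast described above, nothing more:

    # Milliseconds available per frame at interactive rates, versus an
    # offline film frame that can take on the order of an hour to render.
    for fps in (30, 60):
        print("%2d fps -> %5.1f ms per frame" % (fps, 1000.0 / fps))
    print("offline render -> up to ~%d ms per frame (an hour)" % (60 * 60 * 1000))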
kirby urner wrote:
Computer games will often emulate both aesthetics, as does ordinary fiction. We're accustomed to a mix of first and third person.
You put it perfectly. And I think that is what bothers me about Croquet, in a sense, as a proposal for "the" interface (especially for kids). The Croquet approach chooses to focus all interaction with the system through a first person 3D world on the computer display, with third person applications perhaps within that (e.g. a simulated sheet of 2D graph paper in that 3D world). I am more accustomed to think of the computer as displaying "third person" information, and then sometimes I'm willing to suspend disbelief for a first person interface within a window on that third person desktop. Even if the window takes up the entire screen (or all screens :-), I know the third person perspective lies underneath it if I stop that application.

Ignoring the R&D aspects of Croquet (it's a good effort, breaks new ground, etc.), right now I don't think computers (and interface technology) are going to be close enough to 100% reality to make it worthwhile to channel all our interactions with them *first* through a first person 3D view. Certainly, if we get the Holodeck (with smell and force feedback) I'll revise that opinion. :-) And so also, from a perspective of priorities for making Squeak-like or Self-like things happen using Python in a learning (or developer) context, I think the Croquet aspects are less important than the interactive 2D tools aspects.

By the way, the seventeenth proof of concept of Prototypes in Python, with the rudiments of a tree browser for 2D Morphs (though the browser is in wx):
http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/proof_of_concept_01...
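In the spirit of that tree browser, a minimal wxPython sketch; the "submorphs" attribute name is an assumption for illustration, not necessarily what the PataPata code uses:

    import wx

    class MorphTreeBrowser(wx.Frame):
        def __init__(self, root_morph):
            wx.Frame.__init__(self, None, title="Morph tree")
            self.tree = wx.TreeCtrl(self)
            root_item = self.tree.AddRoot(type(root_morph).__name__)
            self._add_children(root_item, root_morph)
            self.tree.ExpandAll()

        def _add_children(self, item, morph):
            # Walk the morph hierarchy and mirror it as tree items.
            for child in getattr(morph, "submorphs", []):
                child_item = self.tree.AppendItem(item, type(child).__name__)
                self._add_children(child_item, child)

    # app = wx.App(False)
    # MorphTreeBrowser(world_morph).Show()   # world_morph: your root morph
    # app.MainLoop()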
Anyway, I think the degree of realism needed, and in what dimensions, is very application-specific. That's why it's dangerous for the OS to weigh in too heavily with a graphical view. [snip] I don't imagine oil executives want cartoony avatars on screen, but having all present gaze upon a shared view of some new drilling area, complete with high def representations of rigs, pipelines (current and projected), equipment and crews, makes the meeting more effective and productive.
You're absolutely right here. I think 3D is a great thing for visualizing data to understand issues and help solve problems, and a great way to communicate whatever the 3D representation chosen (realistic to whatever degree or even totally schematic or even mathematically abstract). And I can even see the value of ActiveWorlds-type 3D with avatars for occasional meetings or even specific projects. To be clear, and to agree here, I have nothing against 3D. I was playing with VTK and the Python bindings for it the other day as one possibility for an underlying toolkit for a PySqueak, and can see the real value of it for looking at scientific data in general using Python. And I can see further how working in 3D could be very motivating for learning programming (if the package makes that easy to do, as VPython or Alice attempt). And I think Croquet is a wonderful attempt to make it easy to do 3D collaborative applications.

But, in considering Croquet as the evolution of Squeak as a kids programming environment I see an implicit intent based on my interpretation of Alan Kay's intent (perhaps in error?) that Croquet is what we should all use all the time when we are at the computer. I think that implicit intent is what bothers me, because, perhaps conditioned by experience (?), I think approaching the computer (as it exists now and for the next ten to twenty years) expecting a third person experience is better (i.e. more flexible, less disappointing, more effective for most tasks) than approaching it expecting a first person experience (where that first person experience will be missing things for the foreseeable future). Anyway, a philosophical point of interface design for me to muse over some more.

--Paul Fernhout
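For the curious, the simplest sort of VTK-from-Python experiment looks roughly like the sketch below: a sphere in an interactive render window, using the standard vtk package and pipeline, nothing PySqueak-specific:

    import vtk

    # Source -> mapper -> actor -> renderer -> window is the usual VTK pipeline.
    sphere = vtk.vtkSphereSource()
    sphere.SetThetaResolution(32)
    sphere.SetPhiResolution(32)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(sphere.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)

    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()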
On 5/17/06, Paul D. Fernhout <pdfernhout@kurtz-fernhout.com> wrote: <<SNIP>>
ground, etc.), right now I don't think computers (and interface technology) are going to be close enough to 100% reality to make it worthwhile to channel all our interactions with them *first* through a first person 3D view.
I tend to agree with all of the above. However, between Croquet and a completely flat GUI, there's a spectrum, and we're seeing signs of Cartesian 3Dness creeping in, but not all that overwhelmingly, e.g. six desktops become the faces of a cube, but you only zoom out and rotate the cube when flipping to a different desktop (doesn't happen all that often unless you're by nature frenetic):
http://worldgame.blogspot.com/2006/05/blabbing-on-edu-sig.html

Even more subtly, windows have been garnering depth cues for years, plus gradually straying from the purely rectangular:
http://worldgame.blogspot.com/2004/12/interface-designs.html
(I've seen this curvilinear motif on CBS News too -- TV and computer screen aesthetics interpenetrate to a large degree).
You're absolutely right here. I think 3D is a great thing for visualizing data to understand issues and help solve problems, and a great way to communicate whatever the 3D representation chosen (realistic to whatever degree or even totally schematic or even mathematically abstract). And I
Games! World Game is less apologetic for using the word "game" because it's more about happy stuff than what got plotted on the old War Game boards. We're more light-hearted in our approach to global simulations, though just as into cross-checking and verifying. Vigilance against sloppy scholarship remains of paramount importance.
attempt to make it easy to do 3D collaborative applications. But, in considering Croquet as the evolution of Squeak as a kids programming environment I see an implicit intent based on my interpretation of Alan Kay's intent (perhaps in error?) that Croquet is what we should all use all the time when we are at the computer. I think that implicit intent is
That will never happen, except sure, maybe *some* people will eat their own dog food in this regard. In general though, people are too curious to ever alight on one generic omniembracing containment system. That's what reality is for. Within reality, all GUIs are special case. That being said, let's use Croquet sometimes, why the hell not? I'd especially like it on a much bigger screen with a projector, and with less of an Alice in Wonderland flavor (I've already got Alice, the computer game, which twists the scenario into something darker and more gothic -- closer to Batman in flavor, and therefore more comfortingly all-American in my book (but who wants something that twisted for an interface? (we should at least be able to "skin" Croquet in some dimensions, according to personal taste, if it really purports to be more of an operating system and less of an application))).
what bothers me, because, perhaps conditioned by experience (?), I think approaching the computer (as it exists now and for the next ten to twenty years) expecting a third person experience is better (i.e. more flexible, less disappointing, more effective for most tasks) than approaching it expecting a first person experience (where that first person experience will be missing things for the foreseeable future). Anyway, a philosophical point of interface design for me to muse over some more.
--Paul Fernhout
Some will experiment with other shapes of screen. Imagine the challenge of fitting a GUI to an hexagonal frame. Very Klingon. Very Dymaxion. Kirby PS: you're mentioned in my blog too by the way: http://mybizmo.blogspot.com/2006/05/home-schooling.html
kirby urner wrote:
That will never happen, except sure, maybe *some* people will eat their own dog food in this regard. In general though, people are too curious to ever alight on one generic omniembracing containment system. That's what reality is for. Within reality, all GUIs are special case.
Good point. Funny thing is, as I think about it, that Croquet to some extent perhaps repeats the thought of the original Macintosh Desktop -- that is, give people a desktop they are comfortable with and they can easily use it. The problem is that things didn't always do what people expected, and the desktop metaphor can be pretty limiting.

By the way, as I thought about your use of "first person" (I) and "third person" (he/she/it) for interface types (exemplified by a 3D shooter vs. a 2D map with markers), I realized that for completeness one should consider "second person" (you) interfaces, which I would think would be more like conversations, e.g. ("Computer, you should really print out my reports, please." :-). [I have worked somewhat in the past on such speech-driven second person interfaces.] Likely there might be other versions related to tense or indirect reporting of things. Some more ideas here:
http://en.wikipedia.org/wiki/Grammatical_person
From there: "The grammar of some languages divide the semantic space into more than three persons. The extra categories may be termed fourth person, fifth person, etc. Such terms are not absolute but can refer depending on context to any of several phenomena."
That being said, let's use Croquet sometimes, why the hell not? I'd especially like it on a much bigger screen with a projector, and with less of an Alice in Wonderland flavor (I've already got Alice, the computer game, which twists the scenario into something darker and more gothic -- closer to Batman in flavor, and therefore more comfortingly all-American in my book (but who wants something that twisted for an interface? (we should at least be able to "skin" Croquet in some dimensions, according to personal taste, if it really purports to be more of an operating system and less of an application))).
Interesting idea...
Some will experiment with other shapes of screen. Imagine the challenge of fitting a GUI to an hexagonal frame. Very Klingon. Very Dymaxion.
Definitely the promise of making systems you can "glue" together.

By the way, no hexagons yet, but this project
http://sourceforge.net/projects/patapata
is up to "proof_of_concept_021.py"
http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/proof_of_concept_02...
which finally has gotten to the point where I can edit what morphs do, in a very limited sense. You have to put your code in like this:
    PrototypeMethod(self, "def grow(self): print 'grow changed'\n")
and you can only modify existing methods.

These proofs of concept have brought home hard one major language impedance mismatch issue in trying to use Python to be like "Self" or "Smalltalk": Python is oriented around supplying a function and then calling it, whereas Smalltalk and Self are oriented around message passing. Thus, implementing "doesNotUnderstand" is possible in Squeak/Self, as is executing code on simple accesses (e.g. "self spam"), whereas under Python I can't assume what someone retrieving a function or variable plans to do with it. That also leads to the issue of dynamically wrapping methods and the overhead for it in memory and computation in Python.

Also, I need to think through initialization better, because otherwise prototypes which are filling the role of classes have their lists or dictionaries shared across all instances, whereas I really want those per item. I have a workaround in place but I don't like it. And that gets back to a problem with my very first prototype using TK, where I copied a prototype which inherited from a TK widget, but the widget itself was not duplicated. I guess I could duplicate that widget as part of a special init or copy constructor. If I can manage that trick, then I could have a hope of embedding TK or wx widgets into a world of prototypes.

I've continued to look at other languages in terms of how they approach some of these issues, and came across this little idea:
http://www.paulgraham.com/popular.html

""The best writing is rewriting," wrote E. B. White. Every good writer knows this, and it's true for software too. The most important part of design is redesign. Programming languages, especially, don't get redesigned enough.

To write good software you must simultaneously keep two opposing ideas in your head. You need the young hacker's naive faith in his abilities, and at the same time the veteran's skepticism. You have to be able to think how hard can it be? with one half of your brain while thinking it will never work with the other.

The trick is to realize that there's no real contradiction here. You want to be optimistic and skeptical about two different things. You have to be optimistic about the possibility of solving the problem, but skeptical about the value of whatever solution you've got so far.

People who do good work often think that whatever they're working on is no good. Others see what they've done and are full of wonder, but the creator is full of worry. This pattern is no coincidence: it is the worry that made the work good.

If you can keep hope and worry balanced, they will drive a project forward the same way your two legs drive a bicycle forward. In the first phase of the two-cycle innovation engine, you work furiously on some problem, inspired by your confidence that you'll be able to solve it. In the second phase, you look at what you've done in the cold light of morning, and see all its flaws very clearly.
But as long as your critical spirit doesn't outweigh your hope, you'll be able to look at your admittedly incomplete system, and think, how hard can it be to get the rest of the way?, thereby continuing the cycle.

It's tricky to keep the two forces balanced. In young hackers, optimism predominates. They produce something, are convinced it's great, and never improve it. In old hackers, skepticism predominates, and they won't even dare to take on ambitious projects."

Anyway, I broke the versions out into separate incremental files so people could easily follow what I was doing and perhaps revisit assumptions at earlier points. I've reached the point though where for more complex prototypes it may be hard to keep them in one file (especially if I started testing writing and loading Python files to save widget state), and at that point I'd probably start using SVN in a more typical way. (And at some point I'll probably move all those files into a directory and those previous links may break, but the code should still be findable from the main project SVN page.)
http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/

At this point, with a little handwaving about how easy 3D is in Python :-) I'm tempted to just do the politician's trick of declaring "victory" (at showing Python doing Self-like and Squeak-like and Morph-like things) and going home. :-) [Well, I'm already at home, so that's just a turn of phrase, but often good advice anyway. :-)] Realistically, the last prototype is still quite a ways from where anyone would want to use it to do software development (and the inspector is still wx and not native), but it does show how (ignoring efficiency) one can create a graphical world of prototypes and interact with it dynamically using Python. So, it is a tiny fraction of the 3D Alice in Python of almost ten years ago. :-)
http://www.cs.cmu.edu/~stage3/publications/95/journals/IEEEcomputer/CGandA/p...

You can see pessimism is outweighing optimism at this point in my mind. :-) Anyway, what I can say that is good about it is that *if* people want to talk about or demonstrate prototype-based languages using Python, this is a little example consisting of a single file (~20K) that is a starting point for discussion of "prototype vs. class-based" programming.
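To ground the "prototype vs. class-based" discussion above, here is a minimal copy-and-delegate prototype sketch in plain Python. It is an illustration of the idea only, not PataPata's implementation: the names Prototype, clone and add_method_from_source are made up, and __getattr__ is only a partial stand-in for doesNotUnderstand, since it fires only when normal lookup fails:

    import copy

    class Prototype(object):
        def __init__(self, parent=None, **slots):
            self.__dict__["parent"] = parent
            self.__dict__["methods"] = {}
            self.__dict__.update(slots)

        def __getattr__(self, name):
            # Reached only when normal lookup fails: check method slots, then
            # walk the parent chain, binding any function found to the
            # original receiver (rough Self-style delegation).
            proto = self
            while proto is not None:
                if name in proto.__dict__["methods"]:
                    return proto.__dict__["methods"][name].__get__(self, Prototype)
                if name in proto.__dict__:
                    return proto.__dict__[name]
                proto = proto.__dict__["parent"]
            raise AttributeError(name)

        def clone(self):
            # Deep-copy data slots so mutable values (lists, dicts) are not
            # shared between original and copy; share the plain functions.
            twin = Prototype(parent=self.__dict__["parent"])
            twin.__dict__["methods"] = dict(self.__dict__["methods"])
            for key, value in self.__dict__.items():
                if key not in ("parent", "methods"):
                    twin.__dict__[key] = copy.deepcopy(value)
            return twin

        def add_method_from_source(self, source):
            # Compile "def name(self): ..." text and install the function as a
            # method slot, roughly what a PrototypeMethod call might do.
            namespace = {}
            exec(source, namespace)
            for name, value in namespace.items():
                if callable(value):
                    self.__dict__["methods"][name] = value

    # Usage sketch
    point = Prototype(x=0, y=0, tags=[])
    point.add_method_from_source("def grow(self): print('grow changed')")
    copied = point.clone()
    copied.tags.append("copy")   # does not leak back into point.tags
    copied.grow()                # found via the copied method table
    point.grow()

The sketch also shows the per-copy state issue in miniature: deep-copying slots in clone() is what keeps lists and dictionaries from being shared across "instances" of a prototype playing the role of a class.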
PS: you're mentioned in my blog too by the way: http://mybizmo.blogspot.com/2006/05/home-schooling.html
Thanks for the link. IMHO, the reality is that (almost) all learning is really learner originated; it's just that it sometimes also happens in the classroom. :-)

--Paul Fernhout
participants (3)
- francois schnell
- kirby urner
- Paul D. Fernhout