
Good thinking Paul, about the balance between optimism and skepticism. We do need to revisit old designs, and admit their weaknesses. USAers still seem in denial about their infrastructure. A lot of it's still good, but was meant to be maintained, not neglected.

My hope is that infusions of data rich content will help people feel more in the driver's seat, in terms of at least tracking what's going on. Python-the-language is helping with this (at Google, other places). Then we'll need a lot more teams in the field, tackling specifics (again, the overview versus 1st person motif, familiar from war game simulations, and now I'm saying in world game as well). 2nd person enters with this sense of being on a team (we collaborate). Python has a role to play here as well.

Perhaps we're somewhat off on a tangent in discussing simply the Dness of the thing (2D vs. 3D vs. 4D or whatever). Perhaps more to the point are the modes we access: (1) a creative brainstorming mode, more playful and open to novelty; (2) a getting-down-to-business, activist phase wherein we actually implement and make things happen; (3) an awareness-of-assets phase, an appreciation for what's already at our disposal, what's already invented and downloadable and so on. I summarize these as the BE, DO and HAVE modes or phases, and did an early collage about that under the heading of general systems theory (GST): http://www.grunch.net/synergetics/gst1.html

In CS terms, we have a design phase, a coding phase, and legacy software and hardware. The legacy stuff needn't always be approached as burdensome though -- that's not the creative attitude. Good designs recycle and reuse, have a role for retro. FORTRAN still offers some dynamite libraries. We simply wrap them in our newer OO APIs. The strength of Open Source is that we allow, encourage and foment copying, as the precursor to studying and improving. We celebrate the relevance of Having (over simply Being and Doing).
Anyway, when it comes to Python in Education, I like to think of the ways in which Python might be able to amplify and improve one's experience in all three of these modes. Whether that involves 2D, 3D or whatever-D aesthetics is a somewhat orthogonal and, I dare say, somewhat irrelevant issue. We want to get work done, and so we need to play, we need to make happen, and we need to keep track and recycle. Python has those capabilities: it reads like pseudo-code and so is suggestive and encouraging of playful exploration; in production settings it generally does a reliable job of handling real world responsibilities.

It may be in this last area (of Having) that Python is currently weakest. Like, Perl's CPAN is the stronger HAVE technology. But I'm not despairing. There's lots we can do a la the mythical Vaults of Parnassus, to keep our vast and growing inventory of assets accessible and relevant to future selves.

Kirby

Kirby- All good points. And we definitely need a better infrastructure given continual population growth -- especially one doing more with less, and doing it in a more resilient and sustainable way. I think that is also one of the pushes towards prototypes in "Self" -- the Self authors describe an attempt to address the fragile base class problem and make software both more flexible and more resilient (as well as more consistent). http://en.wikipedia.org/wiki/Self_programming_language

And, along those lines, more updates on the PataPata (PySqueak) project: http://sourceforge.net/projects/patapata There is now a screenshot of the 25th proof of concept at: http://sourceforge.net/project/screenshots.php?group_id=165910 That should give a sense of what the whole thing does so far, as the demo is only one window (although there are also popup menus you can't see). Note: under Gnome, you need to change the window manager preferences so that ALT does not drag windows (choose the Windows key instead). I have not tested it under other platforms, though wx (2.6) should run on Mac and Win.

You may notice by looking in the browser (yellow part) of the screenshot that I made a design decision to implement multiple inheritance (of traits) using a list of name strings rather than direct pointers, so any prototype you want to use as a parent needs to be given a name under "world.worldGlobals[]". Personally, I find the most important part of programming to be documenting *intent*, and so I thought the explicit naming requirement was OK. I've always been dismayed at the potential of a Self-like approach to create a real mess otherwise. (Forth is another language system that fails at documenting intent, though it does most everything else right.) These named prototypes take the place of classes, but without the class-side / instance-side divide of Smalltalk.
The twenty-fifth proof of concept can now load and store Python source files of prototypes (essentially by just writing a "define_prototypes()" function and exec-ing it back in). One limitation is that embedded Python objects you want to store must implement a useful-for-reconstituting __repr__ string (though it does handle nested dicts and lists out of the box). You can finally add and remove prototype fields from the inspector, create new methods and edit them, and it now has an interactive shell plus some other polish. See the code at: http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/proof_of_concept_02...

An example of the Python source file it can write and read (demonstrating the idea discussed a couple weeks back of using Python source instead of pickle files to save widget state) is here: http://svn.sourceforge.net/viewcvs.cgi/patapata/PataPata/some_output_prototy... You don't need it to run the P.O.C. 25 demo though; that is still self-contained. [This example file is missing a few world methods I added afterwards.]

The system is still limited to two widgets though (Morph and TextMorph). Adding more widgets is a bigger undertaking and realistically will likely entail splitting the prototype into multiple files, which I have been avoiding. Also, functions defined in the interactive shell (as opposed to defined in a file or defined using the inspector) don't have source code available, which would be a problem when you save them. I'm not sure how to fix that easily without changing wx's shell.py. But defining functions is not really what the shell is for (it's more for interactive testing or importing other Python files), so it's a minor problem for now. For now, best to stick to defining new prototype methods using the inspector or source files.
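The save/load round trip described above -- writing out a "define_prototypes()" function and exec-ing it back in, relying on eval-able __repr__ strings -- can be sketched roughly like this. The function names here are illustrative, not PataPata's actual API:

```python
# Minimal sketch of saving object state as executable Python source.
# Assumes each stored value has a round-trippable repr(), i.e.
# eval(repr(x)) == x, which holds for nested dicts, lists, strings
# and numbers out of the box.

def save_prototypes(prototypes, filename):
    """Write a define_prototypes() function that rebuilds the given dict."""
    with open(filename, "w") as f:
        f.write("def define_prototypes():\n")
        f.write("    prototypes = {}\n")
        for name, fields in prototypes.items():
            f.write("    prototypes[%r] = %r\n" % (name, fields))
        f.write("    return prototypes\n")

def load_prototypes(filename):
    """Exec the saved source and call define_prototypes() to reconstitute."""
    namespace = {}
    with open(filename) as f:
        exec(f.read(), namespace)
    return namespace["define_prototypes"]()
```

Unlike a pickle file, the saved file is ordinary readable Python source, which is the point of the approach discussed above.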
The system may finally be at the point where (in theory) one could use only the system to improve itself, though in practice messing with the wx-based GUI scaffolding by changing the main application source file would still often be easier. Everything is still in one Python file for ease of prototyping and of people trying it, but at ~49K it's straining to break loose into various files and who knows what else. :-)

Anyway, I definitely feel it has reached the point where it is a useful tool for learning about prototype-based programming in a Python context. Not my eventual goal, but a useful byproduct. Amazing what is easily possible by building on top of Python and not getting sidetracked (as I usually do) by writing a new interpreter or messing with the syntax (at least, not at first. :-) Well, that and throwing performance and morph-loading security to the winds. :-) [It does point out the need for a Python VM with security flags one could set like Java has (e.g. can't write to most files, can't open most sockets, etc.), although I guess you could run under Jython to get some of that.] Still no 3D yet, though. :-(

Say, Kirby, speaking of "data rich content", can you suggest any Bucky Fuller-ish or synergetics-type graphics or clip art under a free-as-in-freedom license somewhere on the web I could use to spice this demo up when I move beyond having everything in one file? Ideal would be graphics a demo could load from the web as a URL, to keep everything in one file (though I probably should put them on another web server to be polite; I don't expect anybody to really use it anytime soon). Even just graphics like the individual source items for the web page you linked to: http://www.grunch.net/synergetics/gst1.html might be snazzy to just drag around as Morphs.
Or perhaps I could try to use them to make a HyperCard-like stack with the proof of concept, as a test and to motivate putting in related supporting features (including perhaps a Bitmap-from-URL-Morph, a Card-Morph and a Stack-Of-Cards-Morph). Also, if you have any pointers to free-as-in-freedom Bucky-related content I could throw in such a stack, that would be nice too. I'm mainly just looking for something that makes a nice demo with a dozen cards worth of interesting educational content and graphics. I could go with gardening (I've lots of stuff on that) but I thought Bucky stuff might be more fun right now.

--Paul Fernhout

kirby urner wrote:
Good thinking Paul, about the balance between optimism and skepticism. We do need to revisit old designs, and admit their weaknesses.
USAers still seem in denial about their infrastructure. A lot of it's still good, but was meant to be maintained, not neglected.
My hope is that infusions of data rich content will help people feel more in the driver's seat, in terms of at least tracking what's going on. Python-the-language is helping with this (at Google, other places).
[snip]
We want to get work done, and so we need to play, we need to make happen, and we need to keep track and recycle.
Python has those capabilities: it reads like pseudo-code and so is suggestive and encouraging of playful exploration; in production settings it generally does a reliable job of handling real world responsibilities. [snip]

On 5/22/06, Paul D. Fernhout <pdfernhout@kurtz-fernhout.com> wrote:
Also, if you have any pointers to free-as-in-freedom Bucky-related content I could throw in such a stack, that would be nice too. I'm mainly just looking for something that makes a nice demo with a dozen cards worth of interesting educational content and graphics. I could go with gardening (I've lots of stuff on that) but I thought Bucky stuff might be more fun right now.
Hi Paul, I have a feeling you're about to get more than you bargained for. Kirby is a kind of Ground Zero for Bucky resources on the web--I knew of him from my readings on Synergetics long before meeting up with him in edu-sig. There is plenty of Bucky on the web to choose from, although much (most) of it is copyrighted. The magnum opus, Synergetics, is available here: http://www.rwgrayprojects.com/synergetics/toc/toc.html While it is copyrighted, many of the ideas in it lend themselves to software expressions. When I was reading it (on traditional pulped trees) I felt like there should be an accompanying set of small programs demonstrating the ideas. Bucky expends a lot of text trying to describe 3D animations in prose. --Dethe

Say, Kirby, speaking of "data rich content", can you suggest any Bucky Fuller-ish or synergetics-type graphics or clip art under a free-as-in-freedom license somewhere on the web I could use to spice this demo up when I move beyond having everything in one file?
Click around in my websites. A lot of what's in Synergetics is generic concepts organized in sometimes novel ways, but not owned, in the sense that the icosahedron is not owned, nor the sphere-packing lattice we call the CCP, etc. So often what's fun, given Python and a few tools, is to generate one's own images (no permission required). However, if you want ready-made images, there's lots at grunch.net/synergetics. If you have any questions about a specific image, just send me off-list email. Joe Moore has a lot of images. Plenty of other sources. Fuller School is definitely data rich (huge stash @ Stanford).

Kirby

kirby urner wrote:
Say, Kirby, speaking of "data rich content", can you suggest any Bucky Fuller-ish or synergetics-type graphics or clip art under a free-as-in-freedom license somewhere on the web I could use to spice this demo up when I move beyond having everything in one file?
Click around in my websites.
A lot of what's in synergetics is generic concepts organized in sometimes novel ways, but not owned in the sense the icosahedron is not owned, nor the sphere packing lattice we call the CCP etc.
So often what's fun, given Python and a few tools, is to generate one's own images (no permission required).
Excellent idea. And probably what I'll do if I go down that path. Thanks for the suggestion. Now just to get a 3D turtle going again in Python... :-)
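A minimal 3D turtle is mostly bookkeeping: a position vector plus a heading, with forward() emitting line segments for some renderer to draw later. A rough sketch of the idea (class and method names are hypothetical; no rendering backend, just the geometry):

```python
import math

class Turtle3D:
    """Toy 3D turtle: tracks position and heading, records line segments."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.heading = (1.0, 0.0, 0.0)   # unit vector along the x axis
        self.segments = []               # drawn lines, as (start, end) pairs

    def forward(self, distance):
        x, y, z = self.position
        hx, hy, hz = self.heading
        new = (x + hx * distance, y + hy * distance, z + hz * distance)
        self.segments.append((self.position, new))
        self.position = new

    def yaw(self, degrees):
        """Rotate the heading about the z axis (a left turn in the xy plane)."""
        a = math.radians(degrees)
        hx, hy, hz = self.heading
        self.heading = (hx * math.cos(a) - hy * math.sin(a),
                        hx * math.sin(a) + hy * math.cos(a),
                        hz)

t = Turtle3D()
for _ in range(4):       # trace a unit square in the xy plane
    t.forward(1)
    t.yaw(90)
```

A full 3D turtle would add pitch and roll rotations about the other two axes in the same style, plus something to hand the segment list to (VPython, OpenGL, an SVG projection, etc.).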
However, if you want ready-made images, there's lots at grunch.net/synergetics. If you have any questions about a specific image, just send me off-list email.
Thanks for the generous offer; I may do that.

=== a bit of a copyright and licensing rant follows, but it is not meant personally ===

In practice, though, as a free software developer who dislikes paperwork, it's probably more work to ask for permissions (perhaps multiple times, as new things become of interest) than it is worth, compared to, as you suggest, doing my own stuff programmatically (or otherwise). In practice, I almost never think something is worth using enough to ask for permissions; I just don't use it and look for something else, or I even abandon the project. I would also have to trust a longer copyright paper trail. This is not to suggest anything sleazy by anyone, just that it's easy in copyright issues for miscommunication or misinterpretation to arise as to what is permitted when someone other than the original author or copyright holder makes a formal statement. So what one person who knows the author or current owner thinks they can do safely often looks a lot less safe to someone else coming at things second or third hand and relying on indirect representations.

Looking for stuff that is already clearly licensed is generally just easier, like at: http://www.openclipart.org/ which links further to various other similar sites: http://www.openclipart.org/wiki/index.php/Similar_Projects [No Bucky stuff I could find though.] Googling on "free clipart" first leads me to commercial sites, it seems. :-( Are there any other good free clip art sites people recommend for use with Python programs?

Or perhaps, now that I think of it, Wikipedia could be a source: http://en.wikipedia.org/wiki/Buckminster_Fuller There are two GFDL images on the Bucky page. Or more broadly, lots of GFDL images here: http://en.wikipedia.org/wiki/Category:GFDL_images Unfortunately the GFDL license is GPL-incompatible!
I've already had words with Richard Stallman about this, :-) but so far I haven't been able to get him to acknowledge how evil GPL-incompatibility is for a content license (especially one the FSF promotes)! :-) From: http://en.wikipedia.org/wiki/GNU_Free_Documentation_License "The GNU FDL is incompatible in both directions with the GPL: that is GNU FDL material cannot be put into GPL code and GPL code cannot be put into a GNU FDL manual." See also: "Why You Shouldn't Use the GNU FDL" http://home.twcny.rr.com/nerode/neroden/fdl.html which suggests, among other things, just using the GPL instead for content. Ironically, this FSF page expounds on the problems with some Creative Commons licenses being GPL-incompatible, http://www.fsf.org/licensing/licenses/ but is strangely silent on GFDL/GPL incompatibility issues, which IMHO are similar.

I'm hoping that incompatibility gets fixed in later versions of the GPL or GFDL. It has already made another project impossible (building GPL'd simulations based on Wikipedia content). I'm not sure yet about Python-license and GFDL content compatibility; I think they might work together, except the GFDL would take precedence and might not really give permission to run the combined work.

I'm really picky about these licensing issues because it is so easy to use something incorrectly, and recently the formerly civil problem of copyright infringement in the USA has been turned into a criminal offense (felony); though not yet often enforced, it is made easily enforceable by broad interpretations of what it means to make a profit from redistributing something.
http://www.usdoj.gov/criminal/cybercrime/CFAleghist.htm See also, for example, this Slashdot article and my comments here: http://yro.slashdot.org/article.pl?sid=05/11/13/1624200 So, best to stay on the safest ground when possible, especially when building stuff for others to use. This is an example of the "chilling effect" of such copyright legislation on innovation in the USA (or, another "infrastructure" problem :-).
Joe Moore has a lot of images. Plenty of other sources. Fuller School is definitely data rich (huge stash @ Stanford).
Well, in the internet age, perhaps it might have been better to burn it all than to give it to a place like Stanford? :-) This is assuming the heirs weren't prudent enough to insist Stanford make the information available for free-as-in-freedom use; perhaps they were? If the heirs had just burned it, and disclaimed copyright interests, then at least what little remained might feel freer to use when derived from any second-hand source (there would still be issues though, from those sources). Does any small player really want to tangle with the Stanford IP licensing system when it smells money? I already saw Stanford's IP department (in my opinion) damage the Bootstrap Institute effort just as it got started, with a license designed by Stanford for Stanford's own short-term benefit. http://www.bootstrap.org/dkr/discussion/1087.html

Anyway, I'm sure the Stanford Fuller Archive would be a wonderful place to visit, and to spend much time in for inspiration, but I think it unfortunately may have limited relevance to free internet activities otherwise, barring a lot of work by others (hint :-) to get Stanford to allow Fuller's works to be freely used online to make derived works. I guess somehow I just assumed there would be freely licensed works related to Bucky. Too bad Bucky didn't do his work in the open source / free software age. Effectively, even if his work is readable for free-as-in-beer on the web, it is then, on a practical basis, lost as far as being the basis for direct improvements in free-as-in-freedom new works. Still inspiring of course, for indirect use, but it sounds like, barring a lot of permissions work, the actual concrete realizations just need to be ignored when doing something new. :-( Which is a surprise.

To be clear, of course there is a lot of Bucky stuff on-line now, to be read exactly as it was written, and I am grateful to people such as yourself for making that happen. I'm sure even that was, and is, a lot of work.
What I am talking about is more using it as a springboard to move further, by cutting and pasting bits and pieces and reorganizing them into new derived works (like an educational card stack). This isn't Bucky-specific though; this is a general problem for making any sort of new educational materials derived from any author's work. It's just more ironic and frustrating to see works about building a newer and more prosperous society for everyone tied down by some of the same chains (mostly chains of assumptions :-) trying to keep the rest of society down.

Anyway, you made a great suggestion of using the ideas to generate images, and I think that is probably the best way for me to go if I make a HyperCard-like Bucky stack as a demo. And that's much the same reason I'm reinventing Squeak in Python rather than using it as is, with its licensing issues and licensing heritage. Thanks again for the suggestion; it especially makes sense in a programming context. Each card could be a little mini-simulation with a topic and some code that draws a graphic with a 2D or 3D turtle. Which of course leads me (lazy programmer that I am :-) to the thought: is there any free-as-in-freedom Python source code to draw Bucky structures on the net? :-)

--Paul Fernhout
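As an aside on that last question: the basic geometry of many Bucky structures can be generated from scratch in a few lines, no clip art or licensing needed. For instance, the twelve vertices of an icosahedron are the cyclic permutations of (0, ±1, ±φ), where φ is the golden ratio. A sketch (the function name is just illustrative):

```python
import itertools
import math

PHI = (1 + math.sqrt(5)) / 2   # the golden ratio

def icosahedron_vertices():
    """Twelve icosahedron vertices: cyclic permutations of (0, +/-1, +/-PHI)."""
    verts = []
    for s1, s2 in itertools.product((1, -1), repeat=2):
        base = (0.0, s1 * 1.0, s2 * PHI)
        # the three cyclic rotations of the coordinate triple
        verts.append(base)
        verts.append((base[2], base[0], base[1]))
        verts.append((base[1], base[2], base[0]))
    return verts
```

From the vertex list, edges are simply the pairs at the minimum mutual distance, and the same permutation trick gives other polyhedra in the synergetics family; feeding the segments to any plotting backend produces the images.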

Hi Paul -- I think just apply the basic courtesies of scholarship and make it easy for readers to trace to your sources (provided your sources wish to be known -- I respect the journalist's right to interpose cover, though some overuse this as a license to be their own anonymous source; it's often easy to tell, as poor writing is poor writing). Your edit/recombine job may be interesting, but many browsers just want to get back to the same stashes you savor. Adequate signage. Unless you're actively keeping secrets, which you may do too, though if at the expense of an unacknowledged starving artist in Hoboken or suchlike, you're playing with karmic fire.

I have many rants against the priority-focused intellectual property discourse of the West. The whole of Western civilization rests on sources further east, and now it comes around the globe through North America trying to lecture the Chinese on what's the matter with copying -- threatening the Chinese with proprietary gunpowders, how ironic. Actually, I'm thinking the USA is already turning away from its broken patent system and embracing the logic of open source, despite all the demonization of same in the economist press. Portland, my home town, for example, is safe haven to a new breed of entrepreneurial open source capitalists. We have many working business models and plans to expand. Working with the Chinese on open source projects is far more productive than trying to accelerate them through a course in US business law, which is super-confusing even to its local practitioners. Nobody has time for all the fine print. It'll just drive ya crazy. So obviously the computer world (which operates quickly) has already found plenty of workarounds.

About your prototyping experiments: I'm interested in learning more about Self and would gladly attend lectures on same, at OSCON or wherever. Then I could better follow your logic.
My layman's question, after reading your posts, is: why not just keep Python pythonic and Self selfish? In other words, Python already has a very strong paradigm, very simple to learn. Why mess with that? I guess I'm a purist in that regard. But that doesn't make me a language bigot, as I fully respect the right of other languages to define themselves around equally strong, yet different, paradigms. I'm also biased towards OO. That doesn't mean I think the lambda calculus people can't hack their own brands. They already do, and quite successfully by the look of things.

Kirby

kirby urner wrote:
About your prototyping experiments: I'm interested in learning more about Self and would gladly attend lectures on same, at OSCON or wherever. Then I could better follow your logic.
Save yourself the airline fare :-) and just watch online: http://video.google.com/videoplay?docid=5776880551404953752 or: http://www.smalltalk.org.br/movies/ At least to see the basics. For a textual statement of the key issue, see: http://en.wikipedia.org/wiki/Self_programming_language

"The problem: Traditional object languages are based on a deep-rooted duality. Classes define the basic qualities and behaviours of objects, and instances are a particular object based on a class. ... Unless one can predict with certainty what qualities the objects will have in the distant future, one cannot design a class hierarchy properly. All too often the program will evolve to need added behaviours, and suddenly the whole system must be re-designed (or refactored) to break out the objects in a different way. Experience with early OO languages like Smalltalk showed that this sort of issue came up again and again. Systems would tend to grow to a point and then become very rigid, as the basic classes deep below the programmer's code grew to be simply "wrong". [My note: I think Squeak got to that point of rigidity long ago. :-) ] Without some way to easily change the original class, serious problems could arise. ... In general, such changes had to be done very carefully, as other objects based on the same class might be expecting this "wrong" behavior: "wrong" is often dependent on the context. ... The problem here is that there is a duality, classes and instances. Self simply eliminated this duality. ... This may not sound earth shattering, but in fact it greatly simplifies dynamism. If you have to fix a problem in some "base class" because your program has a problem, simply change it and make copies of that new object instead. No other program will see this change. ... This dramatically simplifies the entire OO concept as well. Everything might be an object in traditional system, but there is a very fundamental difference between classes and instances. In Self, there isn't."
Now this perhaps paints an overly rosy picture of the solution. In practice, how libraries are written and what tools you use make a big difference too. And Python's use of namespaces makes it possible to manage some of these issues in a different way. Still, I think there remains a core of truth to this, at least as a sense of the overall "tone" of development. You really need to see it in action, as in the video, to grasp what is possible and what is different about, at the very least, the GUI development paradigm (which in some ways is not that unlike Delphi or VB, but applicable even to regular objects).
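The class/instance duality described above, and the prototype alternative, can be contrasted in a few lines of Python. This is only a toy sketch of the Self idiom (clone an existing live object, then modify the copy), not PataPata's actual implementation:

```python
import copy

class Prototype:
    """Toy prototype object: no class hierarchy, just objects cloned from objects."""
    def clone(self, **overrides):
        # Copy, then modify: the core Self idiom. Changes to the clone
        # never disturb the original or any sibling clones.
        new = copy.deepcopy(self)
        for name, value in overrides.items():
            setattr(new, name, value)
        return new

# Build a "base" object directly; nothing to design up front.
morph = Prototype()
morph.x, morph.y, morph.color = 0, 0, "blue"

# New objects are clones of existing ones; each can diverge freely.
button = morph.clone(color="gray", label="OK")
```

If `button` turns out to need a fix, you fix `button` (or clone a corrected variant); `morph` and anything else cloned from it are untouched, which is the point the Wikipedia passage is making about avoiding fragile base classes.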
My layman's question, after reading your posts, is: why not just keep Python pythonic and Self selfish?
In other words, Python already has a very strong paradigm, very simple to learn. Why mess with that? I guess I'm a purist in that regard. But that doesn't make me a language bigot, as I fully respect the right of other languages to define themselves around equally strong, yet different, paradigms. I'm also biased towards OO.
That doesn't mean I think the lambda calculus people can't hack their own brands. They already do, and quite successfully by the look of things.
This is a good question and a good related comment. In general, you are right of course. And I understand your point about language bigotry. I agree, and I think it is not bigotry to say a language does certain things well, and if you want to do those things well and easily, stick with its mainstream approach. Languages have cultures and paradigms, and you swim against them at your own cost (especially in terms of productivity).

Still, I wonder if the dominant paradigms in the future will be mainly OCaml (or a derivative) as a better C, and something derived from Self (maybe with Python syntax) as a better Smalltalk. I understood the value of OCaml much better when I stopped thinking of it as a Python or Smalltalk replacement and started thinking of it as a C/C++ replacement. I'm not even sure the OCaml community sees themselves that way; they almost certainly don't. :-) Python is stuck in the middle there, and sometimes that is a great place to be (especially at the right time), and sometimes it isn't (especially if times change, in part due to its own success). I feel the Python community right now, as it contemplates its own evolution, is torn between types and prototypes. Like Squeak, it has developed its own approach to solving the type problem by having a mini-language in the PyPy project, but, while I think it is a neat thing to try, I am left wondering if perhaps it might be better to just use something like OCaml for that purpose.

As an analogy, I read there was a Basic English of a hundred or so words that all aircraft pilots and control tower operators were supposed to learn, to allow any pilot to land at any airport. It worked well for all the non-native English speakers, but the people who had problems with it were, surprisingly, the native English speakers, because they had trouble remembering what they could say and what they could not.
So, perhaps, except for the single purpose of maintaining a VM, a language like Squeak's Slang or RPython will never have the value that something like OCaml has as a general tool. So, back to my point of maybe types and prototypes being the dominant paradigms, with types implementing the performance-critical stuff and prototypes supporting everything else. I see this PySqueak work as being a step into that future of prototypes and types, but I'm working on the prototype side, not to slight the types side. OCaml and relatives seem to have the types side nailed, because they are able to infer types, so you get the benefits of types without as much syntactic clutter. And clutter means programs that are harder to read and thus harder to maintain (which is why Java fails as a productive language in practice, even when you have tools to write the boilerplate for you; we really need a tool for Java that hides all that).

But, still, to agree with your sentiment, who would want to use a buggy new system that, in addition to all of Python's warts (*) http://www.amk.ca/python/writing/warts then adds on top of that new paradigms and other clutter to learn? And makes the old ways, which are well documented and well known, difficult to use? That is always the inventor's or innovator's dilemma. There is always a strong constituency for the status quo, and so a new idea has many strong enemies and few half-hearted friends. And I think that is one reason I tend to start over from scratch on language projects; if no one is going to use it, then I might as well write it the way I want it to work rather than adopt someone else's compromises and legacy choices (which even they may no longer be happy with). On the other hand, progress is often most quickly made by focusing on one specific issue (though not always), so in this case focusing on prototypes gets me more mileage than focusing on syntax (which is the hardest thing for most people to learn, it seems).
Having agreed somewhat, still, as one inventor (Edison?) was quoted: "What is the value of a new baby?" Python as it is solves whole classes of problems well (text processing, file handling, sockets, gluing C libraries, and lots more). Still, it has weaknesses, especially for entry-level GUI programming. Compare it to HyperCard, which met many of CP4E's goals twenty years ago. From: http://en.wikipedia.org/wiki/HyperCard "HyperCard was a huge hit almost instantly. Many people who thought they would never be able to program a computer started using HyperCard for all sorts of automation and prototyping tasks, a surprise even to its creator." Or from: http://www.wired.com/news/mac/0,2125,54365,00.html "The result is both simple and powerful. Fifth-graders get to grips with HyperCard in minutes, building databases of their Pokemon cards. Cyan, the game publisher, used it to create fiendishly complex games like Myst and Riven."

It makes me wonder why Python is still struggling with the question of what the best GUI development tool for it is. Or why we think CP4E is needed, given Python exists. Is just a better IDLE what it is all about? What is missing? PythonCard is a great step in the right direction. But I think part of what is missing in Python is the prototype-like notions which are behind HyperCard or Self, or even, to a lesser extent, Delphi or VB GUI development. (HyperCard has an English-like language too, but I'll ignore that here, taking a more formal programming language syntax as a given.) Python also falls down in practice at modifying programs while they run, which becomes a big issue for intermediate-level programs and their programmers.

In this case, here are two big arguments for "Self" and a prototype-oriented mindset. One issue it tries to address is something that by now you may take for granted but that may still be a stumbling block for new programmers, which is how people usually reason from the particular to the general.
And so they may be better off learning to program a particular object than to program a class of objects. So, in Logo, teaching "Fred" how to draw a spiral may be easier than thinking about a "Fred" class of objects that all draw spirals. You say Python has an easy paradigm to learn, but I think that is not quite true. Python has many inconsistencies and warts (consider the fraction email I just posted) and is approachable to many people in part precisely because they learned C or BASIC first (e.g. infix functional notation), and are looking for something better and more fun and powerful. (**) I feel adding stronger support for prototypes, if not at the language level then at the tools and library level, would potentially help make Python more approachable for beginners. And I think the Self claims are in that direction.

Still, Self is not the only approach to prototypes; see for example: http://www.dekorte.com/docs/protos/ And consider the statement there: "Prototype-based languages are object oriented languages where a new object instance is "cloned" from existing live object (a prototype) instead of being constructed by a class. This makes the language simpler by requiring one less data type and solving the infinite regress problem of classes (if you use a class to create an object, what do you use to create a class?). It is also ideal for systems such as GUIs where the pattern of creating one object by copying and modifying another is already handled by the language itself."

The second big area is the "fragile base class problem", or more generally the issue that people want to customize their images or programs, yet that messes up anyone else who wants to import the code. By having a culture of working with prototypes, one is more used to customizing specific setups, and then having tools which help one integrate customizations.
One is more used to splitting off a set of base classes and doing different things with them in experimental ways, rather than all the careful work that goes into maintaining one common set of classes (even if ultimately one may prefer to minimize the number of base classes one works with). I think work in that direction is a way to get the benefits of something like Squeak as a platform without all the drawbacks of continually having old code fail with every new small release.

Still, in Python's defense (or condemnation :-), in practice it doesn't have a class side equivalent to Smalltalk's. You can make one using wrapper functions, and you can use shared class variables, but in general, Python programs don't do things in the class way Smalltalk does. And since much Python code is written in text editors, there is also not the issue of the class side / instance side pair of buttons which every Smalltalk code browser has and which can lead to confusion. Python has been creeping more in a class direction, but in practice Python code isn't written that way. So, in that sense, Python is already a lot more prototype oriented than Smalltalk; plus Python supports instances having their own ad-hoc methods, so again more prototypish. However, even if Python does support this sort of development, in practice Python examples are written in a more class-oriented way. And Python tools as well.

A big part of what I am exploring right now is how to make a library of Self-like Morph prototypes which can be used to build interfaces in a more Self-like way. Squeak introduced Morphs, but then made them into classes (and badly factored ones at that), which I think lost some of the power. By the way, here is a PataPata (PySqueak) version that allows wx widgets as Morphs (with some wrapping, which is only trivially done at this stage as a proof of concept for Buttons, TextEdits, and Sliders): http://svn.sourceforge.net/viewcvs.cgi/patapata/trunk/PataPata/PataPata.py?v...
I already have hit a wx bug with the version I use, where Buttons don't display drag feedback because they capture mouse drag events when clicked, e.g. the same as probably here: http://groups.google.com/group/comp.soft-sys.wxwindows/browse_thread/thread/121a5068724c613/e077cfa418cbd5d3?lnk=st&q=wxwidgets+button+mouse+events&rnum=7&hl=en#e077cfa418cbd5d3 I can drag wx sliders or text fields correctly though. I saw the same problem when trying the latest version of PythonCard, but now think it is wxGTK, not PythonCard.

This new version uses a more generalized notion of properties than earlier versions, in order to be able to wrap foreign widgets (though not Python's properties, which were too limited in some respects, and also would not have worked in Jython 2.1). Still, with this wx bug (I see no way around it, but that is par for the course when using native widgets or a C library), plus the extra layer of indirection that wrapping in properties requires, I am left wondering if I should just push ahead with drawing my own widgets, perhaps building on wx OGL or one of the other widget sets people have made, like Pyxel: http://bellsouthpwp.net/p/r/prochak/pyxel.html If I do my own widgets, then I could fairly easily make everything work as a Jython-based Java plugin, which would be a neat deployment vehicle. And if I keep the advanced property support, then wx or Tk widgets are always a possibility for someone who wants them.

--Paul Fernhout

(*) and there are several not listed there I could mention, though I still agree with the author there that: "When viewed next to the large number of things which Python gets right -- a small language core, strong but dynamic typing, reuse of the idea of namespaces, always using call by reference [I think he means value], indentation instead of delimiters -- these flaws are small ones, and I can easily live with them".
(**) For another example, if you look at it from a documenting-intent point of view, Smalltalk's keyword syntax (e.g. "Window showAtX: 10 y: 20.") makes reading programs much easier than reading functional syntax where arguments have no hints (e.g. "Window.show(10,20)"), and reading programs is what most programmers spend most of their time doing. :-) So, if you really want programmers to have happy lives, teach everyone keyword syntax. :-) Of course, I am living with Python's functional syntax in order to gain some of the other benefits of it, but that doesn't mean I am happy about that aspect of it, or that I think it is good for beginners either. Actually, another Python wart is that variable names in a function definition become part of the API (e.g. "myfunction(x=10, y=20)"), which is a gross violation of the notion that a function definition, including local variables, should be independent of specifying how it is called.
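Both halves of this footnote can be seen in a few lines of Python. The `show` function below is a made-up stand-in, not any real widget API: keyword calls recover much of the readability of Smalltalk's keyword syntax, and at the same time illustrate the wart that parameter names become part of the public interface:

```python
def show(x, y):
    return ("shown at", x, y)

show(10, 20)        # terse, but the reader must know what 10 and 20 mean
show(x=10, y=20)    # roles spelled out at the call site, Smalltalk-style

# The wart: the names x and y are now part of the API.
# Renaming parameters breaks every caller that used keywords:
def show_renamed(horiz, vert):
    return ("shown at", horiz, vert)

try:
    show_renamed(x=10, y=20)
except TypeError:
    pass  # callers that wrote x=/y= now fail
```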

On 5/26/06, Paul D. Fernhout <pdfernhout@kurtz-fernhout.com> wrote:
But, still, to agree with your sentiment, who would want to use a buggy new system that in addition to all of Python's warts (*) http://www.amk.ca/python/writing/warts
Quite an old document (2003), many warts fixed.
Still, it has weaknesses, especially for entry-level GUI programming.
I don't regard the "GUI weakness" as inherent in the Python language. One could argue Python is stronger at GUIs in the hands of a pro, because a pro is able to work with a variety of libraries, versus one built-in solution. Event-based GUI programming is hard to get right, period.
Makes me wonder why Python is still struggling with questions of what is the best GUI development tool for it?
Why do we need a "best" one?
Or why we think CP4E is needed given Python exists?
Because languages don't just teach themselves?
Python also fails in practice with modifying programs while they run, which becomes a big issue for intermediate-level programs and their programmers.
You're talking about event loop confusions, e.g. testing a Tk app while running in Tk? Using a text editor, with shell-based reload, allows GUI programs to be debugged on the fly in my experience.
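The shell-based reload workflow mentioned here can be simulated end to end. This sketch writes a throwaway module (the `handlers` name and `on_click` function are invented for the example), imports it, "edits" the file, and picks up the change with `importlib.reload`:

```python
import importlib
import sys
import tempfile
from pathlib import Path

sys.dont_write_bytecode = True  # avoid stale bytecode during rapid edits

# Simulate the edit-and-reload loop: write a module, import it, edit the
# file on disk, then reload it without restarting the interpreter.
workdir = Path(tempfile.mkdtemp())
sys.path.insert(0, str(workdir))
module_file = workdir / "handlers.py"

module_file.write_text("def on_click():\n    return 'old'\n")
import handlers
assert handlers.on_click() == "old"

# "Edit" the source, as you would in a text editor while the GUI runs...
module_file.write_text("def on_click():\n    return 'brand new'\n")
importlib.reload(handlers)  # ...then pick up the change on the fly
assert handlers.on_click() == "brand new"
```

One caveat that matters for GUI work: references captured before the reload (e.g. a callback already registered with a widget) still point at the old function object until they are rebound, which is part of why live modification stays awkward.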
Actually, another Python wart is that variable names in a function definition become part of the API (e.g. "myfunction(x=10, y=20)"), which is a gross violation of the notion that a function definition, including local variables, should be independent of specifying how it is called.
Please explain this notion further. I thought one point of a function definition was to define how to call it i.e. to specify what variables to pass and how. myfunction( ) and myfunction(4,3) would both work in the above case -- Python is quite flexible in this way (not good?). Kirby

Paul D. Fernhout wrote:
For textual statement of the key issue, see: http://en.wikipedia.org/wiki/Self_programming_language "The problem: Traditional object languages are based on a deep-rooted duality. Classes define the basic qualities and behaviours of objects, and instances are a particular object based on a class. ... Unless one can predict with certainty what qualities the objects will have in the distant future, one cannot design a class hierarchy properly. All too often the program will evolve to need added behaviours, and suddenly the whole system must be re-designed (or refactored) to break out the objects in a different way. Experience with early OO languages like Smalltalk showed that this sort of issue came up again and again. Systems would tend to grow to a point and then become very rigid, as the basic classes deep below the programmer's code grew to be simply "wrong". [My note: I think Squeak got to that point of rigidity long ago. :-) ] Without some way to easily change the original class, serious problems could arise. ... In general, such changes had to be done very carefully, as other objects based on the same class might be expecting this "wrong" behavior: "wrong" is often dependent on the context. ... The problem here is that there is a duality, classes and instances. Self simply eliminated this duality. ... This may not sound earth shattering, but in fact it greatly simplifies dynamism. If you have to fix a problem in some "base class" because your program has a problem, simply change it and make copies of that new object instead. No other program will see this change. ... This dramatically simplifies the entire OO concept as well. Everything might be an object in traditional system, but there is a very fundamental difference between classes and instances. In Self, there isn't."
I think there are two issues: classes and inheritance. I don't think inheritance is a particularly respected structure in Python. isinstance() is considered poor form in many situations. Some of Smalltalk's biggest sins, IMHO, come from using inheritance in clever ways, often to a don't-repeat-yourself end, but DRY isn't always such a good idea. The fact that all the classes are opened up to you -- not just technically possible, but practically actively encouraged, inviting you to fiddle with them -- is not very helpful either. So, I think Python avoids some of that fragility, and I think that's just as much about community sentiment as about the particular things that are possible. Probably Zope 2 was a big lesson for everyone who touched it in the perils of important and deep class hierarchies.

Inheritance is a sometimes useful implementation technique. I think it's wrong to think of it as more than that. Probably we need better support for some other competing techniques. I think that would be most useful if they can be supported without requiring different paradigms. So prototype-based programming and cloning might be more easily described in a Python context as delegation (though that's not exactly right). Or whatever it might be -- it doesn't need to be in conflict with what Python has now, because the promises are already pretty loose, and generally Python programmers have no problem with different object models. Every so often you get someone on comp.lang.python who really cares about is-a vs. has-a and all that stuff, but I think that's just a byproduct of poor education and it wears off in time.

There are also other ways of doing this stuff out there. Generic functions are a very different approach than either classes or prototypes. And, ignoring some (really fairly minor) syntactic issues, they fit nicely in Python, right alongside other techniques. (Live and persistent objects, however, are not such a good fit for Python, and maybe that's more of what you are pining for?)
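The delegation reading of prototypes mentioned here can be sketched with `__getattr__`, which only fires when normal attribute lookup fails. The class names are invented for the example; this is a rough analogue of prototype delegation, not a faithful Self semantics:

```python
class Delegator:
    """Forward unknown attribute lookups to a prototype-like 'parent'.

    The child answers what it has locally, and falls back to its
    parent object for everything else -- delegation, not inheritance.
    """
    def __init__(self, parent):
        self._parent = parent

    def __getattr__(self, name):
        # Called only when normal lookup on this instance fails.
        return getattr(self._parent, name)

class Window:
    title = "base window"
    def area(self):
        return 100

base = Window()
child = Delegator(base)
child.title = "customized"   # local override; base is untouched
```

`child.title` answers locally, while `child.area()` is delegated to `base`; note the delegated method runs against the parent's state, which is one of the ways this differs from true prototype cloning.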
My layman's question, after reading your posts, is: why not just keep Python pythonic and Self selfish?
In other words, Python already has a very strong paradigm, very simple to learn. Why mess with that? I guess I'm a purist in that regard. But that doesn't make me a language bigot, as I fully respect the right of other languages to define themselves around equally strong, yet different, paradigms. I'm also biased towards OO.
I don't see the strong paradigm you do. Python has a bunch of under-appreciated functional constructs that people use a lot, but since they don't look Lispy people don't realize they are functional.
Python as it is solves whole classes of problems well (text processing, file handling, sockets, gluing C libraries, but lots more). Still, it has weaknesses, especially for entry-level GUI programming. Compare it to HyperCard which met many of CP4E's goals twenty years ago: From: http://en.wikipedia.org/wiki/HyperCard "HyperCard was a huge hit almost instantly. Many people who thought they would never be able to program a computer started using HyperCard for all sorts of automation and prototyping tasks, a surprise even to its creator." Or from: http://www.wired.com/news/mac/0,2125,54365,00.html "The result is both simple and powerful. Fifth-graders get to grips with HyperCard in minutes, building databases of their Pokemon cards. Cyan, the game publisher, used it to create fiendishly complex games like Myst and Riven."
I really got to get me a copy of HyperCard, but I still can't quite decide what's important about it. Anyway, it's quite possible that HyperCard is best seen as a constrained object model, probably much more constrained than prototype programming in general. Where all your objects are instances of Card, for instance.
In this case, here are two big arguments for "Self" and a prototype oriented mindset. One issue it tries to address is something that by now you may take for granted but may still be a stumbling block for new programmers, which is how people usually reason from particular to general. And so they may be better off learning to program a particular object than to program a class of objects. So, in Logo, teaching "Fred" how to draw a spiral may be easier than thinking about a "Fred" class of objects that all draw spirals.
Why not just use a spiral() function? Specifically here, but also generally. Functions are nice.
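A `spiral()` function fits the Logo example directly. To keep this runnable without a display, the sketch uses a tiny headless `Turtle` stand-in (invented here; not the stdlib `turtle` module, which needs Tk) that just records positions:

```python
import math

class Turtle:
    """A tiny headless stand-in for a Logo turtle: it records its path."""
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0
        self.path = [(0.0, 0.0)]

    def forward(self, distance):
        self.x += distance * math.cos(math.radians(self.heading))
        self.y += distance * math.sin(math.radians(self.heading))
        self.path.append((self.x, self.y))

    def left(self, degrees):
        self.heading += degrees

def spiral(turtle, step=1.0, angle=20, turns=50):
    """Make any particular turtle draw a spiral -- no subclass required."""
    for i in range(turns):
        turtle.forward(step * (i + 1))
        turtle.left(angle)

fred = Turtle()
spiral(fred)   # 'fred' the particular object gets the behavior
```

The function works on the one concrete object the learner already has in hand, which is the particular-before-general point: no "class of spiral-drawers" needs to be designed first.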
The second big area is the "fragile base class problem", or more generally the issue that people want to customize their images or programs, yet that messes up anyone else who wants to import the code.
Functions are practically the pinnacle of reusability. Of course, with *only* functions it's not that easy to do work, lambda calculus aside. Maybe records (objects with no methods), plus functions, extended with generic functions for smart polymorphism?
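The records-plus-generic-functions combination suggested here can be sketched with stdlib tools: `dataclass` for the records and `functools.singledispatch` for the generic function. The shape types and `area` function are invented for illustration:

```python
from dataclasses import dataclass
from functools import singledispatch

# Records: plain data, no methods.
@dataclass
class Circle:
    radius: float

@dataclass
class Rect:
    w: float
    h: float

# A generic function: polymorphic dispatch lives in the function,
# not in a class hierarchy the shapes must inherit from.
@singledispatch
def area(shape):
    raise TypeError(f"no area rule for {type(shape).__name__}")

@area.register
def _(shape: Circle):
    return 3.14159 * shape.radius ** 2

@area.register
def _(shape: Rect):
    return shape.w * shape.h
```

New behavior is added by registering another implementation of `area`, and new data by defining another record; neither requires touching, or subclassing, a base class, which is exactly the fragility the thread is worrying about.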
(**) For another example, if you look at it from a documenting-intent point of view, Smalltalk's keyword syntax (e.g. Window showAtX: 10 y: 20." ) makes reading programs much easier than reading functional syntax where arguments have no hints (e.g. "Window.show(10,20)" ) and reading programs is what most programmer's spend most of their time doing. :-) So, if you really want programmers to have happy lives, teach everyone keyword syntax. :-) Of course, I am living with Python's functional syntax in order to gain some of the other benefits of it, but that doesn't mean I am happy about that aspect of it, or that I think it is good for beginners either. Actually, another Python wart is that variable names in a function definition become part of the API (e.g. "myfunction(x=10, y=20)" which is a gross violation of the notion that a function definition, including local variables, should be independent of specifying how it is called.)
File layout becomes the module structure too. A little bit of a problem, but not a huge one. I think py3k's extended signature syntax improves this.
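The extended signature syntax referred to here did eventually land in Python 3: a bare `*` in the parameter list marks everything after it as keyword-only, so parameter names are part of the API on purpose rather than by accident. A minimal sketch, with an invented `show` function:

```python
# Parameters after the bare * must be passed by keyword; the names are
# deliberately and visibly part of the interface.
def show(*, x, y):
    return (x, y)

show(x=10, y=20)    # OK

try:
    show(10, 20)    # positional call is rejected
except TypeError:
    pass
```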

On 5/26/06, Ian Bicking <ianb@colorstudy.com> wrote:
I think there are two issues: classes and inheritance. I don't think inheritance is a particularly respected structure in Python.
???
Inheritance is a sometimes useful implementation technique. I think it's wrong to think of it as more than that. Probably we need better support for some other competing techniques.
Composition is a commonly used competing technique. Then of course we're free to just write functions, outside of any class. In my thumbnail of language history, I talk about (1) wild west spaghetti code (2) structured programming (3) OO (4) design patterns. It's very possible to write ultra-confusing OO code, so in some sense 1 is to 2 as 3 is to 4 (structured programming reformed the spaghettifiers, while emphasis on DP corrals and controls the undisciplined OOers). I'm sort of leaving out the evolution of functional programming in the above. Kirby
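Composition as a competing technique can be shown in a few lines. The `Car`/`Engine` pairing is the standard textbook illustration, invented here rather than taken from the thread:

```python
class Engine:
    def start(self):
        return "engine started"

class QuietEngine(Engine):
    def start(self):
        return "engine started silently"

class Car:
    """Composition: a Car HAS an Engine rather than IS an Engine."""
    def __init__(self, engine=None):
        self.engine = engine or Engine()   # swappable collaborator

    def start(self):
        # Delegate to the part; no inheritance relationship needed.
        return self.engine.start()
```

Behavior changes by plugging in a different part (`Car(QuietEngine())`) instead of by subclassing `Car`, which keeps the class hierarchy shallow and the coupling explicit.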

kirby urner wrote:
On 5/26/06, Ian Bicking <ianb@colorstudy.com> wrote:
I think there are two issues: classes and inheritance. I don't think inheritance is a particularly respected structure in Python.
???
Inheritance as an abstraction used for communication. In other words, giving someone a class and telling them to subclass it. Also, paying attention to the class hierarchy when consuming objects is discouraged. Actually, paying any attention to the class when consuming an object is discouraged.
Inheritance is a sometimes useful implementation technique. I think it's wrong to think of it as more than that. Probably we need better support for some other competing techniques.
Composition is a commonly used competing technique. Then of course we're free to just write functions, outside of any class.
Yes, that works great ;)

On 5/26/06, Ian Bicking <ianb@colorstudy.com> wrote:
Inheritance as an abstraction used for communication. In other words, giving someone a class and telling them to subclass it. Also, paying attention to the class hierarchy when consuming objects is discouraged. Actually, paying any attention to the class when consuming an object is discouraged.
Several important standard library modules give you the classes to inherit from, and invite you to write your own methods. HTML and XML parsing for example. I don't see where Python discourages subclassing. Then I think you're talking about duck typing, i.e. if it behaves a certain way, it's OK to pass, and the type checking police won't be strict about it, like in Java. That I understand. However, I wouldn't call that discouraging -- I find it encouraging myself. Too much type checking is a pain, and is what discourages Java use. Kirby
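The subclass-and-override pattern these stdlib modules invite looks like this (shown with the Python 3 spelling `html.parser`; the `LinkCollector` class is an invented example):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Subclass the stdlib parser, overriding only the handlers you need."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkCollector()
parser.feed('<p><a href="http://python.org">Python</a></p>')
```

The base class drives the parse and calls back into your overridden methods, so the framework's "invitation to subclass" here is doing real work.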

kirby urner wrote:
On 5/26/06, Ian Bicking <ianb@colorstudy.com> wrote:
Inheritance as an abstraction used for communication. In other words, giving someone a class and telling them to subclass it. Also, paying attention to the class hierarchy when consuming objects is discouraged. Actually, paying any attention to the class when consuming an object is discouraged.
Several important standard library modules give you the classes to inherit from, and invite you to write your own methods. HTML and XML parsing for example.
In comparison, ElementTree does such parsing without ever subclassing. Those specific modules were inspired by SAX and other libraries that came from the Java or Smalltalk worlds. I think over time we've found better ways to do these things.
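The contrast with the subclass-based parsers is visible in how little code ElementTree needs: parse into a tree, then walk it with ordinary calls, no handler class at all (the XML snippet is invented for illustration):

```python
import xml.etree.ElementTree as ET

# No handler subclass: build the whole tree, then query it directly.
doc = ET.fromstring(
    "<catalog>"
    "<book title='Myst'/>"
    "<book title='Riven'/>"
    "</catalog>"
)

titles = [book.get("title") for book in doc.findall("book")]
```

All the polymorphism lives in plain functions and data access rather than in an inheritance relationship between your code and the library's.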
I don't see where Python discourages subclassing.
As a language, no. As a practice, *I* certainly would discourage it except when used internally as an implementation detail. Where I do use it, I usually dislike the result. I think this is a shift that's happened many places in the Python world, though I'm sure there are others who would disagree.
Then I think you're talking about duck typing, i.e. if it behaves a certain way, it's OK to pass, and the type checking police won't be strict about it, like in Java. That I understand. However, I wouldn't call that discouraging -- I find it encouraging myself. Too much type checking is a pain, and is what discourages Java use.
When working with objects, yes, you shouldn't care about their class. At which point it's up to the implementor whether they subclass an existing class or recreate the functionality. I think this recreation is often the right choice, even when it means code duplication. -- Ian Bicking | ianb@colorstudy.com | http://blog.ianbicking.org

As a language, no. As a practice, *I* certainly would discourage it except when used internally as an implementation detail.
I agree that the DP (design pattern) generation is making better sense of when to use and when not to use subclassing. O'Reilly's 'Head First Design Patterns' is a good example. I plugged it in my Saturday Academy class, along with TAOCP, NKS, and Zelle's Python intro. Here're some negative comments about this book that I liked, to help us stay well rounded: http://www.codinghorror.com/blog/archives/000380.html Kirby
participants (4)
- Dethe Elza
- Ian Bicking
- kirby urner
- Paul D. Fernhout