On March 12 of this year, I participated as a middle school judge for the Northwest Science Expo held on the campus of Portland State University, in Portland, Oregon. I was assigned to the team evaluating middle school behavioral science projects along with 8 other scientists and engineers. We evaluated 21 student projects.

This was a great group of kids and I want to be clear that this is in no way a criticism of any of their work. I believe that the judges took much effort to value and encourage the students. But as is often the case, a world view can be shattered by an example of something outstanding, which then shows all else to be mediocre. I had one of those experiences at NWSE, and it relates to the use of computers in education. I'd be interested in any comments anyone in this group has to offer.

The outstanding example in this case was a project put together by a young lady in the fourth grade (who was, by the way, competing against 6th-8th graders). Her project investigated the effects of varying light/dark cycle periods on bean plant growth. Her experimental design and investigation were reasonably thorough, particularly for someone her age, but not outstanding by comparison to those she was competing with.

What struck me as outstanding, however, was how she related to her data. When I asked what conclusions she drew from her experiment, she made an expansive hand motion, one of those "it should be clearly obvious to the most casual observer" gestures, toward her hand-drawn data plots. With wide-eyed excitement, she explained how her data clearly proved her original thesis incorrect. It was clear that she had fully internalized her data. In this, she stood apart from her peers.

By contrast, most other students I spoke with related to their graphs and charts as fuzzy abstract representations of their data. Some examples:

* More than half of the students produced three-dimensional bar charts to display data.
  Unfortunately, only one student was able to accurately read their bar charts. The others failed to account for the parallax between the bar and the back of the graph which represented the X scale.

* One young man was analyzing the effect of color on mood. But when he reported his results, his graph was color-coded for color. In other words, his red bar represented the color black, the pink bar represented yellow, and the yellow bar represented green. When asked about this, he said his software wouldn't let him pick the colors of the bars in his bar chart.

* An 8th grader noted that her results did not vary significantly from what would have been predicted from a purely random sampling. This was a very enlightened analysis, and the judges all gave her credit for attempting to utilize good statistical analysis. But the statistical theory she was attempting to utilize was clearly well beyond her knowledge and skill level. When asked why so much of her project display was focused on the statistical analysis of her data when she admitted she didn't really understand it, she stated, "Well, Excel did all the math for me."

* A sixth grader did a project to determine which parent the gene for red hair was inherited from. Her knowledge of genetics was impressive, particularly for her age. But the computer-generated graphs she displayed conveyed no meaning to any of the judges. Had she displayed and focused on the genealogy charts she had stashed in the back of her project notebook, she probably would have won an award.

* There were two students who had a very good project on luminescent solutions and the effects of solution temperature on luminescence. While their presentation was very good, what I found most notable in speaking with them was the rate-of-change graph they had scribbled on the back of a piece of yellow legal pad to explain the cascade effect they had observed. Their scribbled graph conveyed more meaning than all their beautiful computer-generated charts.
Questions to ponder:
--------------------

* Does the use of the computer to generate charts enhance the synthesis of data, or might it actually hinder it? In this case, the fourth grader who was determining environmental effects on plant growth collected her data by drawing points on her graph each day during the experiment. At the science fair, she was able to articulate the meaning of that data better than other middle school students I spoke with. The only other students who showed equivalent understanding did so by utilizing a scribbled graph drawn on the back of a legal pad. So is there a correlation between the manual plotting of data and the cognitive interpretation of that data?

* Gerald E. Jones in "How to LIE with CHARTS," on the "old-fashioned" approach to charting, mused:

      Having such crude tools might have forced those early chart-makers
      into slower thought processes. It is conceivable that they actually
      pondered carefully the composition--maybe even the content!--of
      those pathetically simple charts and graphs. Can it be that in
      their technological poverty they achieved a higher level of
      consciousness? Did they actually come to grasp the meaning of
      their graphic creations? [1]

  With the proliferation of three-dimensional bar charts that even their creators cannot accurately read, is the real meaning of the data being masked by the visual presentation?

* I know from experience that I can take someone who is generally computer-phobic but who has a good understanding of statistics, sit them down in front of Excel, and within a few hours they can be using it productively. On the other hand, as I observed, an 8th grader who knows Excel inside and out cannot do meaningful statistics. So why do so many people insist that schools teach Excel, or Word, or PowerPoint, for example? Shouldn't the real focus be on teaching number theory, problem solving, language composition and cognitive perception?
I'm beginning to wonder how much benefit computers actually add to the learning process. I'd welcome a discussion on the topic by those who are effectively utilizing computers in their curriculum today.

Thanks for the bandwidth.

Tom O.

-----
Thomas O'Connor  toconnor@vcd.hp.com
Hewlett Packard, Vancouver, Washington
Phone: (360) 212-5031  Telnet: 212-5031

[1] "How to LIE with CHARTS," Gerald E. Jones, Sybex, 1995, p. XVII. See also: "The Visual Display of Quantitative Information," Edward Tufte, Graphics Press, 1983
> So why do so many people insist that schools teach Excel, or Word, or
> Powerpoint for example? Shouldn't the real focus be on teaching number
> theory, problem solving, language composition and cognitive perception?
>
> I'm beginning to wonder how much benefit computers actually add to the
> learning process. I'd welcome a discussion on the topic by those who
> are effectively utilizing computers in their curriculum today.
>
> Thanks for the bandwidth.
>
> Tom O.
This is a good analysis and "connect the dots" presentation. I might draw different conclusions though, e.g. the ineffective use of computer technology points to a need for more, not less, training in its use. Perhaps CP4E offers a better approach, in that it gets students more deeply under the hood of their computers, vs. having them rely so entirely on "wizard"-style apps which "do everything for you". They learn to drive with a manual shift first.

Tufte's book (which you mention) is of course the classic in this field. He takes a no-nonsense approach and advocates eliminating "chart junk" -- lots of extra noise and clutter which maybe appears "slick" to the aesthetically uninitiated, but which actually gets in the way of communicating the data. Given computers, there's a need to bring this "plain speech" aesthetic into the classroom, as an antidote to all the bells and whistles which students are tempted to include simply because they can.

I think cutting out the computer training leaves students more helpless than ever, when suddenly plunked down in front of Microsoft Office XXXX and told to produce "presentation graphics" for the boss. If only they'd had a good teacher in high school who (a) showed them the ins and outs and (b) laid down some sensible aesthetic guidelines.

Kirby
Thomas,

Thank you for an interesting post. My first reaction is that your anecdotes powerfully reinforce why people here, and elsewhere, have been arguing so strongly in favor of a new 'computer literacy' as opposed to mere computer use. Use of MS Excel by itself will lead to a dumbing-down effect perhaps as many times as it has the opposite. Microsoft's philosophy in general is "we do it all for you - sit back and munch popcorn in your La-Z-Boy!" This is why so many programmers and artists fear and mistrust them. The same may be true for any packaged tool which comes without guidance, experience or understanding.

And it is participative experience, the essence of an active and immersive quality in life, which is the crucial difference. 'Experience' these days is a word which has been wholesale drugged, gagged, beaten and kidnapped by marketeers. Making things, doing things with one's own mind, eye, hand, brain.. experimenting, making mistakes, getting lost, finding one's way, trying again, succeeding.. that is experience.

I argue:

#1 It does NOT matter what tools one uses - sticks and sand, pencil and paper, Python and laptop. It's how you use them that counts.. [pun by accident] You said it yourself - she had internalized the data and the insight. Had the kids who used Excel instead been taught the foundation of programmatic thinking using 'hands-on' CLI and visualization tools like Python _before_ being turned loose on MS Excel with its black-box, counter-intuitive, packaged magic charting 'wizards', they might have surprised you even more! Would you ask them to run a 3.x-minute mile before they can walk? [-> see #2 below]

#2 It DOES matter what tools one uses. It's how you use them that counts.. Tools which are open, which let one explore, tools which let one build one's own tools and use them how one wants.. These are good tools to use for learning. Computers are among the most adaptable tools ever invented, because they are tool-building tools. One can thus grow with them and adapt them to one's learning. [-> see #1 above]

If people [schools] put computers, and dependency upon them, at the focus, and then ask everyone to adapt to them, then a terrible mistake and disservice is being made. However, if children and learning are the center, and we teach how to use and adapt these soft tools [computers] to each other's needs [programming], then I believe we will see a flowering of human expression and intelligence which will make the Renaissance look like kindergarten!!

- Jason

___________________________________________________________
Jason CUNLIFFE = NOMADICS['Interactive Art and Technology']

----- Original Message -----
From: "Thomas O'Connor" <toconnor@vcd.hp.com>
To: <edu-sig@python.org>
Cc: <toconnor@hpvcpto.vcd.hp.com>
Sent: Thursday, April 19, 2001 6:39 PM
Subject: [Edu-sig] Observations from the Northwest Science Expo
> On March 12 of this year, I participated as a middle school judge for
> the Northwest Science Expo held on the campus of Portland State
> University, in Portland, Oregon. I was assigned to the team evaluating
> middle school behavioral science projects along with 8 other scientists
> and engineers. We evaluated 21 student projects.

...snip interesting anecdotes and concerned questions...
I'll have to go against Kirby here (sort of). I think this is a sign that computers have become fetish items which distract from teaching the basics.

Let me use the metaphor of a calculator. A calculator can easily help you add, subtract, multiply, and divide, and helps to reduce errors with these operations. But when you try to do anything more advanced, such as algebra, the calculator doesn't really help unless you actually understand what you are doing. The difference with the computer (especially with the wizard-driven Excel graphs) is that the computer can still *look* like it's doing something for you even when you don't understand what you're doing.

I think we need to teach computers in school, but a) we cannot let them become fetishes, b) we cannot let them detract from the essentials of what is being taught, and c) we need to use computers for what they're good for (and that's a tough call).

--Dethe

Dethe Elza
Chief Mad Scientist
Burning Tiger Technologies
At 06:38 PM 4/19/2001 -0700, Dethe Elza wrote:
> I'll have to go against Kirby here (sort of). I think this is a sign
> that computers have become fetish items which distract from teaching
> the basics.
It's just that I think learning *some* programming needs to be one of those basics. In my version of TomorrowLand, you really are no-kidding illiterate in a fundamental way if you have no programming skills whatsoever. Likewise, to know some programming is nothing to boast about (any more than knowing your alphabet) -- it's just one of those things, like riding a bike (doesn't mean you don't feel proud at first, but this is no way to stand out from the crowd -- this is all a logical consequence of what CP4E really means, IF we take it seriously).

So let's talk about the 4 R's: Reading, wRiting, 'Rithmetic and pRogramming. Or, to keep it 3, just make sure 'Rithmetic is understood to include some of that command line stuff. Or conflate it with wRiting if you prefer. Any way you slice it, this is basic, fundamental, critical stuff. To not teach it when the technology is freely available is truly idiotic (just to be polemical about it).

That being said, sure, maybe draw lots of your charts and graphs by hand and leave the Excel wizards out of it. Programming doesn't mean making more work than you need to. When I want to put a political cartoon on the web, I freehand it (with a pen, not a mouse or lightpen). No need for 3DStudio Max or Adobe Illustrator. But then I scan and FTP it -- finding the right _mix_ of technologies (paper and pencils are technology too, as are quill pens) is what's important.

Kirby
On Thu, 19 Apr 2001, Dethe Elza wrote:
> Let me use the metaphor of a calculator. A calculator can easily help
> you add, subtract, multiply, and divide, and helps to reduce errors
> with these operations. But when you try to do anything more advanced,
> such as algebra, the calculator doesn't really help unless you
> actually understand what you are doing.
Educators must be vigilant against using technology as a substitute for hard thinking. I've seen too many high school students use their calculators to divide a number by 10. Don't even get me started on proportional reasoning...

-Tim

--
Tim Wilson      | Visit Sibley online:         | Check out:
Henry Sibley HS | http://www.isd197.k12.mn.us/ | http://www.zope.org/
W. St. Paul, MN |                              | http://slashdot.org/
wilson@visi.com | <dtml-var pithy_quote>       | http://linux.com/
Wow! I have been away for a few days, and there are follow-on comments I haven't read yet as I clean up a 1000-posting Python backlog, winnowed to the 100 that I intend to read. This is awesome.

First, I learned statistics in college, and yet it wasn't until many years later, when I took on being a statistician for a 29-week training project in which people managed their progress and achievement of their results through statistics, that my relationship changed from a fun game with mathematics and numbers to an experienced measure of something that was happening in the world.

I affirm that the *mechanical* use of computers is not appropriate in education or *anywhere* *else*. I tutored high-school algebra last summer, and I am putting together a team of 10 to provide more for the same school (the one my entire family attended) using fellow alumni from the class of '57. One of the things that I noticed is that the use of calculators and decimal notation leaves kids completely at sea when we get to rationals again as part of algebra (where decimal simplifications are unavailable).

I don't mean to disparage the use of calculators. What I see missing is that intimate relationship with numbers and arithmetic in which we begin to develop a sense of how these things work and can be confident in the manipulations and relationships that the algebra over standard arithmetic then abstracts and makes systematic.

I tutored my 11-year-old grandnephew, a new 6th grader, earlier this school year, and I saw how all he wanted to do was get an answer the teacher would accept. There was nothing about how we learn to deal with our fallibility as arithmeticians, being able to demonstrate our results to others and, indeed, being able to coach a fellow student. (I recently noticed that this is all in Polya's "How to Solve It", but when I examined it as a college freshman I didn't get it.)

Getting back to the key point.
Having an intimate relationship with the data, and having a way to visualize the data's story born from that familiarity, is crucial. When I worked for engineers at Boeing as a 19-year-old engineering aide, it was amazing to see their sense of the data and the automatic way their attention was drawn to things that looked off and needed to be explored more deeply. There was usually an error in the calculations, in some input data, or in the rough model we were perfecting. I had the same experience working with Paul Horst, a serious statistician at the University of Washington at that time.

Don't let go of what you observed. It is critical and much to be cultivated if we are to encourage the future engineers, managers, and leaders whom we are going to entrust with shepherding the technologies of the future.

The biggest concern I have is the lack of experience teachers may have with this themselves, and how they are being thrown into a world of computer use without any way to get grounded in where the power really is. It does not empower them to teach, and it does not give them a sense of the wonder and power of scientific approaches. More than anything, there is a missing compassion for human fallibility, and the use of tools that mask that rather than give us mastery of it.

Thank you for your thoughtfulness and careful observation.

-- Dennis

-----Original Message-----
From: edu-sig-admin@python.org [mailto:edu-sig-admin@python.org] On Behalf Of Thomas O'Connor
Sent: Thursday, April 19, 2001 15:39
To: edu-sig@python.org
Cc: toconnor@hpvcpto.vcd.hp.com
Subject: [Edu-sig] Observations from the Northwest Science Expo

...snip full text of the original post, quoted in its entirety above...

_______________________________________________
Edu-sig mailing list
Edu-sig@python.org
http://mail.python.org/mailman/listinfo/edu-sig
I like to imagine a fictional but possible world in which computer programming for everybody (CP4E) means smoother integration of math and computer science in the lower grades. In the real world, there's a push to make the teaching of algebra more widespread at the 8th grade level. What I imagine is that this is the grade level where we might formally begin using some programming language (e.g. Python, Scheme) in the classroom -- in the sense of having students look at and write source code. In 7th grade, they might watch the teacher project some command line stuff, but getting under the hood wouldn't happen quite as much.

A link between programming and algebra is in this concept of variables. A polynomial of the 2nd degree is typically written as Ax^2 + Bx + C, which in Python is more like A*x**2 + B*x + C. The capital letters are called constant coefficients and they "fix" a polynomial, give it its characteristic "call letters" (like in radio: this is KQPD...). Then x is what varies -- you might stipulate over some domain, e.g. x in [-10,10] (square brackets mean inclusive).

At the command line, 8th graders would have one kind of function called a "polynomial factory" that turned out polynomials with a specific set of coefficients. These would then be floated as functions in their own right, ready to take in x domain values and spit out f(x) range values.

There may be a better way to write the factory function than I've shown below. I'd like to see other solutions:
    def makepoly(A,B,C):
        """ Build a polynomial function from coefficients """
        return eval("lambda x: %s*x**2 + %s*x + %s" % (A,B,C))
    >>> f = makepoly(2,3,4)    # pass coefficients as arguments
    >>> f(10)                  # f is now a function of x
    234
    >>> 2*10**2 + 3*10 + 4     # check
    234
    >>> f(-10)
    174
    >>> [f(x) for x in range(-10,11)]   # remember, 2nd arg is non-inclusive
    [174, 139, 108, 81, 58, 39, 24, 13, 6, 3, 4, 9, 18, 31, 48, 69, 94,
     123, 156, 193, 234]
    >>> g = makepoly(1,-2,-7)  # make a new polynomial
    >>> g(5)
    8
    >>> g(f(5))                # composition of functions
    4616
    >>> f(g(5))                # f(g(x)) is not equal to g(f(x))
    156
The same technique might be applied to a sinewave function. Here the coefficients may appear as follows:

    f(x) = A sin(Bx + C) + D

You can imagine doing the factory function, based on the above example, or using some (better?) strategy.

Below is a web resource that implements some of these same ideas, also using a computer language, but to my eye it all looks less intuitive in the J language (which is what the text features). Is it just me? Is this just because I've spent more time with Python? I'm sure that's partly it.

http://www.jsoftware.com/pubs/mftl/mftl.htm

Moving beyond 8th grade, we want students to understand what's meant by D(f(x)) at point x, i.e. dy/dx at x -- the derivative. Again, Python makes this easy in that we can write a generic derivative taker:
    def deriv(f,x):
        """ Return approximate value of dy/dx at f(x) """
        h = .0001
        return (f(x+h)-f(x))/h
    >>> deriv(f,2)   # f(x) = 2*x**2 + 3*x + 4
    11.000200000026439
    >>> deriv(g,2)   # g(x) = x**2 - 2*x - 7
    2.0001000000036129
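[Editor's note: the sinewave factory suggested above might be sketched as follows. This is an illustration in the spirit of makepoly, not code from the original post; it uses a plain closure, which works in Pythons with nested scopes (standard since 2.2), rather than the eval trick.]

```python
import math

def makesine(A, B, C, D):
    """Factory for f(x) = A*sin(B*x + C) + D, by analogy with makepoly."""
    return lambda x: A * math.sin(B * x + C) + D

f = makesine(2, 1, 0, 3)       # f(x) = 2*sin(x) + 3
print(f(0))                    # sin(0) = 0, so this prints 3.0
print(f(math.pi / 2))          # sin(pi/2) = 1, so about 5.0
```

The same "coefficients in, function out" pattern carries over unchanged; only the formula inside the lambda differs.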
If you remember how to take the derivative of a polynomial, you'll know that f'(x) = 4*x + 3 and g'(x) = 2*x - 2 -- so these are pretty good approximations.

Somebody challenged us on math-teach to think of how we might teach crypto and number theory to 7th graders. My suggestions are in this thread (Number Theory question, reply to Wayne Bishop, 25 Apr 2001): http://www.mathforum.com/epigone/math-teach/zilbangkan  I use the IDLE CLI quite a bit, along with my ciphers.py

Another post re Python + Intro Calculus, as per the above, is at: http://www.mathforum.com/epigone/math-learn/wonbloryeh

I think any of us who know some Python, and some math, can write this kind of stuff. It's not esoteric. I'd just like to see more of it, is all, coming from a lot of different corners. That's because I think a CP4E world would have some major advantages over the current one, including more students motivated to comprehend technology at deeper levels. (I think adding more computer stuff to math will make it less of a turn-off for many students, if done in a cool and interesting way -- which doesn't have to mean super glitzy at every turn; a bare-bones CLI a la IDLE can be part of it.)

Kirby
Kirby Urner writes:
> A link between programming and algebra is in this concept of variables.
> A polynomial of the 2nd degree is typically written as Ax^2 + Bx + C,
> which in Python is more like A*x**2 + B*x + C. The capital letters are
> called constant coefficients and they "fix" a polynomial, give it its
> characteristic "call letters" (like in radio: this is KQPD...). Then x
> is what varies -- you might stipulate over some domain, e.g. x in
> [-10,10] (square brackets mean inclusive).
This point is very subtle. I'm afraid that a lot of adults still couldn't tell -- for example -- the difference between an expression like "x+3" and an equation like "y=x+3". Mathematicians tend to get very comfortable with higher-order functions, "function factories", operations on functions, and the like. Unfortunately, it seems that other people often don't.

If we see in a math book

    Consider the equation y = Ax + B

there's already a good deal of sophistication: first, "y = Ax + B" is _not_ a general fact (or theorem) which is true in general, without context or qualification. In this case it's a hypothesis, or, some would say, a statement of a subworld in which this _is_ an axiom or a theorem (one of many possible such subworlds contained as logical possibilities within our own world). (This might be confusing because _some_ formulas, like those in physics or like the quadratic formula, _are_ general -- and are something to memorize and to apply in a whole class of situations. Whereas here, this equation is not a great truth about nature, but just an assumption which we make temporarily to see what would follow from it.)

But also, A and B were not specified explicitly. So we're told that "A and B are numbers" and "x and y are numbers" -- but they have different roles with respect to the situation. A, B, x, and y are _all_ unspecified, but A and B are "constants" and x and y are "variables"; yet later on we may make a further hypothesis: "Suppose x is 3 -- then what happens?" (One kind of answer is "Then y will be 3A + B".)

Isn't it funny, from the point of view of a beginning algebra student, that a "constant", which is supposed to represent a _particular number_, is still often given in symbolic form, and we often never learn what the constant actually stood for? That leads to yet another conceptual problem: _do_ variables always actually "stand for" some particular quantity?
At the very beginning of algebra, as I experienced it, the answer was an unqualified "yes": each letter is just a sort of "alias" or "code name" or something for a _particular quantity_, and it is our job to find that particular quantity and so "solve the problem". This completely glosses over the possibility of underdetermined equations, which simply describe relationships between quantities (some mathematicians like to think of functions, or mappings between sets).

If we say y=f(x) -- without giving other simultaneous equations -- we have an underdetermined system, and there is no longer a meaningful question of "What is x?". x is not anything in particular; x is really a signifier for the entire domain of a function. But, interestingly, if we add more equations to a system, it may become determined. In that case, we are asking "What are the particular values of the variables x, y, z for which all of these things could be true at the same time?" or "If all of these constraints were applied at once, what possibilities would be left?" or "What is the intersection of the geometric objects corresponding to the loci of points satisfying each of these relations?".

In the geometric interpretation, "y = Ax^2 + Bx + C" is actually an equation representing a shape in five-dimensional space, one equation in five unknowns. When we study a particular quadratic, we are intersecting that shape with the shapes given by three other equations: A = [some given constant], B = [some given constant], and C = [some given constant]. Then, by substitution, we could turn this, if we choose, into one equation in two unknowns (the familiar quadratic). But some people would prefer to say that all five dimensions are still there -- we are just looking at a particular region, where some other conditions specific to our problem happen to obtain.
The geometric interpretation of any equation (or inequality or other kind of relation) as expressing something about a subset of a space with a huge number of dimensions (one per variable) is something that shows up in explicit detail in linear algebra, but before that is only hinted at. Some Algebra II textbooks mention a bit about "so many equations in so many unknowns", and substitution, and maybe even determinants, but there isn't the strong geometric sense of "there are an infinite number of dimensions of space, and by writing mathematical relations we choose to focus our attention on intersections or disjunctions or whatever of subsets of that space". And I don't know whether the multidimensional spatial metaphor is helpful or harmful in 7th grade; if people have read E. A. Abbott, it will at least be _exciting_ to them. Once I studied Scheme and lambda and environments in _SICP_, I felt much more comfortable about all of this. Here programming can help a great deal, I think. But I wonder how many algebra students can't really see what's going on and what the actual roles of those letters are.
At the command line, 8th graders would have one kind of function called a "polynomial factory" that turned out polynomials with a specific set of coefficients. These would then be floated as functions in their own right, ready to take in x domain values and spit out f(x) range values.
There may be a better way to write the factory function than I've shown below. I'd like to see other solutions:
def makepoly(A, B, C):
    """ Build a polynomial function from coefficients """
    return eval("lambda x: %s*x**2 + %s*x + %s" % (A, B, C))
If Python's variable scope rules didn't prevent it, return lambda x: A*x**2 + B*x + C would be much easier to read, because it would avoid the format string stuff, which is _not_ so intuitive unless you're already a C programmer. Earlier today I wrote "%02x" without flinching, so "%s" is second nature. But if you're trying to get students to understand how the lambda is working, "%s" and the tuple may add a lot of confusion which it would be nice to be able to avoid.
>>> f = makepoly(2,3,4)      # pass coefficients as arguments
>>> f(10)                    # f is now a function of x
234
>>> 2*10**2 + 3*10 + 4       # check
234
>>> f(-10)
174
>>> [f(x) for x in range(-10,11)]   # remember, 2nd arg is non-inclusive
[174, 139, 108, 81, 58, 39, 24, 13, 6, 3, 4, 9, 18, 31, 48, 69, 94, 123, 156, 193, 234]
>>> g = makepoly(1,-2,-7)    # make a new polynomial
>>> g(5)
8
>>> g(f(5))                  # composition of functions
4616
>>> f(g(5))                  # f(g(x)) is not equal to g(f(x))
156
I still think that some of my old math teachers would be floored if they could see some of these applications.
Moving beyond 8th grade, we want students to understand what is meant by D(f(x)) at point x, i.e. dy/dx at x -- the derivative. Again, Python makes this easy in that we can write a generic derivative taker:
def deriv(f, x):
    """ Return approximate value of dy/dx at f(x) """
    h = .0001
    return (f(x+h) - f(x)) / h
>>> deriv(f,2)    # f(x) = 2*x**2 + 3*x + 4
11.000200000026439
>>> deriv(g,2)    # g(x) = x**2 - 2*x - 7
2.0001000000036129
If you remember how to take the derivative of a polynomial, you'll know that f'(x) = 4*x + 3 and g'(x) = 2*x - 2 -- so these are pretty good approximations.
I would add an h parameter, with a default value:

    def deriv(f, x, h=0.0001):
        return (f(x+h) - f(x)) / h

Another possibility is to take the right-limit numerical derivative and the left-limit numerical derivative:

    def left_deriv(f, x, h=0.0001):
        return (f(x) - f(x-h)) / h

Contrasting the two shows (among other things) that numerical methods are imprecise, and that the kind of error you get may depend on where you take your samples. (The magnitude of the error will depend on the magnitude of h, but the sign of the error will depend on which side you take the derivative on.)

I think I would have been happy to have been given some numerical methods before I knew calculus, and then to learn a precise way to do it. (If I had programs saying that the slope of something is 2.0000001, I would like to know how to prove that my programs aren't quite right, and that the real slope is exactly 2.) Python is a good language for this; I have written concise Riemann sum and Monte Carlo programs which find the area under a circle, to illustrate for students how those techniques work. My Riemann sum program is 17 lines and my Monte Carlo program is 51 lines; I think they're readily readable for a non-programmer.

The current situation is definitely the reverse -- you get symbolic differentiation and integration first, and numerical methods on a computer later on. In fact, because I wasn't going into engineering, I never had the numerical methods course, and I still don't know a lot of the efficient ways of doing numerical integration of differential equations and so on. I wonder whether "symbolic first, numerical later" comes from the historical lack of availability of computers, or whether it's intended to promote detailed understanding. I know a lot of math teachers were very unhappy with graphing calculators because of the extent to which their approximations can substitute for proof.
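The 17-line Riemann sum program and 51-line Monte Carlo program mentioned above aren't reproduced in this thread, but a minimal sketch of each technique (my own reconstruction, not the original programs) might look like this:

```python
import math
import random

def riemann_quarter_circle(n=100000):
    """Midpoint Riemann sum for the area under sqrt(1 - x**2) on [0, 1]."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h             # midpoint of the i-th strip
        total += math.sqrt(1 - x*x) * h
    return 4 * total                  # 4 * quarter-circle area approaches pi

def monte_carlo_circle(trials=100000):
    """Throw random darts at the unit square; count hits inside the circle."""
    hits = 0
    for _ in range(trials):
        x, y = random.random(), random.random()
        if x*x + y*y <= 1:
            hits += 1
    return 4.0 * hits / trials        # hit fraction estimates pi/4
```

Both estimates converge toward pi, the Riemann sum far faster than the Monte Carlo one -- which itself is a useful observation for students comparing the two techniques.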
You would see algebra tests where a teacher would say "solve this system" or "solve this polynomial" and students would pull out graphing calculators and get an answer like "2.003" and write "x=2". So one obvious problem is that they weren't solving anything -- they were just typing the equations into the graphing calculator, and it would find intersections or roots by iterative approximation methods. Another problem is that students were seeing an answer that looked plausible and believing it without any proof. So if x "looks like about 2" (in the calculator), students might have little compunction about writing "x=2". But this could be wrong! It's easy to devise problems where a numerical solution is _close to_ an integer or a rational, so if you do an approximation you can easily be misled. Sometimes there is a symbolic method which would give a precise answer in symbolic form (and show that the guess from the approximation is actually incorrect).

On the other side, people who do work with applied mathematics nowadays often spend a lot of time writing programs to do numerical approximation -- numerical integration and differentiation, numerical approximate solution of linear systems, numerical approximation of roots of polynomials, and dynamic or statistical simulation. I met lots of experimental physicists who did simulation work and who weren't trying to find any kind of symbolic form for anything -- they wanted _numbers_! Crunch, crunch, crunch. So some people might say that, if a good deal of mathematics is done (as it is) with computer approximation and simulation, then getting people started early with experimenting with computer approximation and simulation is a good thing. The question is whether there is necessarily a loss in understanding of theory when people are taught these numerical techniques early on. (Who wants to do symbolic integration to figure a definite integral when your TI calculator has FnInt?)
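To make that "looks like about 2" trap concrete, here is a toy example of my own (not from the original post): a bisection root-finder whose answer would round to 2 on a calculator display, even though x = 2 is not a root at all.

```python
def bisect(f, lo, hi, tol=1e-10):
    """Find a root of f between lo and hi by repeated interval halving.
    Assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:   # sign change in [lo, mid]: root is there
            hi = mid
        else:                     # otherwise the root is in [mid, hi]
            lo = mid
    return (lo + hi) / 2.0

root = bisect(lambda x: x**3 - 7.99, 1, 3)
print(root)   # about 1.99917 -- close to 2, but writing "x = 2" would be wrong
```

A student who sees "1.99917" (or a display rounding it further) may happily write x = 2, yet 2**3 is 8, not 7.99; only the exact symbolic answer, the cube root of 7.99, is actually correct.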
Of course, you can program symbolic manipulation in a computer too. A very good exercise, which some students at my old high school once got a lot of attention for attacking, is symbolic differentiation -- given a string or a tree representation of a function of one variable, compute a string or a tree representation of its symbolic derivative. The representation just has to be _valid_; you don't have to simplify. This also leads to the interesting question of how you get a string converted into a tree or a tree converted into a string, which is something people are _definitely_ going to encounter in great detail in computer science. Finding the abstract structure of a formal expression, which is to say parsing it, just keeps coming up everywhere. I could have understood a lot about that in high school, if somebody had taught it.

-- 
Seth David Schoen <schoen@loyalty.org>  | And do not say, I will study when I
Temp.  http://www.loyalty.org/~schoen/  | have leisure; for perhaps you will
down:  http://www.loyalty.org/   (CAF)  | not have leisure. -- Pirke Avot 2:5
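A minimal sketch of the symbolic-differentiation exercise described above might look like this. The tuple encoding of expression trees here -- ('+', left, right), ('*', left, right), the symbol 'x', or a number -- is my own assumption for illustration, not something specified in the post, and per the exercise the result is valid but unsimplified.

```python
def d(tree):
    """Return the (unsimplified) derivative of tree with respect to 'x'."""
    if tree == 'x':
        return 1                       # d/dx of x is 1
    if isinstance(tree, (int, float)):
        return 0                       # derivative of a constant is 0
    op, left, right = tree
    if op == '+':                      # sum rule: (f + g)' = f' + g'
        return ('+', d(left), d(right))
    if op == '*':                      # product rule: (f*g)' = f'*g + f*g'
        return ('+', ('*', d(left), right), ('*', left, d(right)))
    raise ValueError("unknown operator: %r" % (op,))

# x*x differentiates to 1*x + x*1 -- valid, just not simplified to 2x:
print(d(('*', 'x', 'x')))   # ('+', ('*', 1, 'x'), ('*', 'x', 1))
```

Turning such trees back into strings (and strings into trees) is exactly the parsing and unparsing problem mentioned above.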
Lots of interesting ruminations here.
> disjunctions or whatever of subsets of that space". And I don't know
> whether the multidimensional spatial metaphor is helpful or harmful in
> 7th grade; if people have read E. A. Abbott, it will at least be
> _exciting_ to them.
(I'm somewhat impatient with this approach of using "higher dimensions" as hype (pun intended) -- getting math to seem "cool" by doing stage magic a la Abbott. But I'm very much in the minority here.)
> Once I studied Scheme and lambda and environments in _SICP_, I felt much
> more comfortable about all of this. Here programming can help a great
> deal, I think. But I wonder how many algebra students can't really see
> what's going on and what the actual roles of those letters are.
Yes, good point. I think it all seems clearer once you've got the computer language to refer to. Just studying the language clears up operating on functions and passing functions as parameters. Python makes this all more concrete, easier to wrap your brain around.
> If Python's variable scope rules didn't prevent it,
>
>     return lambda x: A*x**2 + B*x + C
I think there's something more like this I'm just not getting tonight. The scoping rules did change recently. Surely there's a better way than my eval("%s") stuff, I agree.
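For what it's worth, with nested scopes (optional in Python 2.1 and standard from Python 2.2 on), the closure version does work directly -- a sketch, assuming a recent enough interpreter, with no eval or format strings needed:

```python
def makepoly(A, B, C):
    """Build a polynomial function from coefficients -- closure version."""
    # The lambda captures A, B, C from the enclosing function's scope.
    return lambda x: A*x**2 + B*x + C

f = makepoly(2, 3, 4)
print(f(10))   # 234, same result as the eval-based version
```

This keeps the lambda readable for students, since the coefficients appear in the body exactly as they do in the math.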
> I would add an h parameter, with a default value:
>
>     def deriv(f,x,h=0.0001):
>         return (f(x+h)-f(x))/h
Yeah, good suggestion. There's also a way to compute an optimized h based on analysis of the function itself, but I don't have that at my fingertips.
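One standard improvement in that direction (a textbook numerical-methods trick, not something given in this thread) is the central difference, which samples symmetrically around x so the first-order error terms cancel and the leftover error shrinks like h**2 rather than h:

```python
def central_deriv(f, x, h=0.0001):
    """Symmetric difference quotient: error is O(h**2) rather than O(h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: 2*x**2 + 3*x + 4
print(central_deriv(f, 2))   # very close to the exact f'(2) = 11
```

For a quadratic the central difference is exact up to floating-point rounding, which makes a nice contrast with the one-sided versions above.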
> I could have understood a lot about that in high school, if somebody had
> taught it.
You make a lot of on-target observations in this post, and ask some good questions. Given that different learners have different strengths and weaknesses, the best mix of approaches and methods is maybe not something we can optimize in a blanket way. But a computer language and command-line capability add a lot more tools and ways to get at the substance of math. It's a whole new bag of tricks (relatively new). I think teachers will eventually have a lot of fun with this stuff -- but right now calculators are all the rage, hogging the limelight.

Given all the stuff you can do with computers, kids want them (MP3s, web etc.). But calculators are pretty specialized, and relatively few kids would go out and buy them were they not required for various math courses. So you spend a lot of time learning to use a tool that you may not use beyond the math course itself (if you're going to be using math professionally, you'll likely switch to a computer). With a computer language, though, you've got something to grow with. It'll engage the operating system, and serve as a way to control various other apps via APIs and so forth. I think this is a better and more versatile platform on which to build a lot of mathematical concepts, vs. being so reliant on calculators.

Kirby