Where mathematics comes from
I recently came across a book by Lakoff & Nunez called "Where Mathematics Comes From: How the Embodied Mind Brings Mathematics Into Being". After reading the introduction, I jumped to the four appendices, where the equation e^(i*pi) = -1 is explained. I never thought I'd be able to understand that equation, but reading their explanation made it tantalizingly approachable. It may well be that much of why people were turned off to math in school can be explained by metaphoric deficiencies in the teaching of it. And they show how to rectify the situation with their (metaphor-based) take on how that particular equation might be better explained. Anyway, it'll probably take a few re-readings before I really get it, but at least now there's a chance...

John Miller
Technology Services
School of Education
University of Michigan

On Sunday, January 26, 2003, at 12:00 PM, ahimsa <ahimsa@onetel.net.uk> wrote:
That is probably the source of my own aversion to math: I hate to blame my teachers, and no doubt I was also a lazy sod :) - but whatever it was, I now have to work really hard to overcome that resistance, despite my appreciation for the sheer beauty of geometry and algebra at the 'safer' and more 'abstract' level of a philosophical appreciation.
At 02:59 PM 1/26/2003 -0500, John Miller wrote:
I recently came across a book by Lakoff & Nunez called "Where Mathematics Comes From: How the Embodied Mind Brings Mathematics Into Being". After reading the introduction, I jumped to the four appendices, where the equation
e^(i*pi) = -1
is explained. I never thought I'd be able to understand that equation, but reading their explanation made it tantalizingly approachable. It may well be that much of why people were turned off to math in school can be explained by metaphoric deficiencies in the teaching of it. And they show how to rectify the situation with their (metaphor-based) take on how that particular equation might be better explained. Anyway, it'll probably take a few re-readings before I really get it, but at least now, there's a chance...
John Miller
Technology Services
School of Education
University of Michigan
What kind of explanation do they give, I wonder? The approach I've usually seen, and which I think sort of mirrors Euler's process, involves expanding e^x as a polynomial and then showing how the polynomial expansions of sine and cosine sort of merge to give the same thing -- or at least, if we pass ix as our value (a complex number), the signs change such that e^(ix) = cos(x) + i*sin(x). Here's an example of this approach:

http://www.nrich.maths.org.uk/mathsf/journalf/aams/q57.html

But then you need to go back and figure out why these are valid polynomial expansions for e^x, cos and sin. We could avoid the proofs for a bit and just play around with the first expansion, comparing outputs on both sides of the equals sign. I.e. in Python we have math.exp(x) for raising e to the xth power. The expansion has coefficients (1/0! 1/1! 1/2! 1/3! 1/4! ...) for the powers x^0, x^1, x^2, x^3... respectively.

One thing we can do in Python is produce successive factorials using a generator, because once we have 4!, why go back and start the multiplications all over just to get 5!. Just multiply 4! (which we already have) by 5, fer gosh sakes. Now I'm using the PyCrust shell:
from __future__ import generators, division
from operator import add
from math import exp, pi
def fact():
    """ Sequence of factorials: 0! 1! 2! 3!... """
    n = 0; result = 1
    while 1:
        if n == 0:
            yield 1
        n += 1
        result *= n
        yield result
f = fact()
[f.next() for i in range(10)]   # sequence of factorials
[1, 1, 2, 6, 24, 120, 720, 5040, 40320, 362880]
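An aside for readers running a current Python: generator .next() is now the built-in next(), `while 1` is usually spelled `while True`, and true division is the default (no __future__ import needed), so the same factorial experiment in 3.x reads:

```python
def fact():
    """ Sequence of factorials: 0! 1! 2! 3!... """
    n, result = 0, 1
    yield 1                  # 0! comes first
    while True:
        n += 1
        result *= n
        yield result

f = fact()
print([next(f) for i in range(10)])
# [1, 1, 2, 6, 24, 120, 720, 5040, 40320, 362880]
```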
Now we can build the polynomial expansion:
def polye(n,x):
    """ Polynomial expansion of e^x to n terms, using fact() generator """
    f = fact()
    return reduce(add, [x**i/f.next() for i in range(n)])
and compare exp(x) with polye(n,x), where n is the number of terms we want in our expansion:
exp(3)          # e^x ; x = 3
20.085536923187668
polye(25,3)
20.0855369231876

polye(50,10)    # e^x ; x = 10
22026.465794806711
exp(10)
22026.465794806718
I notice I needed to push the number of terms pretty high (like to 50) to get close agreement as x increases. Lots to play with here. Perhaps we need an exponentially increasing number of those polynomial terms, to stay on target, as x increases linearly.

Now comes the big test (vis-a-vis the topic at hand). Let's make x = i*pi where i = sqrt(-1). Right away, we see that math.exp() fails, because it won't take a complex argument. But polye() passes the test with flying colors.
x = complex(0, pi)
exp(x)
Traceback (most recent call last):
  File "<input>", line 1, in ?
TypeError: can't convert complex to float; use e.g. abs(z)
polye(50,x)
(-1.0000000000000002+3.4586691443277667e-016j)
Note that, at 50 terms in our expansion, the imaginary component is vanishingly small, and the real part is about as close to -1 as you can get in floating point arithmetic.

Kirby
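One more modern-Python aside: the standard library's cmath.exp *does* accept complex arguments, so the series can be checked directly against a library value. Here is a compact 3.x sketch of the same computation, building each term from the previous one:

```python
import cmath
from math import pi

def polye(n, x):
    """Sum n terms of the Taylor series for e**x; works for complex x
    because it only uses multiplication and addition."""
    total, term = 0, 1
    for k in range(n):
        total += term
        term = term * x / (k + 1)   # x**k/k!  ->  x**(k+1)/(k+1)!
    return total

print(polye(50, complex(0, pi)))    # about -1, plus a vanishing imaginary part
print(cmath.exp(complex(0, pi)))    # the library answer, also about -1
```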
At 01:28 PM 1/26/2003 -0800, I wrote:
One thing we can do in Python is produce successive factorials using a generator, because once we have 4!, why go back and start the multiplications all over just to get 5!. Just multiply 4! (which we already have) by 5 fer gosh sakes.
Of course this argument applies equally well to the powers: we're going x^0, x^1, x^2 ... x^n, i.e. we should just hold on to the last power of x and multiply by x once again. No reason to start over. This suggests a more specialized generator, one that computes the whole term in e^x's expansion, not just the factorial part. Also, we can cause the generator to stop iterating after t terms, by passing this stop value as a parameter (assuming the same import statements as before):
def eterms(x,t):
    n = 0; result = 1
    if n == 0:
        yield 1              # moved this line out of the loop
    while 1:
        n += 1
        if n > t:
            return           # stop iterating
        else:
            result *= x/n
            yield result
def polye2(x,t):
    return reduce(add, [j for j in eterms(x,t)])
polye2(3.45, 30)     # note I reversed parameter order (I like this better)
31.500392308747941
exp(3.45)
31.500392308747937

polye2(1j*pi, 100)   # again, *really really close* to -1
(-1+3.3357107625719758e-016j)
Kirby
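A follow-up on the earlier remark that we might need an exponentially increasing number of terms as x grows: counting terms until they stop mattering at a fixed relative tolerance suggests the growth is actually closer to linear in x (on the order of e*x terms). A small modern-Python sketch, where terms_needed is my own name for the helper:

```python
def terms_needed(x, tol=1e-12):
    """Count Taylor terms of e**x until a term is negligible
    relative to the running total."""
    total, term, k = 0.0, 1.0, 0
    while abs(term) > tol * max(abs(total), 1.0):
        total += term
        k += 1
        term *= x / k
    return k

for x in (1, 5, 10, 20, 40):
    print(x, terms_needed(x))   # grows roughly linearly with x
```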
At 01:28 PM 1/26/2003 -0800, Kirby Urner wrote:
The expansion is (1/0! 1/1! 1/2! 1/3! 1/4!...) where these are the coefficients of a polynomial, with x^0, x^1, x^2, x^3... respectively.
Just for comparison and contrast, in the J language, we get Taylor expansions of many functions, plus polynomials themselves, implemented as primitive "verbs" (the lingo is resolutely linguistic-grammatical -- nouns, verbs, adverbs, conjunctions, and even gerunds get defined). (NB. means comment = "nota bene")

exp =: ^             NB. ^ in front of a noun is like Python's exp(noun)

I just renamed a verb to look more like Python. I'm going to ask for the first 10 coefficients of the polynomial expansion of e^x, in rational format (1r2 means 1/2, or one half):

taylor   =: t.       NB. Taylor coefficients
rational =: x:       NB. make of 'rational type' (coming in Python)
range    =: i.       NB. i. 10 returns 0 1 2 3...9, like range(10)

exp taylor (range rational 10)    NB. <-- user, next line is computer
1 1 1r2 1r6 1r24 1r120 1r720 1r5040 1r40320 1r362880

(i.e. 1/0! 1/1! 1/2! 1/3! 1/4! etc.)

coeffs =: exp taylor (range rational 30)   NB. same, but more terms
coeffs p. 3     NB. use polynomial operator to evaluate p(x), x=3
20.0855
^3              NB. note, same answer
20.0855

And of course:

coeffs p. 0j1p1   NB. 0j1p1 means i*pi (like Python, J uses j)
_1

An even more exact answer than Python's (thanks to internal rounding I guess).

Kirby
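Python can mirror J's exact rational coefficients with the standard library's fractions module. A sketch (exp_coeffs is my own name, not a builtin), building each factorial from the previous one as before:

```python
from fractions import Fraction

def exp_coeffs(n):
    """First n Taylor coefficients of e**x, i.e. 1/k!, as exact rationals."""
    coeffs, fact = [], 1
    for k in range(n):
        if k > 0:
            fact *= k          # k! from (k-1)!
        coeffs.append(Fraction(1, fact))
    return coeffs

print([str(c) for c in exp_coeffs(10)])
# ['1', '1', '1/2', '1/6', '1/24', '1/120', '1/720', '1/5040', '1/40320', '1/362880']
```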
At 01:28 PM 1/26/2003 -0800, Kirby Urner wrote:
We could avoid the proofs for a bit and just play around with the first expansion, comparing outputs on both sides of the equals sign. I.e. in Python we have math.exp(x) for raising e to the xth power.
Clearly, if we're building a polynomial expansion equal to the cosine of x, its graph will have to be the same as cosine's. (x, cos(x)) crosses the x-axis at all these roots (e.g. pi/2, 3pi/2, 5pi/2 etc.). So the polynomial needs another root, and therefore one higher degree, for every new x-axis crossing.

The polynomial expansion for cosine has coefficients [1/0!, 0, -1/2!, 0, 1/4!, 0, -1/6! ...] for x^0, x^1, x^2 etc. -- except the terms with odd exponents have zero coefficients and so go away. Using the generator approach again:

def costerms(x,t):
    n, result = 0, 1
    maxterm = 2*t
    if n == 0:
        yield 1
    while 1:
        n += 2
        if n >= maxterm:
            return       # bug fixed (>=) -- one too many terms before
        result *= -x*x/(n*(n-1))
        yield result

def polycos(x,t):
    return reduce(add, [i for i in costerms(x,t)])
polycos(3,20)
-0.98999249660044553
math.cos(3)
-0.98999249660044542
So now I'd like to see graphs of these ever higher degree polynomials that criss-cross the x-axis, just like cosine does. In J, I'd just go something like:

load 'plot';'trig'
polycos =: (cos t. i.20) & p.   NB. coefficients out 20 terms, including zeros
domain  =: (i: 80) % 10         NB. from -8 to 8 in increments of 1/10
plot domain; polycos domain     NB. do the graph

and I'd be looking at my wavy line on a graph in a window.

One approach I've used is to use a module I wrote for outputting to Povray, the ray tracing engine. I just output the (x,y) points as little spheres and link them with cylinders. This gives a graph with a pleasing 3d look -- can be colorful too. However, since I've recently added wxPython to Python 2.2 (and am using the PyCrust that comes with it), tonight my experiment will be to use a package I've not tried before: SciPy (VPython also has graphing capabilities BTW). I've just grabbed the Windows binary from here:

http://www.scipy.org/site_content/download_list

It seems to install a *lot* of stuff, but then I see I still need Numeric, which the VPython uninstaller recently stole off my machine (as Arthur warned me it would). So I'll go to http://www.pfdubois.com/numpy/ and grab another copy... (Numpy is pretty small, only like 446K -- downloading *very* slowly though, looks like I need to try again... ah, *much* faster from Chapel Hill, NC).

OK, now scipy imports without errors. Time to check the docs and see how to do a quick plot...

http://www.scipy.org/site_content/tutorials/plot_tutorial

Hmmmm... Strangely, when I use Vim to edit code, then reload my module in PyCrust, I sometimes get error messages referring to old code that's gone from the module. I wonder if there's a namespace conflict with scipy somehow. This 'list' not callable error makes no sense anyway -- cut and paste the same code to the prompt and it runs. I notice it's important not to *close* the plt.plot window, if you want to use it again in the same session.
Just change domain and range and replot, without closing the plot window in between. Here's my result:

http://www.inetarena.com/~pdx4d/ocn/python/cospoly.png

Notice this polynomial crosses the x-axis 10 times! But it's a degree 40 equation, no? No. I've got 20 non-zero terms, but that starts from x^0, so I only get up to x^38: a 38th degree equation, with 38 roots. A quick check in J shows it has these 10 real roots (where it crosses x), plus 14 conjugate pairs of complex roots (J has a built-in for finding these roots). The real roots are:

_14.1262 14.1262 _10.9956 10.9956 7.85398 _7.85398 4.71239 _4.71239 1.5708 _1.5708

which corresponds to plus/minus pi/2, 3pi/2, 5pi/2, 7pi/2 and 9pi/2 -- except the outer values don't quite match the exact multiples of pi/2, because this 38th degree poly is an *approximation* (gotta have infinite terms to be spot on).

The only Python functions I added, to create a domain and range for plt.plot(domain, range), were:

def mkdomain(x):   # kinda ugly, but it works
    x *= 10
    return map(lambda x: x*0.1, range(-x,0,1)) + \
           map(lambda x: x*0.1, range(x+1))

def mkrange(d):
    return [polycos(i,20) for i in d]   # the 20 fixes the number of terms

So this graph was a result of going:
from scipy import plt
import plotplay   # the generator, polycos, and the above two functions [1]

d = plotplay.mkdomain(16)   # from -16 to 16 in 0.1 increments
r = plotplay.mkrange(d)     # feed this domain to polycos
plt.plot(d,r)
and that's it!

Kirby

[1] recall that we need to import some stuff in plotplay:

from __future__ import generators, division
from operator import add
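The root-drift observation above can also be checked without J's polynomial machinery: plain bisection on the truncated cosine series finds where it crosses zero. A stdlib-only sketch in current Python (bisect_root is my own helper; the innermost root of the 20-term polynomial sits on pi/2 to high precision, while the ninth positive root has drifted off 9*pi/2 by about 0.01):

```python
from math import pi

def costerms(x, t):
    """Yield t terms of the Taylor series of cos(x)."""
    term = 1.0
    yield term
    for n in range(2, 2 * t, 2):
        term *= -x * x / (n * (n - 1))
        yield term

def polycos(x, t=20):
    """Degree-38 polynomial approximation of cos(x) when t=20."""
    return sum(costerms(x, t))

def bisect_root(f, lo, hi, steps=60):
    """Simple bisection, assuming f(lo) and f(hi) straddle a root."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

r1 = bisect_root(polycos, 1, 2)
print(r1, pi / 2)          # these agree to many decimal places
r9 = bisect_root(polycos, 13.5, 14.5)
print(r9, 9 * pi / 2)      # here the approximation has visibly drifted
```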