[SciPy-User] orthonormal polynomials (cont.)

josef.pktd at gmail.com
Thu Jun 16 11:29:36 EDT 2011


.......... <still working>

Example with the Emerson recurrence:

Laguerre-type polynomials for a gamma pdf weight function with shape=0.5
instead of shape=1 (shape=1 gives the standard Laguerre weight).

>>> mnc = moments_gamma(np.arange(21), 0.5, 1)
>>> myp = orthonormalpoly_moments(mnc, 10, scale=1)
>>> innerprod = dop.inner_cont(myp, 0., 100, stats.gamma(0.5).pdf)[0]
>>> np.max(np.abs(innerprod - np.eye(innerprod.shape[0])))
5.1360515840315202e-10
>>> for pi in myp: print pi.coeffs
...
[ 1.]
[ 1.41421356 -0.70710678]
[ 0.81649658 -2.44948974  0.61237244]
[ 0.2981424  -2.23606798  3.35410197 -0.55901699]
[ 0.07968191 -1.1155467   4.18330013 -4.18330013  0.52291252]
[ 0.01679842 -0.37796447  2.64575131 -6.61437828  4.96078371 -0.49607837]
[  2.92422976e-03  -9.64995819e-02   1.08562030e+00  -5.06622805e+00
   9.49917760e+00  -5.69950656e+00   4.74958880e-01]
[  4.33516662e-04  -1.97250081e-02   3.25462634e-01  -2.44096975e+00
   8.54339413e+00  -1.28150912e+01   6.40754560e+00  -4.57681829e-01]
[  5.59667604e-05  -3.35800562e-03   7.63946279e-02  -8.40340907e-01
   4.72691760e+00  -1.32353693e+01   1.65442116e+01  -7.09037640e+00
   4.43148525e-01]
[  6.39881348e-06  -4.89509231e-04   1.46852769e-02  -2.22726700e-01
   1.83749528e+00  -8.26872874e+00   1.92937004e+01  -2.06718219e+01
   7.75193320e+00  -4.30662955e-01]
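
(moments_gamma, orthonormalpoly_moments and dop.inner_cont above are from my
work-in-progress script. For reference, here is a minimal self-contained
sketch that builds the orthonormal polynomials directly from the raw moments,
via a Cholesky factorization of the Hankel moment matrix rather than the
Emerson recurrence itself. It should reproduce the coefficients above up to
numerical error, but a plain Cholesky of the moment matrix gets fragile at
higher degrees.)

import numpy as np
from scipy import special, stats, integrate

def gamma_raw_moments(k, shape):
    # raw moments E[X**k] of a gamma(shape, scale=1) distribution
    k = np.asarray(k, dtype=float)
    return special.gamma(shape + k) / special.gamma(shape)

def orthonormal_from_moments(mnc, deg):
    # Gram (Hankel) matrix of the monomials 1, x, ..., x**deg under the weight
    H = np.array([[mnc[i + j] for j in range(deg + 1)]
                  for i in range(deg + 1)])
    L = np.linalg.cholesky(H)
    C = np.linalg.inv(L)   # row i: coefficients of p_i, lowest power first
    return [np.poly1d(C[i, :i + 1][::-1]) for i in range(deg + 1)]

mnc = gamma_raw_moments(np.arange(11), 0.5)
polys = orthonormal_from_moments(mnc, 5)

# orthonormality check against the gamma(0.5) pdf by numerical integration
w = stats.gamma(0.5).pdf
inner = np.array([[integrate.quad(lambda x: pi(x) * pj(x) * w(x), 0, 100)[0]
                   for pj in polys] for pi in polys])
print(np.max(np.abs(inner - np.eye(len(polys)))))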

I haven't had time to figure out QR yet.

Josef

On Wed, Jun 15, 2011 at 11:13 AM, Christopher Jordan-Squire
<cjordan1 at uw.edu> wrote:
> If you google around, there are numerically stable versions of
> Gram-Schmidt/QR factorization and they're quite simple to implement. You
> just have to be slightly careful and not code it up the way it's taught in a
> first linear algebra course. However, the numerical instability can show up
> even for small numbers of basis vectors; the issue isn't the number of basis
> vectors but whether they're approximately linearly dependent.
>
> -Chris JS
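
For concreteness, here is a minimal sketch of what I take the "numerically
stable versions" to mean, i.e. modified Gram-Schmidt (my reading, not
necessarily what Chris has in mind): instead of projecting each new vector
against the original columns all at once, the remaining columns are updated
after every orthogonalization step.

import numpy as np

def modified_gram_schmidt(A):
    # orthonormalize the columns of A (assumed to have full column rank)
    Q = np.array(A, dtype=float)
    n = Q.shape[1]
    for j in range(n):
        Q[:, j] /= np.linalg.norm(Q[:, j])
        for k in range(j + 1, n):
            # remove the component of column k along the new unit vector j
            Q[:, k] -= np.dot(Q[:, j], Q[:, k]) * Q[:, j]
    return Q

# quick check on a nearly collinear Vandermonde matrix
x = np.linspace(0, 1, 50)
A = np.vander(x, 6)
Q = modified_gram_schmidt(A)
print(np.max(np.abs(np.dot(Q.T, Q) - np.eye(6))))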
>
> On Tue, Jun 14, 2011 at 7:18 PM, Charles R Harris
> <charlesr.harris at gmail.com> wrote:
>>
>>
>> On Tue, Jun 14, 2011 at 3:57 PM, <josef.pktd at gmail.com> wrote:
>>>
>>> On Tue, Jun 14, 2011 at 5:48 PM, Charles R Harris
>>> <charlesr.harris at gmail.com> wrote:
>>> >
>>> >
>>> > On Tue, Jun 14, 2011 at 3:15 PM, Charles R Harris
>>> > <charlesr.harris at gmail.com> wrote:
>>> >>
>>> >>
>>> >> On Tue, Jun 14, 2011 at 3:01 PM, <josef.pktd at gmail.com> wrote:
>>> >>>
>>> >>> On Tue, Jun 14, 2011 at 4:40 PM, Charles R Harris
>>> >>> <charlesr.harris at gmail.com> wrote:
>>> >>> >
>>> >>> >
>>> >>> > On Tue, Jun 14, 2011 at 12:10 PM, nicky van foreest
>>> >>> > <vanforeest at gmail.com>
>>> >>> > wrote:
>>> >>> >>
>>> >>> >> Hi,
>>> >>> >>
>>> >>> >> Without understanding the details... I recall from numerical
>>> >>> >> recipes
>>> >>> >> in C that Gram Schmidt is a very risky recipe. I don't know
>>> >>> >> whether
>>> >>> >> this advice also pertains to fitting polynomials, however,
>>> >>>
>>> >>> I read some warnings about the stability of Gram Schmidt, but the
>>> >>> idea
>>> >>> is that if I can choose the right weight function, then we should
>>> >>> need
>>> >>> only a few polynomials. So, I guess in that case numerical stability
>>> >>> wouldn't be very relevant.
>>> >>>
>>> >>> >>
>>> >>> >> Nicky
>>> >>> >>
>>> >>> >> On 14 June 2011 18:58,  <josef.pktd at gmail.com> wrote:
>>> >>> >> > (I'm continuing the story with orthogonal polynomial density
>>> >>> >> > estimation, and found a nice new paper
>>> >>> >> >
>>> >>> >> > http://www.informaworld.com/smpp/content~db=all~content=a933669464 )
>>> >>> >> >
>>> >>> >> > Last time I managed to get orthonormal polynomials out of scipy
>>> >>> >> > with
>>> >>> >> > weight 1, and it worked well for density estimation.
>>> >>> >> >
>>> >>> >> > Now, I would like to construct my own orthonormal polynomials
>>> >>> >> > for
>>> >>> >> > arbitrary weights. (The weights represent a base density around
>>> >>> >> > which
>>> >>> >> > we make the polynomial expansion).
>>> >>> >> >
>>> >>> >> > The reference refers to Gram-Schmidt or the Emerson recurrence.
>>> >>> >> >
>>> >>> >> > Is there a reasonably easy way to get the polynomial
>>> >>> >> > coefficients
>>> >>> >> > for
>>> >>> >> > this with numpy/scipy?
>>> >>> >> >
>>> >>> >
>>> >>> > What do you mean by 'polynomial'? If you want the values of a set
>>> >>> > of
>>> >>> > polynomials orthonormal on a given set of points, you want the 'q'
>>> >>> > in a
>>> >>> > qr
>>> >>> > factorization of a (row) weighted Vandermonde matrix.  However, I
>>> >>> > would
>>> >>> > suggest using a weighted chebvander instead for numerical
>>> >>> > stability.
>>> >>>
>>> >>> Following your suggestion last time to use QR, I had figured out how
>>> >>> to get the orthonormal basis for a given set of points.
>>> >>> Now, I would like to get the functional version (not just for a given
>>> >>> set of points), that is an orthonormal polynomial basis like Hermite,
>>> >>> Legendre, Laguerre and Jacobi, only for any kind of weight function,
>>> >>> where the weight function is chosen depending on the data.
>>> >>>
>>> >>
>>> >> But in what basis? The columns of the inverse of 'R' in QR will give
>>> >> you
>>> >> the orthonormal polynomials as series in whatever basis you used for
>>> >> the
>>> >> columns of the pseudo-Vandermonde matrix.
>>>
>>> With basis you mean here, for example, the power series (x**i,
>>> i=0,1,..)? That's what they use, but there is also a reference to
>>> using Fourier polynomials, which I haven't looked at for this case.
>>>
>>> >>
>>> >
>>> > Example.
>>> >
>>> > In [1]: from numpy.polynomial.polynomial import polyvander
>>> >
>>> > In [2]: v = polyvander(linspace(-1, 1, 1000), 3)
>>> >
>>> > In [3]: around(inv(qr(v, mode='r'))*sqrt(1000./2), 5)
>>> > Out[3]:
>>> > array([[-0.70711,  0.     ,  0.79057, -0.     ],
>>> >        [ 0.     ,  1.22352, -0.     , -2.80345],
>>> >        [ 0.     ,  0.     , -2.36697,  0.     ],
>>> >        [ 0.     ,  0.     ,  0.     ,  4.66309]])
>>> >
>>> > The columns are approx. the coefficients of the normalized Legendre
>>> > functions as a power series up to a sign.
>>>
>>> Looks interesting. It will take me a while to figure out what this
>>> does, but I think I get the idea.
>>>
>>> Thanks,
>>>
>>
>> The normalization factor comes from integrating the columns of q, i.e.,
>> \int p^2 dx ~= dt * \sum_i (q_{ij})^2 = 2/1000. I really should have weighted
>> the first and last rows of the Vandermonde matrix by 1/sqrt(2) so that the
>> integration was trapezoidal, but you get the idea.
>>
>> Chuck
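
For reference, a self-contained sketch of Chuck's example with the
trapezoidal end-point weighting he mentions (the 1/sqrt(dt) rescaling,
i.e. sqrt((n-1)/2), converts the discrete normalization of the q columns
into the continuous one on [-1, 1]; the signs of the columns are arbitrary):

import numpy as np
from numpy.polynomial.polynomial import polyvander

n = 1000
x = np.linspace(-1, 1, n)
v = polyvander(x, 3)

# weight the first and last rows by 1/sqrt(2) so that q'q is the
# trapezoidal-rule approximation of \int p_i p_j dx (up to a factor dt)
w = np.ones(n)
w[0] = w[-1] = 1.0 / np.sqrt(2)
r = np.linalg.qr(w[:, None] * v, mode='r')

dt = 2.0 / (n - 1)
coeffs = np.linalg.inv(r) / np.sqrt(dt)
print(np.around(coeffs, 5))
# columns approximate the power-series coefficients of the normalized
# Legendre polynomials, up to sign: 1/sqrt(2), sqrt(3/2)*x, ...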


