[Numpy-discussion] help to speed up the python code

Yakov Keselman yakov.keselman at gmail.com
Fri Oct 31 20:13:24 EDT 2008


My understanding of the root of the problem is that you end up doing
many evaluations of sinc. If that is so, one suggestion is to use
pre-computed filters. For example, if you are resampling from 9 points
to 10, you are essentially going from a function defined on the points
0, 1, 2, ..., 8 to a function defined on the points 0, 0.9, 1.8, 2.7,
..., 8.1. Take the point 2.7: it falls between 2 and 3, and the value
of the new function there can be approximated by the weighted average
f(2.7) = 0.7*f(3) + 0.3*f(2), which is equivalent to convolving with
the filter [0.3, 0.7]. The only complication is that the filter varies
from point to point. However, the fractional offsets repeat, so you can
pre-compute one filter for each type of new point (there are 10 point
types in this specific case). In practice you would also use longer
filters, so that the interpolation is more precise. Hope this helps.
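
To make this concrete, here is a minimal NumPy sketch of the
pre-computed-filter (polyphase) idea, applied to the 9-to-10 example
above. It is an illustration, not code from the thread: the function
name, the up/down rate parameters, and the half_width tap count are my
own, and the truncated sinc filters are left unwindowed for brevity.

    import numpy as np

    def polyphase_resample(a, up=10, down=9, half_width=8):
        # Resample a sequence sampled at spacing Ts onto the grid
        # t_n = n * down/up (i.e. Ts' = (down/up)*Ts) using one
        # pre-computed sinc filter per fractional offset.
        a = np.asarray(a, dtype=float)
        n_out = (len(a) - 1) * up // down + 1
        n = np.arange(n_out)
        m = (n * down) // up      # integer part of t_n
        p = (n * down) % up       # phase: fractional part of t_n is p/up

        # One truncated-sinc filter per phase: h[p, j] = sinc(p/up - i[j]).
        # A tapering window (e.g. Kaiser) would reduce ripple; omitted here.
        i = np.arange(-half_width, half_width + 1)
        filters = np.sinc(np.arange(up)[:, None] / up - i)

        # Zero-pad so tap windows never run off the ends of the input.
        padded = np.pad(a, half_width)
        out = np.empty(n_out)
        for ph in range(up):      # `up` iterations instead of one per sample
            sel = np.nonzero(p == ph)[0]
            idx = m[sel, None] + i + half_width   # (len(sel), taps) gather
            out[sel] = padded[idx] @ filters[ph]
        return out

Only `up` distinct filters are ever built, and the inner work is a
batched dot product, so the double loop over output samples and filter
taps disappears. (In modern SciPy, scipy.signal.resample_poly
implements essentially this scheme with properly windowed filters.)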

On 10/31/08, Charles R Harris <charlesr.harris at gmail.com> wrote:
> On Thu, Oct 30, 2008 at 11:44 PM, frank wang <f.yw at hotmail.com> wrote:
>
>>  Hi, Bob,
>>
>> The problem is that I want to resample my data at another sampling
>> rate, and the two rates are very close. I use the formula:
>>
>> s(t) = sum_k a_k * sinc(t - k*Ts).
>>
>> With the new sampling rate Ts', this becomes
>> s(n*Ts') = sum_k a_k * sinc(n*Ts' - k*Ts), where the sum index k runs
>> over (n-P, n+P), centered at n, and n starts from zero. The code uses
>> two for loops, and since s(n*Ts') is very long, it takes quite a long
>> time to run.
>>
>>
>
> I think you can use some form of the chirp-z transform (Bluestein's
> algorithm) for this. Just think of the data as the Fourier transform
> of its spectrum.
>
> Chuck
>
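
To illustrate Chuck's suggestion: if you treat the samples as DFT
coefficients of their spectrum, the band-limited (periodic-sinc)
interpolant can be evaluated on a fractionally spaced grid with one
FFT plus one chirp z-transform, independent of any filter half-width
P. Below is a minimal sketch, assuming a modern SciPy for
scipy.signal.czt (SciPy >= 1.8); the function name and the
ratio = Ts'/Ts convention are my own.

    import numpy as np
    from scipy.signal import czt   # SciPy >= 1.8

    def czt_resample(x, ratio, n_out=None):
        # Evaluate the periodic band-limited interpolant of x at
        # t = k*ratio (t in units of the input spacing, ratio = Ts'/Ts).
        x = np.asarray(x)
        n = len(x)
        if n_out is None:
            n_out = int((n - 1) / ratio) + 1
        # Spectrum reordered so bin i corresponds to frequency i - n//2.
        Xc = np.fft.fftshift(np.fft.fft(x))
        # s(t) = (1/n) * sum_i Xc[i] * exp(2j*pi*(i - n//2)*t/n) at
        # t = k*ratio is a chirp z-transform over i:
        #   czt(Xc, m, w)[k] = sum_i Xc[i] * w**(i*k).
        w = np.exp(2j * np.pi * ratio / n)
        y = czt(Xc, m=n_out, w=w)
        k = np.arange(n_out)
        # Undo the n//2 frequency shift introduced by fftshift.
        y *= np.exp(-2j * np.pi * (n // 2) * ratio * k / n) / n
        # For real input, taking the real part symmetrizes the Nyquist bin.
        return y.real if np.isrealobj(x) else y

As a sanity check, czt_resample(x, 1.0) reproduces x to round-off.
Note that this interpolant assumes the data are periodic over the
sampled block, so samples near the edges will differ from those of the
truncated windowed-sinc sum.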


-- 
Not to laugh, not to lament, not to curse, but to understand. -- Spinoza


