[Numpy-discussion] Interpolation question
josef.pktd at gmail.com
Sat Mar 27 21:14:08 EDT 2010
On Sat, Mar 27, 2010 at 8:24 PM, Andrea Gavana <andrea.gavana at gmail.com> wrote:
> Hi All,
> I have an interpolation problem and I am having some difficulties
> in tackling it. I hope I can explain myself clearly enough.
> Basically, I have a whole bunch of 3D fluid flow simulations (close to
> 1000), and they are a result of different combinations of parameters.
> I was planning to use the Radial Basis Functions in scipy, but for the
> moment let's assume, to simplify things, that I am dealing only with
> one parameter (x). In 1000 simulations, this parameter x has 1000
> values, obviously. The problem is, the outcome of every single
> simulation is a vector of oil production over time (let's say 40
> values per simulation, one per year), and I would like to be able to
> interpolate my x parameter (1000 values) against all the simulations
> (1000x40) and get an approximating function that, given another x
> parameter (of size 1x1) will give me back an interpolated production
> profile (of size 1x40).
> Something along these lines:
> import numpy as np
> from scipy.interpolate import Rbf
> # x.shape = (1000, 1)
> # y.shape = (1000, 40)
> rbf = Rbf(x, y)
> # New result with xi.shape = (1, 1) --> fi.shape = (1, 40)
> fi = rbf(xi)
> Does anyone have a suggestion on how I could implement this? Sorry if
> it sounds confused... Please feel free to correct any wrong
> assumptions I have made, or to propose other approaches if you think
> RBFs are not suitable for this kind of problem.
If I understand correctly, you have a function (oil production y)
of time, observed at 40 time periods (t), and this function is
parameterized by a single parameter x:
y = f(t, x)
t = np.arange(40)
len(x) = 1000
Blowing up t and x, you would have a grid of 1000*40 = 40000
observations in t, x, and y.
rbf = Rbf(t, x, y) might work in principle; however, I think the
arrays are too large: the full distance matrix would be 40000*40000.
I don't know whether bivariate splines would be able to handle this size.
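To make the "blown up" layout concrete, here is a minimal sketch with
synthetic stand-in data (the sine-times-x profiles and the sizes are my
assumptions, kept small so the distance matrix stays manageable):

```python
import numpy as np
from scipy.interpolate import Rbf

# Synthetic stand-ins for the real data (assumption: x is the single
# simulation parameter, y the (n_sim, 40) production profiles).
rng = np.random.RandomState(0)
n_sim, n_t = 50, 40          # 1000 sims would give a 40000x40000 distance matrix
x = rng.uniform(0.0, 1.0, n_sim)
t = np.arange(n_t)
y = np.sin(t[None, :] / 10.0) * x[:, None]   # shape (n_sim, n_t)

# "Blow up" t and x into a flat list of n_sim*n_t observations
T, X = np.meshgrid(t, x)                     # each has shape (n_sim, n_t)
rbf = Rbf(T.ravel(), X.ravel(), y.ravel())

# Evaluate the whole profile for one new parameter value
xi = 0.5
fi = rbf(t, np.full(n_t, xi))                # shape (40,)
print(fi.shape)
```

Note how evaluating at a single xi still requires passing the full t
vector, so one call returns the whole 40-value profile.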
How much noise do you have in the simulations?
If there is not much noise, and you are mainly interested in
interpolating in x, then I would interpolate pointwise for each of
the 40 values of t: y = f(x) for each t, each with 1000 observations.
This might still be too much for RBF, but some of the other univariate
interpolators should be able to handle it. I would try this first,
because it is the easiest to do.
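A sketch of this pointwise approach (again with synthetic stand-in
data): interp1d accepts a 2-D y and an axis argument, so one call
effectively builds all 40 per-timestep interpolators at once:

```python
import numpy as np
from scipy.interpolate import interp1d

# Synthetic stand-ins (assumption): 1000 sorted values of the
# parameter x and a (1000, 40) array of production profiles.
rng = np.random.RandomState(0)
x = np.sort(rng.uniform(0.0, 1.0, 1000))
t = np.arange(40)
y = np.sin(t[None, :] / 10.0) * x[:, None]   # shape (1000, 40)

# One univariate interpolator per time step; axis=0 interpolates
# along the simulation (x) dimension for all 40 columns together.
f = interp1d(x, y, axis=0, kind='cubic')

xi = 0.5
profile = f(xi)          # interpolated production profile, shape (40,)
print(profile.shape)
```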
This could be extended to interpolating in a second stage for t if necessary.
If there is a lot of noise, some interpolation that is local in t (and
maybe in x) might be able to handle it and combine information across t.
An alternative would be to model y as a function of t
semi-parametrically, e.g. with polynomials, and interpolate the
polynomial coefficients across x. Or try some of the interpolators
that are designed for images, which might cope better with arrays of
this size, but I don't know anything about those.
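The polynomial version could look like this (synthetic data and the
degree are my assumptions): fit one polynomial in t per simulation in
a single vectorized polyfit call, then interpolate each coefficient
across x:

```python
import numpy as np
from scipy.interpolate import interp1d

# Synthetic stand-ins (assumption), as before.
rng = np.random.RandomState(0)
x = np.sort(rng.uniform(0.0, 1.0, 1000))
t = np.arange(40.0)
y = np.sin(t[None, :] / 10.0) * x[:, None]   # shape (1000, 40)

# Stage 1: fit a degree-5 polynomial in t to each profile.
# polyfit with a 2-D right-hand side fits all 1000 profiles at once;
# coeffs has shape (6, 1000), one column per simulation.
deg = 5
coeffs = np.polyfit(t, y.T, deg)

# Stage 2: interpolate each of the 6 coefficients across x.
coef_interp = interp1d(x, coeffs, axis=1, kind='cubic')

# Evaluate the fitted polynomial at a new parameter value.
xi = 0.5
profile = np.polyval(coef_interp(xi), t)     # shape (40,)
print(profile.shape)
```

This reduces the interpolation problem from 40 columns to 6
coefficients, at the cost of whatever bias the polynomial fit in t
introduces.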
Just some thoughts,
> Thank you in advance for your suggestions.