Hi Friedrich & All, On 29 March 2010 23:44, Friedrich Romstedt wrote:

2010/3/29 Andrea Gavana <andrea.gavana@gmail.com>:

If anyone is interested in a follow up, I have tried a time-based interpolation of my oil profile (and gas and gas injection profiles) using those 40 interpolators (and even more, up to 400, one for every monthly time step of the fluid flow simulation).

I wasn't expecting too much out of it, but when the interpolated profiles came out (for different combinations of input parameters) I felt like I was on the wrong side of the Lala River in the valley of Areyoukidding. The results are striking. I get an impressive agreement between this interpolated proxy model and the real simulations, whether I use existing combinations of parameters or new ones (i.e., I create the interpolation and then run the real fluid flow simulation, comparing the outcomes).

I'm reasoning about the implications of this observation for our understanding of your interpolation. As Christopher pointed out, it's very important to know how many gas injection wells are to be weighted the same as one year.

Since you get nice results using 40 Rbfs, one for each time instant, this procedure means that the values for one time instant are not influenced by adjacent-year data. I.e., you would probably get the same result from a single Rbf using a norm that blows up the time coordinate enormously. To make it clear in code: when time is your first coordinate and you have three other coordinates, the *norm* would be:

def norm(x1, x2):
    # One weight per coordinate: time first, then the three other
    # parameters; reshaped so it scales along Rbf's coordinate axis.
    weights = numpy.array([1e3, 1.0, 1.0, 1.0]).reshape(-1, 1, 1)
    return numpy.sqrt((((x1 - x2) * weights) ** 2).sum(axis=0))

In this case, the epsilon should be fixed, to avoid the changed distances influencing the epsilon determination inside Rbf, which would spoil the whole thing.
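A minimal sketch of the per-time-instant approach in scipy (the toy data and names here are mine, not Andrea's actual simulation database): one Rbf over the three input parameters per time step, with epsilon fixed so every interpolator uses the same shape parameter.

```python
import numpy
from scipy.interpolate import Rbf

rng = numpy.random.default_rng(0)

# Hypothetical stand-in for the simulation database: 30 parameter
# combinations (3 inputs) and an oil profile sampled at 40 yearly steps.
params = rng.uniform(0.0, 1.0, size=(30, 3))
profiles = rng.uniform(50.0, 150.0, size=(30, 40))

# One Rbf per time instant; epsilon is fixed so the changing value
# ranges do not alter the shape parameter from one time step to the next.
interpolators = [
    Rbf(params[:, 0], params[:, 1], params[:, 2], profiles[:, t], epsilon=1.0)
    for t in range(profiles.shape[1])
]

def predict_profile(p):
    # Evaluate every per-time-step interpolator at one parameter combination.
    return numpy.array([float(f(p[0], p[1], p[2])) for f in interpolators])

# At a combination that is in the database, the proxy reproduces the
# simulated profile (Rbf interpolates exactly at its nodes).
recovered = predict_profile(params[0])
```

Evaluating a new combination is just `predict_profile(new_params)`; whether the result is trustworthy between runs is exactly what the holdout test below in the thread is meant to check.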

I have an idea how to tune your model: take, say, half or two thirds of your simulation data as the interpolation database, and try to reproduce the remaining part. I have some ideas for how to use this for tuning in practice.
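The tuning idea above can be sketched as a simple holdout test (again with made-up data, for a single time step): fit on half of the runs, then score how well the held-out runs are reproduced.

```python
import numpy
from scipy.interpolate import Rbf

rng = numpy.random.default_rng(1)

# Hypothetical database: 40 runs, 3 input parameters, one response
# (e.g. cumulative oil at one time step), mildly nonlinear.
params = rng.uniform(0.0, 1.0, size=(40, 3))
response = params.sum(axis=1) + 0.1 * numpy.sin(10.0 * params[:, 0])

# Hold out half of the runs: fit on the first half, score on the second.
fit_p, held_p = params[:20], params[20:]
fit_y, held_y = response[:20], response[20:]

rbf = Rbf(fit_p[:, 0], fit_p[:, 1], fit_p[:, 2], fit_y, epsilon=1.0)
pred = rbf(held_p[:, 0], held_p[:, 1], held_p[:, 2])

# Mean absolute error on the held-out runs: if this is small compared to
# the spread of the response, the proxy can be trusted between runs.
mae = numpy.mean(numpy.abs(pred - held_y))
```

Repeating this while varying epsilon (or the norm weights) gives a tuning curve without running a single new simulation.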

This is a very good idea indeed: I am actually running out of test cases (it takes a while to run a simulation, and I need to do it every time I try a new combination of parameters to check if the interpolation is good enough or rubbish). I'll give it a go tomorrow at work and I'll report back (even if I get very bad results :-D ).

As an aside, I got my reservoir-engineer colleagues playfully complaining that it's time for them to pack their stuff and go home, as this interpolator is doing all the work for us; obviously, this misses the point that it took 4 years to build such a comprehensive set of simulations, which now allows us to somewhat "predict" a possible production profile in advance.

:-) :-)

I wrapped everything up in a wxPython GUI with some Matplotlib graphs, and everyone seems happy. Not only your colleagues! The only small complaint I have is that I wasn't able to come up with a vector implementation of RBFs, so it can be pretty slow to build and interpolate 400 RBFs for each property (there are 3 of them).
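One way to get a vector implementation (a sketch, under the assumption that every time step uses the same parameter combinations as nodes, which is the case here): since all 400 Rbfs then share the same interpolation sites, the multiquadric kernel matrix can be built and solved once against all time steps as multiple right-hand sides, instead of solving 400 separate systems.

```python
import numpy

def fit_all_timesteps(nodes, values, eps=1.0):
    """Solve one multiquadric RBF system with one right-hand side per
    time step. nodes: (N, d) parameter combinations; values: (N, T)."""
    d = numpy.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
    A = numpy.sqrt((d / eps) ** 2 + 1.0)   # multiquadric kernel matrix
    return numpy.linalg.solve(A, values)   # weights, shape (N, T)

def evaluate(nodes, weights, query, eps=1.0):
    """Evaluate all T interpolants at one query point in one shot."""
    d = numpy.linalg.norm(nodes - query, axis=-1)
    phi = numpy.sqrt((d / eps) ** 2 + 1.0)
    return phi @ weights                   # full profile, shape (T,)

rng = numpy.random.default_rng(2)
nodes = rng.uniform(0.0, 1.0, size=(30, 3))        # 30 runs, 3 parameters
values = rng.uniform(50.0, 150.0, size=(30, 400))  # 400 monthly steps

w = fit_all_timesteps(nodes, values)
profile = evaluate(nodes, w, nodes[0])  # reproduces run 0's profile
```

The factorization cost is paid once per property instead of once per time step, which is where the 400x loop was going.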

Didn't you speak of 40 Rbfs for the time alone?

Yes, sorry about the confusion: depending on which "time-step" I choose to compare the interpolation with the real simulation, I can have 40 RBFs (one every year of simulation) or more than 400 (one every month of simulation, though not all the monthly data are available for all the simulations I have).

Something completely different: Are you going to do more simulations?

110% surely undeniably yes. The little interpolation tool I have is just a proof-of-concept and a little helper for us to get an initial grasp of how the production profiles might look before actually running the real simulation. Something like a toy to play with (if you can call "play" actually working on a reservoir simulation...). There is no possible substitute for the reservoir simulator itself.

Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://xoomer.alice.it/infinity77/

==> Never *EVER* use RemovalGroup for your house removal. You'll regret it forever.
http://thedoomedcity.blogspot.com/2010/03/removal-group-nightmare.html <==