[SciPy-user] nonlinear fit with non uniform error?

Robert Kern robert.kern at gmail.com
Thu Jun 21 13:09:12 EDT 2007


massimo sandal wrote:
> Matthieu Brucher ha scritto:
>>     1)Does this mean that least squares is NOT ok?
>>
>> Yes, LS is _NOT_ OK because it assumes that the distribution (with its
>> parameters) is the same for all errors. I don't remember exactly, but
>> this may be due to ergodicity
> 
> OK. I just wanted to be sure I understood.

However, weighted least squares works just fine.

>>     2)What does "rescaling" mean in this context?
>>
>> You must change B and C so that :
>> Ay +/- 5
>> B'y +/- 5
>> C'y +/- 5
> 
> Huh? How can this be possible/make sense whatsoever?

I think the notation was misunderstood. Let's start from scratch, at least
notationally. You have a function

  y = f(b, x)

where `b` is the parameter vector, `x` is a vector of input points, and `y` is
the vector of outputs corresponding to those inputs. Now, you have data
consisting of vectors x0 and y0. According to the model, we have random
variables Y0[i] which are normally distributed about f(b, x0[i]), each with its
own variance v[i]. Equivalently, we can say that the residuals
R[i] = y0[i] - f(b, x0[i]) are distributed as

  R[i] ~ N(0, v[i])

Now, to solve this problem with leastsq(), we need to rescale the *residuals*
so that their corresponding random variables all have the same variance.
Dividing each residual by its standard deviation sqrt(v[i]) gives every one of
them unit variance:

  import numpy as np

  def residuals(b, x0=x0, y0=y0, v=v):
      # Dividing by the standard deviation gives unit-variance residuals.
      return (y0 - f(b, x0)) / np.sqrt(v)
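
For concreteness, here is a minimal end-to-end sketch using
scipy.optimize.leastsq. The exponential-decay model, the synthetic data, and
the per-point variances below are all made up for illustration; the only part
that matters is the division by sqrt(v) inside residuals().

  import numpy as np
  from scipy.optimize import leastsq

  # Hypothetical model, purely for illustration: y = b[0] * exp(-b[1] * x).
  def f(b, x):
      return b[0] * np.exp(-b[1] * x)

  # Synthetic data with a different (known) variance at each point.
  x0 = np.linspace(0.0, 4.0, 50)
  v = 0.01 + 0.05 * x0                     # per-point variances v[i]
  rng = np.random.RandomState(0)
  y0 = f([2.5, 1.3], x0) + rng.normal(scale=np.sqrt(v))

  def residuals(b, x0=x0, y0=y0, v=v):
      # Weighted residuals: each has unit variance under the model.
      return (y0 - f(b, x0)) / np.sqrt(v)

  b_fit, ier = leastsq(residuals, [1.0, 1.0])   # [1.0, 1.0] is the initial guess
  print(b_fit)

Everything besides the weighting is scaffolding; with your own f(), x0, y0, and
v, the residuals() function above is all you need to hand to leastsq().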

Does this make sense?

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco


