[SciPy-dev] code for incremental least squares

Nathaniel Smith njs at pobox.com
Wed Feb 17 12:35:48 EST 2010


On Wed, Feb 17, 2010 at 7:10 AM,  <josef.pktd at gmail.com> wrote:
> In the description of some estimators and the optimization routines
> used, I have read that the updating is done on the Cholesky or QR
> decomposition of the inverse matrix ( (X'X)^{-1} or inverse Hessian).
> Except for simple least squares problems, the inverse is often more
> important than the original X'X.
>
> Are there algorithms available for this, and is there an advantage to
> either way? From what I have seen in incremental_ls, the updating
> works on the QR decomposition of X'X, and then the inverse of that
> decomposition is taken, assuming a well-behaved, non-singular X'X.

I'm not sure what you mean here -- when using QR for OLS, you take the
QR decomposition of X, not X'X (this is what gives the improved
numerical stability -- if X is ill-conditioned, then the condition
number of X'X is the square of X's, so you want to avoid forming it). See:
  http://en.wikipedia.org/wiki/Linear_least_squares#Orthogonal_decomposition_methods
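
As a quick NumPy illustration of the conditioning point (a toy example
I'm making up here, not code from incremental_ls):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    # make two columns nearly collinear so X is ill-conditioned
    X[:, 4] = X[:, 3] + 1e-6 * rng.standard_normal(100)

    print(np.linalg.cond(X))        # large
    print(np.linalg.cond(X.T @ X))  # roughly cond(X)**2 -- much worse

Working with X directly keeps you at cond(X); forming X'X squares it
before you even start solving.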
You do need to explicitly calculate inv(X'X) at the end in order to
get standard error estimates, and it's nice if you can re-use any
decompositions you have lying around to make this faster (for the QR
case, inv(X'X) = inv(R'Q'QR) = inv(R'R), which can take advantage of
R's triangular structure).
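
Concretely, here's a rough NumPy/SciPy sketch of that reuse (again a
made-up toy example, assuming X has full column rank, not the actual
incremental_ls code):

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 3))
    y = rng.standard_normal(100)

    Q, R = np.linalg.qr(X)               # economy QR: X = QR, R upper triangular
    beta = solve_triangular(R, Q.T @ y)  # back-substitution solves R beta = Q'y

    # inv(X'X) = inv(R'R) = inv(R) inv(R)', so invert the small triangular
    # R by back-substitution instead of ever forming X'X
    Rinv = solve_triangular(R, np.eye(R.shape[1]))
    cov_unscaled = Rinv @ Rinv.T         # equals inv(X'X)

Scale cov_unscaled by the residual variance estimate and you have the
usual covariance matrix for the coefficient standard errors.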

But I'm not sure what you're referring to when you say "some
estimators", or why those routines want to incrementally compute the
inverse Hessian, so I can't help you there.

-- Nathaniel
