[SciPy-user] linear regression

josef.pktd at gmail.com
Wed May 27 12:19:35 EDT 2009


On Wed, May 27, 2009 at 12:08 PM, Skipper Seabold <jsseabold at gmail.com> wrote:
> On Wed, May 27, 2009 at 11:59 AM, Skipper Seabold <jsseabold at gmail.com> wrote:
>> On Wed, May 27, 2009 at 11:54 AM, ms <devicerandom at gmail.com> wrote:
>>> josef.pktd at gmail.com wrote:
>>>> On Wed, May 27, 2009 at 10:05 AM, ms <devicerandom at gmail.com> wrote:
>>>>> jason-sage at creativetrax.com wrote:
>>>>>> Is there a recommended way now of calculating the slope of a linear
>>>>>> regression?  Using the scipy.stats.linregress function gives a
>>>>>> deprecation warning, apparently because that function uses the
>>>>>> scipy.mean function:
>>>>> I think you can use polyfit to do linear regression, can't you?
>>>>
>>>> But you don't get the slope coefficient and the standard errors if
>>>> you want more than just prediction.
>>>
>>> You mean the correlation coefficient? This is numpy.corrcoef() or
>>> something like that.
>>
>> He means that polyfit does not provide the Betas and their associated
>> standard errors in a fit of, for example, y = Beta * x + Beta2 * x**2.
>> It will only give you the predictions (i.e., the Y-hats) for your
>> data based on the fit.
>
> Err, sorry, I don't think this is right for polyfit after having a
> look.  One day I will learn to look before I leap...
>
> Have a look here <http://www.scipy.org/Cookbook/LinearRegression>
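
For the one-regressor slope the original question asked about,
scipy.stats.linregress already returns the slope and its standard error
(the deprecation warning quoted above is apparently about scipy.mean,
not about the regression itself). A minimal sketch with made-up data:

import numpy as np
from scipy import stats

x = np.arange(20.0)
y = 3.0 + 0.5 * x + np.random.normal(scale=0.2, size=x.shape)

# linregress returns the slope, intercept, correlation coefficient,
# p-value and the standard error of the slope estimate
slope, intercept, r_value, p_value, std_err = stats.linregress(x, y)
print(slope, intercept, std_err)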

y = Beta0 + Beta1 * x + Beta2 * x**2 is the second-order polynomial.
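
Just to illustrate, a minimal sketch (again with made-up data) of what
polyfit does give you for that polynomial: the coefficients, highest
order first, and the fitted values.

import numpy as np

x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + 0.5 * x**2 + np.random.normal(scale=2.0, size=x.shape)

# coefficients come back highest order first: [Beta2, Beta1, Beta0]
coeffs = np.polyfit(x, y, 2)
y_hat = np.polyval(coeffs, x)   # fitted values (Y-hats)
print(coeffs)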

I should have looked too: polyfit returns the polynomial coefficients,
but it doesn't calculate the variance-covariance matrix or the standard
errors of the OLS estimates.
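
If you do want those, here is a rough sketch of computing them yourself
from the usual OLS formulas with plain numpy (made-up data again; take
it as a sketch, not tested code):

import numpy as np

x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + 0.5 * x**2 + np.random.normal(scale=2.0, size=x.shape)

# design matrix for the quadratic model: columns [1, x, x**2]
X = np.column_stack((np.ones_like(x), x, x**2))

# OLS estimate via the normal equations
beta = np.linalg.solve(X.T.dot(X), X.T.dot(y))

resid = y - X.dot(beta)
n, k = X.shape
sigma2 = resid.dot(resid) / (n - k)            # residual variance
cov_beta = sigma2 * np.linalg.inv(X.T.dot(X))  # variance-covariance matrix
std_err = np.sqrt(np.diag(cov_beta))           # standard errors of the coefficients

print(beta)
print(std_err)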

Josef


