[scikit-learn] Why ridge regression can solve multicollinearity?
lampahome
pahome.chen at mirlab.org
Wed Jan 8 21:38:02 EST 2020
Stuart Reynolds <stuart at stuartreynolds.net> wrote on Thu, Jan 9, 2020 at 10:33 AM:
> Correlated features typically have the property that they tend to be
> similarly predictive of the outcome.
>
> L1 and L2 are both a preference for low coefficients.
> If one coefficient can be reduced while another coefficient grows to
> maintain a similar loss, then these regularization methods prefer the
> solution with smaller coefficients.
> If you use L1 or L2, you should mean-center and variance-normalize your
> features first, so the penalty treats all coefficients on the same scale.
>
>
You mean LASSO and Ridge both address multicollinearity?
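The quoted point can be illustrated with a small sketch (the data here is synthetic and purely hypothetical): when two features are nearly identical copies of each other, ordinary least squares can assign them large opposite-sign coefficients, while Ridge's L2 penalty shrinks the pair toward a shared, small-norm solution.

```python
# Sketch: Ridge vs. OLS on two nearly collinear features (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)   # x2 is almost a copy of x1
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=n)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# OLS may split the signal into huge offsetting weights on x1 and x2;
# Ridge spreads it across the correlated pair with a smaller coefficient norm.
print("OLS coefficients:  ", ols.coef_)
print("Ridge coefficients:", ridge.coef_)
```

LASSO (`sklearn.linear_model.Lasso`) behaves differently on the same data: its L1 penalty tends to keep one of the correlated pair and zero out the other, rather than sharing the weight between them.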