[scikit-learn] LogisticRegression coef_ greater than n_features?

Sebastian Raschka mail at sebastianraschka.com
Mon Jan 7 23:54:50 EST 2019


Maybe check 

a) if the actual labels of the training examples don't start at 0
b) if you have gaps, e.g., if your unique training labels are 0, 1, 4, ..., 23
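
For what it's worth, here's a minimal sketch (with entirely made-up data; X, y, and the 22/24 numbers are just for illustration) showing that the first dimension of coef_ is set by the number of distinct class labels seen during fit, not by n_features:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up data: 22 features, but 24 distinct class labels (0..23)
rng = np.random.RandomState(0)
X = rng.randn(600, 22)
y = rng.randint(0, 24, size=600)

clf = LogisticRegression(max_iter=1000).fit(X, y)

print(np.unique(y))      # quick check of the label range and any gaps
print(clf.coef_.shape)   # (24, 22): one row of coefficients per class;
                         # only a binary problem gives the (1, n_features) shape

So if the 24 you're seeing runs along the class axis of coef_ rather than the feature axis, that would explain the mismatch.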

Best,
Sebastian

> On Jan 7, 2019, at 10:50 PM, pisymbol <pisymbol at gmail.com> wrote:
> 
> According to the docs (0.20.2), coef_ is supposed to have shape (1, n_features) for binary classification. Well, I created a Pipeline and performed a GridSearchCV to create a LogisticRegression model that does fairly well. However, when I went to rank feature importance, I noticed that coef_ for my best_estimator_ has 24 entries while my training data has 22 features.
> 
> What am I missing? How can coef_ have more entries than n_features?
> 
> -aps


