The sum of feature importances != 1
June 21, 2016
4:25 a.m.
Hi,

When I run the following code:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_classification(n_samples=100)
    clf = GradientBoostingClassifier(random_state=0).fit(X, y)
    imp = clf.feature_importances_
    print("The sum of feature importances:", sum(imp))

the sum of the feature importances is not always equal to 1. Do you have an explanation for this behaviour? Also, if a tree contains only a root node, can we say that all of its feature importances are 0? My guess is that such root-only trees affect the sum of the feature importances. Is that right?

Best,
Enhui
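P.S. One way to check this guess, as a minimal sketch, assuming scikit-learn's estimators_ array of per-stage regression trees and the tree_.node_count attribute:

    # estimators_ holds one DecisionTreeRegressor per boosting stage (and
    # per class). A tree with node_count == 1 is just a root: it made no
    # split, so its own feature importances should be all zeros.
    trees = clf.estimators_.ravel()
    root_only = [t for t in trees if t.tree_.node_count == 1]
    print("Root-only trees:", len(root_only), "of", trees.size)
    for t in root_only:
        print(t.feature_importances_.sum())  # expected: 0.0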