[scikit-learn] GridSearchCV returns a worse score the broader the parameter space gets

Andreas Tosstorff andt88 at hotmail.com
Sun Mar 31 06:15:36 EDT 2019


Dear all,

I am new to scikit-learn, so please excuse my ignorance. I am trying to optimize a DecisionTreeRegressor with GridSearchCV, but the broader I make the parameter space, the worse the score gets.

Setting min_samples_split to range(2, 10) gives me a neg_mean_squared_error of -0.04; setting it to range(2, 5), the score is -0.004.

from sklearn import tree
from sklearn.model_selection import GridSearchCV

# With a single scoring metric, refit=True is enough;
# a refit string is only needed for multi-metric scoring.
simple_tree = GridSearchCV(tree.DecisionTreeRegressor(random_state=42), n_jobs=4,
                           param_grid={'min_samples_split': range(2, 10)},
                           scoring='neg_mean_squared_error', cv=10, refit=True)

simple_tree.fit(x_tr, y_tr).score(x_tr, y_tr)
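For reference, here is a self-contained sketch of the comparison between the two grids. Since x_tr and y_tr are not shown in the post, it substitutes synthetic data (an assumption); the point is only to make the two runs directly comparable. Note that GridSearchCV.score uses the scoring parameter (here neg_mean_squared_error) and that evaluating on the same data used for fitting measures training error, not generalization.

```python
import numpy as np
from sklearn import tree
from sklearn.model_selection import GridSearchCV

# Synthetic stand-ins for x_tr / y_tr, which the original post does not show.
rng = np.random.RandomState(42)
x_tr = rng.rand(200, 3)
y_tr = x_tr[:, 0] + 0.1 * rng.randn(200)

scores = {}
for upper in (5, 10):
    gs = GridSearchCV(
        tree.DecisionTreeRegressor(random_state=42),
        param_grid={'min_samples_split': range(2, upper)},
        scoring='neg_mean_squared_error',
        cv=10,
        refit=True,
    )
    gs.fit(x_tr, y_tr)
    # .score() applies the scoring parameter; evaluated here on training data.
    scores[upper] = gs.score(x_tr, y_tr)

print(scores)
```

The broader grid can select a larger min_samples_split via cross-validation, and a more regularized tree fits the training data less closely, so its training-set neg_mean_squared_error can legitimately be worse.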


I would expect the more extensive grid search to yield an equal or better score than the less extensive one.

I would really appreciate your help!

Kind regards,
Andreas

