Solved – Feature selection with XGBoost

boosting, feature selection, hyperparameter

XGBoost will produce different feature importance values on the same dataset under different hyperparameters. When using XGBoost as a feature selection algorithm for a different model, should I therefore optimize the hyperparameters first? Or are there no hard and fast rules, so that in practice I should try, say, both the default and the optimized hyperparameters and see which works better?
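To make the premise concrete, here is a minimal sketch of the effect the question describes, using synthetic data and two illustrative hyperparameter settings (both the dataset and the settings are assumptions for demonstration, not from the original post):

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification

# Synthetic classification data with a few informative features.
X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=4, random_state=0)

# Two hyperparameter settings: the library defaults and an arbitrary
# "tuned" configuration chosen only to contrast with the defaults.
default_model = xgb.XGBClassifier(random_state=0)
tuned_model = xgb.XGBClassifier(max_depth=2, n_estimators=500,
                                learning_rate=0.05, random_state=0)

default_model.fit(X, y)
tuned_model.fit(X, y)

# The importance rankings below will generally disagree between the
# two settings, which is exactly the instability the question asks about.
print("default ranking:", np.argsort(default_model.feature_importances_)[::-1])
print("tuned ranking  :", np.argsort(tuned_model.feature_importances_)[::-1])
```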

Best Answer

From the comments, Matthew Drury writes:

You shouldn't use xgboost as a feature selection algorithm for a different model. Different models use different features in different ways. There's no reason to believe that features important for one will work in the same way for another.
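One way to see this point in practice is to compute a model-agnostic importance measure, such as scikit-learn's permutation importance, for two different model families on the same data. The sketch below (again using an assumed synthetic dataset and an arbitrary pair of models) will typically produce different rankings for the boosted trees and the linear model, so a feature set selected by one need not suit the other:

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (xgb.XGBClassifier(random_state=0),
              LogisticRegression(max_iter=1000)):
    model.fit(X_train, y_train)
    # Permutation importance measures how much held-out performance
    # drops when each feature is shuffled, per model.
    result = permutation_importance(model, X_test, y_test,
                                    n_repeats=10, random_state=0)
    # The two models usually rank features differently.
    print(type(model).__name__,
          np.argsort(result.importances_mean)[::-1])
```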
