Solved – the difference between Recursive Feature Elimination and Backward Pass Feature Elimination

feature selection, machine learning, python, random forest, scikit-learn

In the case of Backward Pass Feature Elimination, you eliminate features starting with the ones that have the smallest Pearson correlation with the target. You try removing a feature: if the score goes down after its removal, you keep it; otherwise you let it go.
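To make that concrete, here is a minimal sketch of such a backward pass, assuming a pandas DataFrame X, a target y, and mean cross-validated R² as "the score" (the generated data, the RandomForestRegressor and the feature names are only illustrative):

```python
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Illustrative data; in practice X and y come from your own problem.
X_arr, y = make_regression(n_samples=200, n_features=8, n_informative=5,
                           noise=0.1, random_state=0)
X = pd.DataFrame(X_arr, columns=[f"f{i}" for i in range(X_arr.shape[1])])

model = RandomForestRegressor(n_estimators=100, random_state=0)

def cv_score(features):
    """Mean cross-validated R^2 using only the given subset of columns."""
    return cross_val_score(model, X[features], y, cv=5).mean()

# Try features in order of increasing absolute Pearson correlation with the target.
candidates = X.corrwith(pd.Series(y)).abs().sort_values().index
selected = list(X.columns)
baseline = cv_score(selected)

for feature in candidates:
    trial = [f for f in selected if f != feature]
    score = cv_score(trial)
    if score >= baseline:      # score did not go down -> let the feature go
        selected, baseline = trial, score
    # if the score dropped, the feature is kept

print("kept features:", selected)
```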

However, it is not clear to me what happens in RFE. Say you have N parameters for predicting the target. Do you then check all of the (N-1)-feature subsets, find the feature that is least significant for the prediction, remove it, move on to N-2 parameters, and repeat? (The description doesn't make that explicit.) I also don't understand the word recursive in the name…

Best Answer

It does exactly what you described. See also the scikit-learn user guide (section 1.13.3): 'recursive feature elimination (RFE) is to select features by recursively considering smaller and smaller sets of features [...] That procedure is recursively repeated on the pruned set until the desired number of features to select is eventually reached.' https://scikit-learn.org/stable/modules/feature_selection.html#rfe
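For concreteness, this is how the recursive procedure looks with scikit-learn's RFE class; the SVR estimator and the make_friedman1 data below are just illustrative, and any estimator exposing coef_ or feature_importances_ would work:

```python
from sklearn.datasets import make_friedman1
from sklearn.feature_selection import RFE
from sklearn.svm import SVR

# Illustrative data: 10 features, of which only the first 5 drive the target.
X, y = make_friedman1(n_samples=200, n_features=10, random_state=0)

# At every iteration RFE fits the estimator on the current feature set,
# ranks the features by the estimator's coef_ (or feature_importances_),
# drops the `step` least important ones, and repeats on the pruned set
# until n_features_to_select remain.
selector = RFE(SVR(kernel="linear"), n_features_to_select=5, step=1)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the features that survived
print(selector.ranking_)   # 1 = selected; larger ranks were eliminated earlier
```

If you don't want to fix the number of features in advance, RFECV runs the same procedure and picks the number of features by cross-validation.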