Solved – Choosing an algorithm when there is only one feature

classification · ensemble-learning · feature-selection · predictive-models

I am combining multiple base classifiers into an ensemble classifier. Data from different sensors, such as an accelerometer, a gyroscope, and an altimeter, are classified individually, and the outputs are then fed into an ensemble classifier.

For the accelerometer, 12 features are extracted from specific time windows and a random forest is used.
For the altimeter, only one feature is extracted, so I am wondering which algorithm would be best here.

I know that random forests work better when there are many features, so I was thinking of using naive Bayes or logistic regression, but I cannot find any relevant literature to back this up.

Best Answer

Gradient boosting still works well when there are few features, and it excels at capturing non-linearities in the data. Logistic regression can only capture non-linearities if you manually add polynomial (or otherwise transformed) terms; gradient boosting finds them automatically. In any case, try multiple algorithms: gradient boosting, random forest, and logistic regression, and see which performs best under cross-validation or on a holdout set. Literature can only offer guidelines, not guarantees, so you can only truly know by testing on your own data.
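
A minimal sketch of that comparison, using scikit-learn. The synthetic altimeter feature and the threshold-based labels below are assumptions standing in for your real data; swap in your own single-column `X` and labels `y`. The label rule is deliberately non-linear in the feature, to illustrate why logistic regression on the raw feature can fall behind tree-based methods:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the single altimeter feature (an assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 1))              # one feature, shape (n_samples, 1)
y = (np.abs(X[:, 0]) > 1.0).astype(int)    # hypothetical non-linear label rule

models = {
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "logistic regression": LogisticRegression(),
}

# Mean 5-fold cross-validated accuracy for each candidate.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, score in scores.items():
    print(f"{name}: {score:.3f}")
```

On data like this, the tree-based models recover the threshold rule while plain logistic regression cannot, since the rule is not monotone in the feature; on your real altimeter feature the ranking may differ, which is exactly why you run the comparison.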