Solved – the difference between AdaBoost.M2, AdaBoost.M1, Gentle AdaBoost, Real AdaBoost and the original AdaBoost

boosting, classification, data mining

I know some questions have been asked before, for example about the difference between Gentle AdaBoost and the original AdaBoost, but I was wondering how Gentle compares with the others.

For instance, I saw two different works discussing AdaBoost.M1 and AdaBoost.M2, and another work addressing only Gentle, Real, and the original AdaBoost.

It seemed to me that Gentle was the most recent and best of Gentle, Real, and the original, and that M2 was superior to M1.

I am not sure, however, what the difference is between (.M1 and .M2) on the one hand and Gentle, Real, and the original on the other.

There is an R package called ada that builds on Real, Gentle, and the original AdaBoost, but since it addresses neither .M1 nor .M2, I couldn't compare which one is better.

Best Answer

A survey of several of the variants can be found in Artur Ferreira's paper, 'Survey on Boosting Algorithms for Supervised and Semi-supervised Learning.'

Generally, the original (discrete) AdaBoost returns a binary-valued class: the sign of a weighted sum of the combined weak learners' ±1 predictions.
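A minimal pure-NumPy sketch of that scheme (function names are my own; decision stumps serve as the weak learner, and labels are assumed to be in {-1, +1}):

```python
import numpy as np

def fit_stump(X, y, w):
    # Exhaustive search over (feature, threshold, sign) for the decision
    # stump with the lowest weighted misclassification error.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] < t, -s, s)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, t, s)
    return best                                   # (error, feature, threshold, sign)

def stump_predict(stump, X):
    _, j, t, s = stump
    return np.where(X[:, j] < t, -s, s)

def discrete_adaboost(X, y, n_rounds=10):
    # Original (discrete) AdaBoost; labels assumed in {-1, +1}.
    w = np.full(len(y), 1.0 / len(y))             # start with uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = fit_stump(X, y, w)
        pred = stump_predict(stump, X)
        err = np.clip(stump[0], 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)     # vote weight for this round
        w = w * np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    # Final class = sign of the weighted vote over all rounds.
    votes = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```

The key point is the last line: the ensemble output is just a sign, so the prediction is hard ±1 with no notion of confidence.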

Real AdaBoost, by contrast, has each weak learner contribute a real-valued score based on an estimated probability of class membership, rather than a hard ±1 vote.

The other variants are covered in the paper but appear less frequently in the common literature. As I understand it, Gentle AdaBoost produces a more stable ensemble model because each weak learner's contribution is bounded. AdaBoost.M1 and AdaBoost.M2 are extensions to multi-class classification, with M2 removing M1's restriction that each weak classifier's weighted error stay below 1/2; both are described in 'Experiments with a New Boosting Algorithm' by Yoav Freund and Robert E. Schapire.
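To illustrate the stability point: Gentle AdaBoost replaces the log-odds vote with a Newton-style weighted least-squares step, so each round's contribution is a weighted mean of the labels and stays bounded in [-1, 1], whereas Real AdaBoost's log-odds votes can blow up on pure regions. A toy single-feature sketch under the same assumptions as before (hypothetical names, labels in {-1, +1}):

```python
import numpy as np

def gentle_adaboost_1d(x, y, thresholds, n_rounds=5):
    # Toy Gentle AdaBoost on one feature. Each round fits a weighted
    # least-squares step: the contribution on each side of the threshold is
    # the weighted mean of y there (equal to 2p - 1), bounded in [-1, 1].
    w = np.full(len(y), 1.0 / len(y))
    model = []                                    # (threshold, f_left, f_right)
    for _ in range(n_rounds):
        best = None
        for t in thresholds:
            left = x < t
            # weighted mean of y on each side = the least-squares constant fit
            f_l = np.average(y[left], weights=w[left]) if left.any() else 0.0
            f_r = np.average(y[~left], weights=w[~left]) if (~left).any() else 0.0
            f = np.where(left, f_l, f_r)
            sse = np.sum(w * (y - f) ** 2)        # weighted squared error
            if best is None or sse < best[0]:
                best = (sse, t, f_l, f_r)
        _, t, f_l, f_r = best
        f = np.where(x < t, f_l, f_r)
        w = w * np.exp(-y * f)                    # same exponential reweighting
        w /= w.sum()
        model.append((t, f_l, f_r))
    return model

def gentle_predict(model, x):
    # Final class = sign of the accumulated (bounded) contributions.
    F = sum(np.where(x < t, f_l, f_r) for t, f_l, f_r in model)
    return np.sign(F)
```

Because every per-round contribution lies in [-1, 1], no single weak learner can dominate the ensemble, which is the usual intuition behind Gentle AdaBoost's stability.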