Solved – How to use AdaBoost weights in Decision Tree

boosting, cart, machine learning

I've implemented C4.5 and CART, and I want to compare them in my implementation of boosting (AdaBoost). I don't understand how to "connect" the weights that I am setting in AdaBoost to the decision tree algorithms.
At the beginning I have a data set that I split into training and testing sets. I set weights for the training set and build trees, then update the weights, but what next? Should I add some weight parameter to the gain ratio and Gini index of the trees? Or should I create the training set by sampling with or without replacement and use only part of the training set to build the next tree?
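For the first option, this is roughly what I mean by adding a weight parameter to the impurity: replace sample counts by sums of sample weights, e.g. a weighted Gini index (just a sketch, not my actual implementation):

```python
import numpy as np

def weighted_gini(y, w):
    """Gini impurity of a node where each sample carries a weight w[i]
    (sum of weights replaces the count of samples)."""
    total = w.sum()
    if total == 0:
        return 0.0
    impurity = 1.0
    for c in np.unique(y):
        p_c = w[y == c].sum() / total   # weighted class proportion
        impurity -= p_c ** 2
    return impurity

# Example: the heavily weighted sample dominates the impurity
y = np.array([0, 1, 1])
w = np.array([0.1, 0.8, 0.1])
print(weighted_gini(y, w))  # ~0.18, close to a pure "class 1" node
```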

Best Answer

AdaBoost for classification essentially builds multiple weak learners on re-weighted samples, and the final classifier is a weighted combination of all the weak learners.
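Concretely, with labels $y_i \in \{-1, +1\}$, the final classifier in the discrete AdaBoost formulation is a weighted vote of the $T$ weak learners $h_t$:

$$
F(x) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t\, h_t(x)\right),
\qquad
\alpha_t = \tfrac{1}{2}\ln\frac{1 - \varepsilon_t}{\varepsilon_t},
$$

where $\varepsilon_t$ is the weighted error rate of $h_t$ on the training set.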

So in each iteration of building one weak classifier, there are two kinds of weights to update: the weights of all the samples and the weight of this single weak learner, which is simply calculated from that learner's weighted error rate. Hope this helps.
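Here is a minimal sketch of that loop, assuming binary labels in $\{-1, +1\}$ and using scikit-learn's `DecisionTreeClassifier` (whose `fit` accepts per-sample weights) as a stand-in for your own C4.5/CART:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, T=50):
    """Discrete AdaBoost with decision stumps; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights, start uniform
    learners, alphas = [], []

    for _ in range(T):
        # 1) fit a weak learner on the current sample weights
        tree = DecisionTreeClassifier(max_depth=1)
        tree.fit(X, y, sample_weight=w)
        pred = tree.predict(X)

        # 2) weighted error rate and the learner's own weight alpha
        eps = w[pred != y].sum() / w.sum()
        if eps >= 0.5:                # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))

        # 3) re-weight samples: up-weight mistakes, down-weight hits
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

        learners.append(tree)
        alphas.append(alpha)

    return learners, np.array(alphas)

def adaboost_predict(learners, alphas, X):
    """Weighted vote of all weak learners."""
    votes = sum(a * t.predict(X) for a, t in zip(alphas, learners))
    return np.sign(votes)
```

With your own trees you can use the sample weights either way you mention: pass them into the split criterion (weighted gain ratio / Gini, as in the sketch in the question) or, if the tree cannot take weights, refit it on a resample of the training set drawn with probabilities proportional to the weights. Both variants are common.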
