Solved – How does GBM for classification work?

boosting, classification, loss-functions, residuals

I have got a fair idea of how it works in regression, where each successive decision tree predicts the residual (the negative gradient of the loss function) and its prediction is added to the output of the previous trees.
Can someone please explain how this works in the case of classification? What is the residual in that case?

Best Answer

In fact, there is not much difference between regression and classification. The only difference is the loss function: in regression the model minimizes, e.g., the RMSE, while in classification it minimizes the logistic (log) loss. The "residual" is then the negative gradient of that loss with respect to the model's current raw score $F(x)$. For binary log loss this pseudo-residual is $y - p$, where $p = \sigma(F(x))$ is the current predicted probability. Each new tree is fit to $y - p$, and its (scaled) output is added to the raw log-odds score $F(x)$, not directly to a probability.
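To make this concrete, here is a minimal sketch of gradient boosting for binary classification with log loss, using depth-1 regression stumps on a hypothetical 1-D toy dataset. Note one simplification: real GBM implementations compute leaf values with a Newton step, whereas this sketch simply fits each stump to the pseudo-residuals $y - p$ by least squares.

```python
import math

# Hypothetical toy 1-D binary classification data
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_stump(xs, rs):
    """Fit a depth-1 regression tree (a stump) to the pseudo-residuals
    by minimizing squared error over candidate split thresholds."""
    best = None
    for thr in xs:
        left = [r for x, r in zip(xs, rs) if x <= thr]
        right = [r for x, r in zip(xs, rs) if x > thr]
        if not left or not right:
            continue
        lm = sum(left) / len(left)    # leaf value = mean residual
        rm = sum(right) / len(right)  # (real GBM uses a Newton step here)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x <= thr else rm

# Start from raw scores (log-odds) of zero, then repeatedly fit stumps
# to the pseudo-residuals y - p, the negative gradient of the log loss.
F = [0.0] * len(X)
learning_rate = 0.5
for _ in range(20):
    p = [sigmoid(f) for f in F]                         # current probabilities
    residuals = [yi - pi for yi, pi in zip(y, p)]       # pseudo-residuals
    stump = fit_stump(X, residuals)
    F = [f + learning_rate * stump(x) for f, x in zip(F, X)]

probs = [sigmoid(f) for f in F]  # final predicted probabilities
```

The key point the sketch illustrates: the trees never predict class labels or probabilities directly; they predict corrections in log-odds space, and only the final sum of scores is pushed through the sigmoid.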

Details can be found here.

Regularization methods for logistic regression
