fitensemble AdaBoostM1 stump picking

sedar sedar on 13 Sep 2012
I am running fitensemble with 'AdaBoostM1'. When I view the first tree, both the right and left decisions are class 1, even though I have two classes in my data. I think such a split has no effect. Why would the algorithm pick this stump or tree?

Decision tree for classification
1  if x3<70.5 then node 2 elseif x3>=70.5 then node 3 else 1
2  class = 1
3  class = 1
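For reference, a minimal sketch of the kind of call that produces this output (X and Y stand in for the predictor matrix and class labels, and 100 learners is just a placeholder):

ens = fitensemble(X, Y, 'AdaBoostM1', 100, 'Tree');  % boosted ensemble of default tree learners
view(ens.Trained{1})                                 % print the first weak learner as text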

Answers (1)

Ilya on 13 Sep 2012
A decision tree in a boosting ensemble by default minimizes the Gini diversity index, not classification error. Two child nodes originating from the same parent can be dominated by the same class. This situation is not uncommon.
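To illustrate with made-up numbers (these class proportions are hypothetical, not taken from the original data): a split can lower the weighted Gini index even though both children are dominated by class 1, in which case the classification error does not change at all.

giniParent = 1 - sum([0.70 0.30].^2)      % parent node is 70/30 between the classes -> 0.42

% Suppose half of the observations go to each child:
% left child is 85/15 and right child is 55/45, both majority class 1.
giniLeft   = 1 - sum([0.85 0.15].^2);     % 0.255
giniRight  = 1 - sum([0.55 0.45].^2);     % 0.495
giniSplit  = 0.5*giniLeft + 0.5*giniRight % 0.375, lower than 0.42, so the tree makes the split

% The misclassification rate stays at 0.30 (0.5*0.15 + 0.5*0.45),
% because both leaves predict class 1.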
  1 Comment
Reda on 21 May 2014
Hello, I noticed the same problem as sedar sedar. When boosting a classification tree with AdaBoost, the learners are all stumps. This is counter-intuitive, especially since fitting a single classification tree with the same parameters gives a much deeper tree. So why is the first learner not as deep?
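For anyone who wants deeper weak learners, one option is to pass an explicit tree template instead of the default 'Tree' learner. A minimal sketch, assuming the node-size controls are what keep the default boosted trees shallow (the MinParent value of 10 is only illustrative):

% Tree template that allows any node with at least 10 observations to be split,
% instead of the shallow default fitensemble uses for boosting.
t = ClassificationTree.template('MinParent', 10);
ens = fitensemble(X, Y, 'AdaBoostM1', 100, t);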

