This demo gives a clear visual presentation of what happens during the AdaBoost algorithm. It shows how the decision boundary, example weights, training error and base learner weights change during training.
A selection of base learning algorithms is included: Linear Regression, Naive Bayes, Decision Stump, CART (requires the Statistics Toolbox), Neural Network (requires netlab) and SVM (requires libsvm). There are also three dataset generators (2-gaussians, circle and rotated checkerboard). Documentation is provided to assist with adding custom base learning algorithms or dataset generators.
The demo allows a choice of base learner and dataset. Base learners can then be added one at a time, each trained on the reweighted examples as prescribed by the AdaBoost algorithm.
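The per-round procedure the demo animates (train a learner on the current example weights, score it, then reweight the examples) can be sketched as follows. This is an illustrative Python sketch of discrete AdaBoost, not the demo's own code; `train_learner` is a hypothetical stand-in for any of the base learning algorithms above and is assumed to accept example weights and return a predictor.

```python
import numpy as np

def adaboost_round(X, y, weights, train_learner):
    """One round of discrete AdaBoost (illustrative sketch).

    y is in {-1, +1}; train_learner(X, y, weights) is a hypothetical
    base learning algorithm that returns a function predict(X) -> {-1, +1}.
    Assumes the weighted error lands strictly between 0 and 0.5.
    """
    h = train_learner(X, y, weights)
    pred = h(X)
    err = np.sum(weights[pred != y])               # weighted training error
    alpha = 0.5 * np.log((1 - err) / err)          # base learner weight
    weights = weights * np.exp(-alpha * y * pred)  # up-weight mistakes
    weights = weights / weights.sum()              # renormalise to sum to 1
    return h, alpha, weights
```

After each round, misclassified examples carry more weight, which is exactly the change in marker sizes the demo displays.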
After any number of base learners have been added, the decision boundary and margins are shown on the plot. Two further graphs are available: error rates (showing how AdaBoost affects training and generalisation error as more base learners are added) and margin distributions (showing the cumulative distribution of margins for the current ensemble).
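The margin-distribution graph plots, for each threshold, the fraction of training examples whose normalised margin is at most that threshold. A minimal sketch of that computation, assuming base learner predictions and labels in {-1, +1} (illustrative only, not the demo's code):

```python
import numpy as np

def margin_cdf(preds, alphas, y):
    """Cumulative margin distribution for a weighted ensemble (sketch).

    preds: (T, n) matrix of base learner predictions in {-1, +1}
    alphas: (T,) base learner weights; y: (n,) labels in {-1, +1}.
    Returns the sorted margins and, for each, the fraction of
    examples whose margin is at most that value.
    """
    alphas = np.asarray(alphas, dtype=float)
    # Normalised margin: y * weighted vote, scaled to lie in [-1, 1]
    margins = y * (alphas @ preds) / np.abs(alphas).sum()
    m = np.sort(margins)
    cdf = np.arange(1, len(m) + 1) / len(m)
    return m, cdf
```

A margin near +1 means the ensemble classifies the example correctly with near-unanimous weighted agreement; negative margins correspond to training errors.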
Base learners appear in a list at the left of the window. Each entry has a checkbox that enables or disables the learner and a scroll bar that adjusts its weight, making it possible to see the consequences of changing the weights assigned by AdaBoost.
The Reset button re-enables all the base learners and restores the weights assigned by AdaBoost. Right-clicking a checkbox disables all other learners, showing the impact of the selected base learner alone.
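Toggling a checkbox or dragging a scroll bar amounts to editing one coefficient in the ensemble's weighted vote. A sketch under that interpretation (hypothetical helper, not the demo's code), where zeroing an entry of `enabled` plays the role of unchecking a learner:

```python
import numpy as np

def ensemble_decision(preds, alphas, enabled):
    """Weighted-vote decision with per-learner enable flags (sketch).

    preds: (T, n) base learner predictions in {-1, +1}
    alphas: (T,) learner weights; enabled: (T,) 0/1 flags mirroring
    the demo's checkboxes (an illustrative interface, not its API).
    """
    a = np.asarray(alphas, dtype=float) * np.asarray(enabled, dtype=float)
    return np.sign(a @ preds)  # weighted majority vote per example
```

Disabling a heavily weighted learner can flip the decision on examples where the remaining learners disagree with it, which is what the changing decision boundary makes visible.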