AdaBoost
AdaBoost (adaptive boosting) is an ensemble learning algorithm that can be used for classification or regression. Although AdaBoost is more resistant to overfitting than many machine learning algorithms, it is often sensitive to noisy data and outliers.
AdaBoost is called adaptive because it uses multiple iterations to generate a single strong learner. AdaBoost creates the strong learner (a classifier that is well-correlated with the true classifier) by iteratively adding weak learners (classifiers that are only slightly correlated with the true classifier). During each round of training, a new weak learner is added to the ensemble and a weighting vector is adjusted to focus on examples that were misclassified in previous rounds.
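The loop described above can be sketched in a few lines of code. The following is a minimal, illustrative Python implementation (not the MATLAB toolbox routine) of AdaBoost for binary classification with decision stumps as weak learners; the function names and the toy interval dataset are invented for the example.

```python
import numpy as np

def fit_adaboost(X, y, n_rounds=10):
    """Toy AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # example weights, uniform at first
    ensemble = []                       # list of (alpha, feature, threshold, sign)
    for _ in range(n_rounds):
        best = None
        # exhaustively search for the stump with the lowest weighted error
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = max(err, 1e-10)           # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak learner
        pred = s * np.where(X[:, j] <= t, 1, -1)
        # up-weight misclassified examples so the next round focuses on them
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict_adaboost(ensemble, X):
    """Strong learner: sign of the alpha-weighted vote of the weak learners."""
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.where(score >= 0, 1, -1)
```

For example, no single axis-aligned stump can separate an interval concept such as y = +1 only for x in [2, 4], but a few boosted stumps classify it exactly, illustrating how weak learners combine into a strong one.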
You can perform adaptive boosting in MATLAB with Statistics Toolbox, which includes several boosting algorithms.
See also: machine learning
Data Driven Fitting with MATLAB (Webinar, 36:26)