A classification ensemble is a predictive model composed of a weighted combination of multiple classification models. In general, combining multiple classification models increases predictive performance.
To explore classification ensembles interactively, use the Classification Learner app.
For greater flexibility, use fitcensemble in the command-line interface to boost or bag classification trees, or to grow a random forest.
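As a minimal command-line sketch (using the ionosphere sample data set that ships with Statistics and Machine Learning Toolbox; the method and number of learning cycles here are illustrative choices), you might boost classification trees like this:

```matlab
% Boost 100 classification trees on the ionosphere data set.
load ionosphere                 % loads predictors X and labels Y
Mdl = fitcensemble(X,Y,'Method','AdaBoostM1','NumLearningCycles',100);

% Estimate the generalization error by 10-fold cross-validation.
cvMdl = crossval(Mdl);
kfoldLoss(cvMdl)
```

Passing 'Method','Bag' instead grows a bagged ensemble of trees, which is the basis of a random forest.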
For details on all supported ensembles, see Ensemble Algorithms. To reduce a multiclass problem to an ensemble of binary classification problems, train an error-correcting output codes (ECOC) model using fitcecoc.
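For example, an ECOC model can reduce the three-class Fisher iris problem to a set of binary learners (a sketch using the fisheriris sample data set that ships with the toolbox):

```matlab
% Train an ECOC model; by default each binary learner is a linear SVM
% in a one-versus-one coding design.
load fisheriris                 % loads measurements meas and labels species
Mdl = fitcecoc(meas,species);

% Classify an observation with average measurements.
predict(Mdl,mean(meas))
```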
Classification Learner: Train models to classify data using supervised machine learning.
Learn how to train ensembles of classifiers.
Construct ensembles of classifiers in the Classification Learner app.
Create a classification tree ensemble for the ionosphere data set, and use it to predict the classification of a radar return with average measurements.
Learn methods to evaluate the predictive quality of an ensemble.
Learn to set prior class probabilities or misclassification costs.
Train an ensemble of classification trees using data containing predictors with many categorical levels.
Automatically choose fewer weak learners for an ensemble in a way that does not diminish predictive performance.
Tune RobustBoost parameters for better predictive accuracy. (RobustBoost requires Optimization Toolbox.)
Use surrogate splits to obtain better predictions when your data have missing values.
Run TreeBagger on sample data.
Improve performance by running TreeBagger in parallel.
Learn how to use a random subspace ensemble to increase the accuracy of classification.
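The TreeBagger workflow mentioned above can be sketched as follows (a minimal example on the ionosphere data; the ensemble size of 50 trees is an illustrative choice):

```matlab
% Grow a bagged ensemble of 50 trees with out-of-bag estimation enabled.
load ionosphere                 % loads predictors X and labels Y
B = TreeBagger(50,X,Y,'OOBPrediction','on');

% Plot the out-of-bag misclassification error as the ensemble grows.
oobErr = oobError(B);
plot(oobErr)
xlabel('Number of grown trees')
ylabel('Out-of-bag classification error')
```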
A number of algorithms are available for ensemble learning.