A regression tree ensemble is a predictive model composed of
a weighted combination of multiple regression trees. In general,
combining multiple regression trees increases predictive performance.
To boost regression trees using least-squares boosting (LSBoost), use `fitrensemble`. To bag regression trees or to grow a random forest, use `fitrensemble` or `TreeBagger`. To implement quantile regression using a bag of regression trees, use `TreeBagger`.
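As a minimal sketch of the first workflow (on assumed synthetic data, not an example from this page), the following boosts 100 regression trees with LSBoost via `fitrensemble`:

```matlab
% Boost 100 regression trees with LSBoost (synthetic data for illustration).
rng(1);                                   % for reproducibility
X = rand(200,3);                          % 200 observations, 3 predictors
y = 5*X(:,1) - 2*X(:,2) + 0.1*randn(200,1);

Mdl = fitrensemble(X, y, 'Method', 'LSBoost', ...
    'NumLearningCycles', 100, 'LearnRate', 0.1);

yHat = predict(Mdl, X);                   % predictions from the boosted ensemble
fprintf('Resubstitution MSE: %.4f\n', resubLoss(Mdl));
```

`LearnRate` shrinks each tree's contribution; smaller values typically need more learning cycles but can generalize better.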
For classification ensembles, such as boosted or bagged classification trees, random subspace ensembles, or error-correcting output codes (ECOC) models for multiclass classification, see Classification Ensembles.
| Name | Description |
| --- | --- |
| Regression Learner (app) | Train regression models to predict data using supervised machine learning |
| `TreeBagger` | Create bag of decision trees |
| `fitrensemble` | Fit ensemble of learners for regression |
| `predict` | Predict responses using ensemble of bagged decision trees |
| `oobPredict` | Ensemble predictions for out-of-bag observations |
| `quantilePredict` | Predict response quantile using bag of regression trees |
| `oobQuantilePredict` | Quantile predictions for out-of-bag observations from bag of regression trees |
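To illustrate the bagging functions above, here is a minimal sketch (on assumed synthetic data) that grows a random forest with `TreeBagger` and compares in-bag predictions from `predict` with out-of-bag predictions from `oobPredict`:

```matlab
% Grow a random forest of 100 regression trees and get out-of-bag predictions
% (synthetic data for illustration).
rng(1);
X = rand(200,3);
y = 5*X(:,1) - 2*X(:,2) + 0.1*randn(200,1);

Mdl = TreeBagger(100, X, y, 'Method', 'regression', ...
    'OOBPrediction', 'on');               % store out-of-bag info for oobPredict

yHat    = predict(Mdl, X);                % ensemble predictions on training data
yHatOOB = oobPredict(Mdl);                % out-of-bag predictions
fprintf('Out-of-bag MSE: %.4f\n', mean((y - yHatOOB).^2));
```

Because each observation is predicted only by trees that did not see it during training, the out-of-bag error is an honest estimate of generalization error without a separate validation set.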
Learn about different algorithms for ensemble learning.
Obtain highly accurate predictions by using many weak learners.
Train a simple regression ensemble.
Learn methods to evaluate the predictive quality of an ensemble.
Select split predictors for random forests using the interaction test algorithm.
Automatically choose fewer weak learners for an ensemble in a way that does not diminish predictive performance.
Create a TreeBagger ensemble for regression.
Speed up computation by running TreeBagger in parallel.
Detect outliers in data using quantile random forest.
Estimate conditional quantiles of a response given predictor data using quantile random forest and by estimating the conditional distribution function of the response using kernel smoothing.
Tune quantile random forest using Bayesian optimization.
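The quantile-related topics above rely on `quantilePredict` and `oobQuantilePredict`. As a minimal sketch (on assumed synthetic data), the following estimates conditional quartiles of the response from a bagged ensemble:

```matlab
% Quantile regression with a bag of regression trees (synthetic data for
% illustration). Each row of Q holds the estimated 25th, 50th, and 75th
% percentiles of the response for one observation.
rng(1);
X = rand(200,3);
y = 5*X(:,1) - 2*X(:,2) + 0.5*randn(200,1);

Mdl = TreeBagger(100, X, y, 'Method', 'regression', ...
    'OOBPrediction', 'on');

tau  = [0.25 0.5 0.75];                         % quantile probabilities
Q    = quantilePredict(Mdl, X, 'Quantile', tau);
Qoob = oobQuantilePredict(Mdl, 'Quantile', tau); % same, for out-of-bag rows
```

A wide interquartile range (column 3 minus column 1 of `Q`) flags observations whose response is hard to pin down, which is the idea behind the outlier-detection topic listed above.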