A regression tree ensemble is a predictive model composed of a weighted combination of multiple regression trees. In general, combining multiple regression trees increases predictive performance. To boost regression trees using LSBoost, use fitrensemble. To bag regression trees or to grow a random forest, use TreeBagger. To implement quantile regression using a bag of regression trees, use TreeBagger.
For classification ensembles, such as boosted or bagged classification trees, random subspace ensembles, or error-correcting output codes (ECOC) models for multiclass classification, see Classification Ensembles.
| Name | Description |
| --- | --- |
| Regression Learner | Train regression models to predict data using supervised machine learning |
| RegressionEnsemble Predict | Predict responses using ensemble of decision trees for regression |
| TreeBagger | Create bag of decision trees |
| fitrensemble | Fit ensemble of learners for regression |
| predict | Predict responses using ensemble of bagged decision trees |
| oobPredict | Ensemble predictions for out-of-bag observations |
| quantilePredict | Predict response quantile using bag of regression trees |
| oobQuantilePredict | Quantile predictions for out-of-bag observations from bag of regression trees |
| lime | Local interpretable model-agnostic explanations (LIME) |
| partialDependence | Compute partial dependence |
| plotPartialDependence | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| predictorImportance | Estimates of predictor importance for regression ensemble |
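The quantile functions in the table operate on a bagged ensemble. A short sketch (again assuming the carsmall sample data set):

```matlab
% Sketch: quantile regression with a bag of regression trees.
load carsmall
X = [Horsepower Weight];

% Grow a bagged ensemble; keep out-of-bag info for oob quantile predictions
Mdl = TreeBagger(100, X, MPG, 'Method', 'regression', 'OOBPrediction', 'on');

% Quartile and median predictions at the training points
yq = quantilePredict(Mdl, X, 'Quantile', [0.25 0.5 0.75]);

% Out-of-bag median prediction, one row per training observation
yqOOB = oobQuantilePredict(Mdl, 'Quantile', 0.5);
```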
Learn about different algorithms for ensemble learning.
Obtain highly accurate predictions by using many weak learners.
Train a simple regression ensemble.
Learn methods to evaluate the predictive quality of an ensemble.
Select split predictors for random forests using the interaction test algorithm.
Automatically choose fewer weak learners for an ensemble in a way that does not diminish predictive performance.
Create a TreeBagger ensemble for regression.
Speed up computation by running in parallel.
Detect outliers in data using quantile random forest.
Estimate conditional quantiles of a response given predictor data using quantile random forest and by estimating the conditional distribution function of the response using kernel smoothing.
Tune quantile random forest using Bayesian optimization.
Train a regression ensemble model with optimal hyperparameters, and then use the RegressionEnsemble Predict block for response prediction.
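A hedged sketch of training with optimized hyperparameters (the carsmall data set is assumed; by default, `'OptimizeHyperparameters','auto'` searches over parameters such as the ensemble method, number of learning cycles, learning rate, and minimum leaf size via Bayesian optimization):

```matlab
% Sketch: train a regression ensemble with automatically tuned hyperparameters.
load carsmall
X = [Horsepower Weight];

% Bayesian optimization over a default set of ensemble hyperparameters
Mdl = fitrensemble(X, MPG, 'OptimizeHyperparameters', 'auto');

% Use the tuned model for prediction
yfit = predict(Mdl, X);
```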