Resume training ensemble
ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)
ens1 = resume(ens,nlearn) trains ens for nlearn more cycles. resume uses the same training options that fitrensemble used to create ens, except for parallel training options. If you want to resume training in parallel, pass the 'Options' name-value pair.
ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.
A regression ensemble, created with fitrensemble.
A positive integer, the number of cycles for additional training of ens.
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
Printout frequency, a positive integer scalar or 'off' (no printouts). When you specify a positive integer, resume displays a message each time it finishes training that many weak learners.
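As a sketch of the 'NPrint' option (the data set and training call here are illustrative, borrowed from the carsmall example on this page):

```matlab
% Train an initial ensemble on the carsmall data.
load carsmall
X = [Displacement Horsepower Weight];
ens = fitrensemble(X,MPG,'NumLearningCycles',50);

% Resume for 100 more cycles, printing a progress message
% after every 25 additional weak learners.
ens1 = resume(ens,100,'NPrint',25);
```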
For fastest training of some boosted decision trees, bin the numeric predictors by using the 'NumBins' name-value pair argument when you originally train the ensemble with fitrensemble.
Options for computing in parallel and setting random numbers, specified as a structure. Create the Options structure with statset. You need Parallel Computing Toolbox™ to compute in parallel. You can use the same parallel options for resume that you used when you originally trained the ensemble.
For dual-core systems and above, fitrensemble parallelizes training using Intel® Threading Building Blocks (TBB).
The regression ensemble ens1, constructed by training ens for nlearn more cycles.
Train a regression ensemble for 50 cycles, and compare its resubstitution error with the error obtained after training the ensemble for 50 more cycles.
Load the carsmall data set and select displacement, horsepower, and vehicle weight as predictors.
load carsmall
X = [Displacement Horsepower Weight];
Train a regression ensemble for 50 cycles and examine the resubstitution error.
ens = fitrensemble(X,MPG,'NumLearningCycles',50);
L = resubLoss(ens)
L = 0.5563
Train for 50 more cycles and examine the new resubstitution error.
ens = resume(ens,50);
L = resubLoss(ens)
L = 0.3463
The resubstitution error is lower in the new ensemble than in the original.
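You can call resume repeatedly. As an illustrative sketch (not part of the original example), resume in increments and record the resubstitution error after each round of additional training:

```matlab
% Track resubstitution error as the ensemble grows 50 cycles at a time.
load carsmall
X = [Displacement Horsepower Weight];
ens = fitrensemble(X,MPG,'NumLearningCycles',50);
L = resubLoss(ens);
for k = 1:4
    ens = resume(ens,50);        % train for 50 more cycles each pass
    L(end+1) = resubLoss(ens);   %#ok<SAGROW>
end
L   % errors after 50, 100, 150, 200, and 250 cycles
```

Note that the resubstitution error measures fit on the training data, so it typically decreases as cycles are added; it is not an estimate of generalization error.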
resume supports parallel training through the 'Options' name-value argument. Create options using statset, such as options = statset('UseParallel',true).
Parallel ensemble training requires you to set the 'Method' name-value argument to 'Bag' when you originally train the ensemble. Parallel training is available only for tree learners, the default type for 'Bag'.
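Putting these requirements together, a minimal parallel-resume sketch (assuming Parallel Computing Toolbox is available) might look like this:

```matlab
% Train a bagged tree ensemble; 'Method','Bag' is required for parallel training.
load carsmall
X = [Displacement Horsepower Weight];
ens = fitrensemble(X,MPG,'Method','Bag','NumLearningCycles',50);

% Resume for 50 more cycles, computing in parallel.
options = statset('UseParallel',true);
ens1 = resume(ens,50,'Options',options);
```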