resume

Class: RegressionPartitionedEnsemble

Resume training ensemble

Syntax

ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)

Description

ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options that fitensemble used to create ens.

ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.
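For instance, a minimal sketch using the carsmall sample data set (the variable names are illustrative, not prescribed):

load carsmall
X = [Displacement Horsepower Weight];
% Create a 5-fold cross-validated boosted regression ensemble
cvens = fitensemble(X,MPG,'LSBoost',50,'Tree','kfold',5);
% Train every fold for 25 more cycles with the original training options
cvens = resume(cvens,25);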

Input Arguments

ens

A cross-validated regression ensemble. ens is the result of either:

  • The fitensemble function with a cross-validation name-value pair. The names are 'crossval', 'kfold', 'holdout', 'leaveout', or 'cvpartition'.

  • The crossval method applied to a regression ensemble. Both routes are sketched below.
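A sketch of both routes, using the carsmall sample data set (variable names are illustrative):

load carsmall
X = [Displacement Horsepower Weight];
% Route 1: pass a cross-validation name-value pair to fitensemble
cvens1 = fitensemble(X,MPG,'LSBoost',50,'Tree','kfold',5);
% Route 2: apply the crossval method to a trained regression ensemble
ens = fitensemble(X,MPG,'LSBoost',50,'Tree');
cvens2 = crossval(ens);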

nlearn

A positive integer, the number of cycles for additional training of ens.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'nprint'

Printout frequency, a positive integer scalar or 'off' (no printouts). When you specify a positive integer, resume displays the number of weak learners trained so far at the command line every nprint training cycles. This option is useful when you train ensembles with many learners on large data sets (see the sketch below).

Default: 'off'
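For example, this sketch (again using the carsmall sample data; variable names are illustrative) reports progress while resuming training:

load carsmall
X = [Displacement Horsepower Weight];
cvens = fitensemble(X,MPG,'LSBoost',50,'Tree','kfold',5);
% Print the number of weak learners trained so far every 10 cycles
cvens = resume(cvens,100,'nprint',10);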

Output Arguments

ens1

The cross-validated regression ensemble ens, augmented with additional training.

Examples

Train a regression ensemble for 50 cycles and cross-validate it. Examine the cross-validation error. Then train for 50 more cycles and examine the new cross-validation error.

load carsmall
X = [Displacement Horsepower Weight];
ens = fitensemble(X,MPG,'LSBoost',50,'Tree');
cvens = crossval(ens);
L = kfoldLoss(cvens)

L =
   25.6573

cvens = resume(cvens,50);
L = kfoldLoss(cvens)

L =
   26.7563

The additional training did not improve the cross-validation error.
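To see how the cross-validation error evolves as learners are added, one option is the cumulative mode of kfoldLoss (a sketch continuing from the variables above; it assumes your release supports the 'mode','cumulative' option of kfoldLoss):

% Cross-validation loss after each of the 100 weak learners
L = kfoldLoss(cvens,'mode','cumulative');
plot(L)
xlabel('Number of trained weak learners')
ylabel('Cross-validation mean squared error')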
