resume

Class: ClassificationPartitionedEnsemble

Resume training learners on cross-validation folds

Syntax

ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)

Description

ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options fitensemble used to create ens.

ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.

Input Arguments

ens

A cross-validated classification ensemble. ens is the result of either of the following (see the sketch after this list):

  • The fitensemble function with a cross-validation name-value pair. The names are 'crossval', 'kfold', 'holdout', 'leaveout', or 'cvpartition'.

  • The crossval method applied to a classification ensemble.
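Either route produces a valid ens. A minimal sketch, where the GentleBoost/Tree configuration and the five-fold setting are illustrative choices rather than requirements:

load ionosphere
% Route 1: cross-validate while training, using a cross-validation name-value pair
cv1 = fitensemble(X,Y,'GentleBoost',10,'Tree','kfold',5);
% Route 2: train a full ensemble, then cross-validate it with the crossval method
full = fitensemble(X,Y,'GentleBoost',10,'Tree');
cv2 = crossval(full,'kfold',5);
% Either cv1 or cv2 can be passed to resume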

nlearn

A positive integer, the number of cycles for additional training of ens.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'nprint'

Printout frequency, a positive integer scalar or 'off' (no printouts). When set to a positive integer, resume reports to the command line the number of weak learners trained so far, at that interval. This is useful when you train ensembles with many learners on large data sets.

Default: 'off'
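For example, to see progress messages while adding 20 more cycles, a minimal sketch (ens here stands for any cross-validated ensemble, and printing every 5 learners is an arbitrary choice):

% Report progress after every 5 additional weak learners
ens1 = resume(ens,20,'nprint',5);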

Output Arguments

ens1

The cross-validated classification ensemble ens, augmented with additional training.

Examples

Train a partitioned classification ensemble for 10 cycles. Examine the error. Then train for 10 more cycles and examine the new error.

load ionosphere
cvens = fitensemble(X,Y,'GentleBoost',10,'Tree',...
    'crossval','on');
L = kfoldLoss(cvens)

L =
    0.0883

cvens = resume(cvens,10);
L = kfoldLoss(cvens)

L =
    0.0769

The ensemble has lower cross-validation error after training for ten more cycles.
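To keep adding cycles and watch how the error evolves, you can call resume repeatedly. A minimal sketch continuing the example above (the three increments of ten cycles are an arbitrary choice):

% Add cycles in increments of 10 and report the loss after each increment
for k = 1:3
    cvens = resume(cvens,10);
    fprintf('After %d extra cycles: kfoldLoss = %.4f\n',k*10,kfoldLoss(cvens));
end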
