
Class: ClassificationPartitionedEnsemble

Resume training learners on cross-validation folds


Syntax
ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)


Description
ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options that fitensemble used to create ens.

ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.

Input Arguments


ens

A cross-validated classification ensemble. ens is the result of either:

  • The fitensemble function with a cross-validation name-value pair. The names are 'crossval', 'kfold', 'holdout', 'leaveout', or 'cvpartition'.

  • The crossval method applied to a classification ensemble.
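For reference, both construction routes can be sketched as follows. This is a minimal illustration, not part of the original example; the learner type, ensemble size, and fold count are assumptions, and the ionosphere data set is the one used in the example later on this page.

```matlab
% Sketch: two ways to obtain a cross-validated classification ensemble.
% Learner counts and fold numbers here are illustrative assumptions.
load ionosphere

% 1) Cross-validate at training time with a cross-validation
%    name-value pair passed to fitensemble:
cvens1 = fitensemble(X,Y,'AdaBoostM1',50,'Tree','kfold',5);

% 2) Train a full classification ensemble first, then apply
%    the crossval method to it:
ens    = fitensemble(X,Y,'AdaBoostM1',50,'Tree');
cvens2 = crossval(ens,'kfold',5);
```

Either route produces a ClassificationPartitionedEnsemble that you can pass to resume.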


nlearn

A positive integer, the number of additional training cycles for ens in every fold.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.


'nprint'

Printout frequency, a positive integer scalar or 'off' (no printouts). When set to a positive integer, resume displays at the command line the number of weak learners trained so far. This option is useful when you train an ensemble with many learners on a large data set.

Default: 'off'
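As a usage sketch, the printout option can be combined with resume like this. The cycle count and printout frequency are illustrative assumptions, and cvens is assumed to be a ClassificationPartitionedEnsemble such as the one built in the example below.

```matlab
% Sketch: resume cross-validated training for 50 more cycles,
% printing progress after every 10 newly trained weak learners.
cvens = resume(cvens,50,'nprint',10);
```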

Output Arguments


ens1

The cross-validated classification ensemble ens, augmented with the additional training.


Examples
Train a partitioned classification ensemble for 10 cycles. Examine the error. Then train for 10 more cycles and examine the new error.

load ionosphere
cvens = fitensemble(X,Y,'GentleBoost',10,'Tree',...
    'crossval','on');
L = kfoldLoss(cvens)

L =

cvens = resume(cvens,10);
L = kfoldLoss(cvens)

L =

The ensemble has lower cross-validation error after training for ten more cycles.
