oobLoss

Class: ClassificationBaggedEnsemble

Out-of-bag classification error

Syntax

L = oobLoss(ens)
L = oobLoss(ens,Name,Value)

Description

L = oobLoss(ens) returns the classification error for ens computed for out-of-bag data.

L = oobLoss(ens,Name,Value) computes the error with additional options specified by one or more Name,Value pair arguments. You can specify several name-value pair arguments in any order as Name1,Value1,…,NameN,ValueN.

Input Arguments

ens

A classification bagged ensemble, constructed with fitensemble.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'learners'

Indices of weak learners in the ensemble ranging from 1 to NumTrained. oobLoss uses only these learners for calculating loss.

Default: 1:NumTrained
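
For instance, the following sketch (not from the original page) assumes ens is a trained ClassificationBaggedEnsemble with at least 50 trees, and restricts the loss estimate to the first 50 weak learners:

% Assumes ens is a ClassificationBaggedEnsemble with 50 or more trees.
% Estimate the out-of-bag loss using only the first 50 weak learners.
L50 = oobLoss(ens,'learners',1:50)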

'lossfun'

Function handle or string representing a loss function. The built-in loss functions are 'binodeviance', 'classiferror', 'exponential', 'hinge', and 'mincost'.

You can write your own loss function in the syntax described in Loss Functions.

Default: 'classiferror'

'mode'

String representing the meaning of the output L:

  • 'ensemble' — L is a scalar value, the loss for the entire ensemble.

  • 'individual' — L is a vector with one element per trained learner.

  • 'cumulative' — L is a vector in which element J is obtained by using learners 1:J from the input list of learners.

Default: 'ensemble'
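
As an illustration of the three modes, assuming ens is a bagged classification ensemble trained with 100 trees (a sketch, not part of the reference example):

% ens is assumed to be a ClassificationBaggedEnsemble with 100 trees.
Lens = oobLoss(ens,'mode','ensemble')     % scalar: loss of the whole ensemble
Lind = oobLoss(ens,'mode','individual');  % 100-by-1: loss of each tree alone
Lcum = oobLoss(ens,'mode','cumulative');  % 100-by-1: element J uses trees 1:J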

Output Arguments

L

Classification error of the out-of-bag observations, a scalar by default. Depending on the name-value settings, L can be a vector or can represent a different quantity.

Definitions

Out of Bag

Bagging, which stands for "bootstrap aggregation", is a type of ensemble learning. To bag a weak learner such as a decision tree on a dataset, fitensemble generates many bootstrap replicas of the dataset and grows decision trees on these replicas. fitensemble obtains each bootstrap replica by randomly selecting N observations out of N with replacement, where N is the dataset size. To find the predicted response of a trained ensemble, predict takes an average over predictions from individual trees.

Drawing N out of N observations with replacement omits on average 37% (1/e) of observations for each decision tree. These are "out-of-bag" observations. For each observation, oobLoss estimates the out-of-bag prediction by averaging over predictions from all trees in the ensemble for which this observation is out of bag. It then compares the computed prediction against the true response for this observation. It calculates the out-of-bag error by comparing the out-of-bag predicted responses against the true responses for all observations used for training. This out-of-bag average is an unbiased estimator of the true ensemble error.
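
The sketch below (illustrative only, not part of the documentation) shows the roughly 1/e out-of-bag fraction by drawing one bootstrap replica of the observation indices:

% Draw N observation indices out of N with replacement (one bootstrap replica)
% and measure the fraction of original observations that never appear.
N = 10000;                                % hypothetical dataset size
idx = randi(N,N,1);                       % indices in the bootstrap replica
oobFraction = mean(~ismember(1:N,idx))    % close to exp(-1), about 0.37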

Loss Functions

The built-in loss functions are:

  • 'binodeviance' — For binary classification, assume the classes yn are -1 and 1. With weight vector w normalized to have sum 1, and predictions of row n of data X as f(Xn), the binomial deviance is

    $\sum_n w_n \log\bigl(1 + \exp(-2 y_n f(X_n))\bigr).$

  • 'classiferror' — Fraction of misclassified data, weighted by w.

  • 'exponential' — With the same definitions as for 'binodeviance', the exponential loss is

    $\sum_n w_n \exp\bigl(-y_n f(X_n)\bigr).$

  • 'hinge' — Classification error measure that has the form

    $L = \dfrac{\sum_{j=1}^{n} w_j \max\{0,\, 1 - y_j' f(X_j)\}}{\sum_{j=1}^{n} w_j},$

    where:

    • wj is weight j.

    • For binary classification, yj = 1 for the positive class and -1 for the negative class. For problems where the number of classes K ≥ 3, yj is a vector of 0s, but with a 1 in the position corresponding to the true class, e.g., if the second observation is in the third class and K = 4, then y2 = [0 0 1 0]′.

    • f(Xj) is, for binary classification, the posterior probability or, for K ≥ 3, a vector of posterior probabilities for each class given observation j.

  • 'mincost' — Predict the label with the smallest expected misclassification cost, with expectation taken over the posterior probability, and cost as given by the Cost property of the classifier (a matrix). The loss is then the true misclassification cost averaged over the observations.
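
For example, assuming ens is a trained bagged classification ensemble, you can select one of the built-in loss functions by name (a sketch):

% Out-of-bag loss measured with the hinge loss instead of the
% default classification error.
Lhinge = oobLoss(ens,'lossfun','hinge')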

To write your own loss function, create a function file of the form

function loss = lossfun(C,S,W,COST)

where:

  • N is the number of rows of ens.X.

  • K is the number of classes in ens, represented in ens.ClassNames.

  • C is an N-by-K logical matrix, with one true per row for the true class. The index for each class is its position in ens.ClassNames.

  • S is an N-by-K numeric matrix of class posterior probabilities, with one row per observation, similar to the posterior output from predict.

  • W is a numeric vector with N elements, the observation weights.

  • COST is a K-by-K numeric matrix of misclassification costs. The default 'classiferror' loss uses a cost of 0 for correct classification and 1 for misclassification. In other words, 'classiferror' uses COST = ones(K)-eye(K).

  • The output loss should be a scalar.

Pass the function handle @lossfun as the value of the 'lossfun' name-value pair.
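
As a sketch of this signature, the hypothetical function below (the name mycost and its body are illustrative, not part of the product) computes a weighted average misclassification cost:

function loss = mycost(C,S,W,COST)
% C    - N-by-K logical matrix marking the true class of each observation
% S    - N-by-K matrix of class posterior probabilities
% W    - N-element vector of observation weights
% COST - K-by-K misclassification cost matrix
[~,trueClass] = max(C,[],2);              % column index of the true class
[~,predClass] = max(S,[],2);              % predict the highest-posterior class
c = COST(sub2ind(size(COST),trueClass,predClass));  % cost incurred per observation
loss = sum(W(:).*c)/sum(W);               % weighted average cost
end

You could then call, for example, oobLoss(ens,'lossfun',@mycost).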

Examples

Find the out-of-bag error for a bagged ensemble from the Fisher iris data:

load fisheriris
ens = fitensemble(meas,species,'Bag',100,...
    'Tree','type','classification');
L = oobLoss(ens)

L =
    0.0467
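
As a follow-up sketch (not part of the original example), you can examine how the out-of-bag error changes as trees are added to the ensemble trained above:

% Cumulative out-of-bag error as a function of the number of trees.
Lcum = oobLoss(ens,'mode','cumulative');
plot(Lcum)
xlabel('Number of trees')
ylabel('Out-of-bag classification error')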