M = margin(ens,X,Y)
M = margin(ens,X,Y,Name,Value)
Matrix of data to classify. Each row of X represents one observation, and each column represents one predictor. X must have the same number of columns as the predictor data used to train ens, and the same number of rows as the number of elements in Y.
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
Indices of weak learners in the ensemble, ranging from 1 to ens.NumTrained. margin uses only these learners for calculating the margins.
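For example, the following sketch restricts the margin calculation to a subset of the ensemble (the variables ens, X, and Y are assumed to hold a trained ensemble and its validation data):

```
% Compute margins using only the first 50 weak learners
% (assumes ens was trained with at least 50 learners)
M50 = margin(ens,X,Y,'learners',1:50);
```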
A logical matrix of size N-by-T, where:
N is the number of rows (observations) in X.
T is the number of weak learners in ens.
When UseObsForLearner(i,j) is true, learner j is used in predicting the class of row i of X.
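As an illustrative sketch, you can build such a mask to exclude particular learners for particular observations (the variable names here are assumptions, not part of the API):

```
% Logical mask: observation i uses learner j when useObs(i,j) is true
N = size(X,1);
T = ens.NumTrained;
useObs = true(N,T);            % start with all learners for all rows
useObs(1,1:10) = false;        % exclude learners 1-10 for the first observation
M = margin(ens,X,Y,'UseObsForLearner',useObs);
```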
A numeric column vector with the same number of rows as X. Each row of M gives the classification margin for that row of X.
The classification margin is the difference between the classification score for the true class and the maximal classification score for the false classes. M is a column vector with the same number of rows as the matrix X.
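This definition can be checked directly against the per-class scores that predict returns. The following is a hedged sketch (assuming Y is a cell array of labels drawn from ens.ClassNames); Mmanual should match the output of margin:

```
% Recover margins from the N-by-K score matrix returned by predict
[~,scores] = predict(ens,X);
[~,trueIdx] = ismember(Y,ens.ClassNames);   % column of each true class
n = size(X,1);
linIdx = sub2ind(size(scores),(1:n)',trueIdx);
trueScore = scores(linIdx);                 % score of the true class
other = scores;
other(linIdx) = -Inf;                       % mask out the true class
Mmanual = trueScore - max(other,[],2);      % margin per observation
```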
For ensembles, a classification score represents the confidence of a classification into a class. The higher the score, the higher the confidence.
Different ensemble algorithms have different definitions for their scores. Furthermore, the range of scores depends on ensemble type. For example:
AdaBoostM1 scores range from –∞ to ∞.
Bag scores range from 0 to 1.
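To see the difference in score ranges, you can fit both ensemble types to the same data and inspect the scores; this sketch assumes the Fisher iris data used in the example below:

```
load fisheriris
% Bagged trees: scores are class posterior estimates in [0,1]
bagEns = fitensemble(meas,species,'Bag',100,'Tree','Type','classification');
[~,bagScores] = predict(bagEns,meas);
% AdaBoost: scores are unbounded weighted votes
adaEns = fitensemble(meas,species,'AdaBoostM2',100,'Tree');
[~,adaScores] = predict(adaEns,meas);
```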
Find the margin for classifying an average flower from the Fisher iris data as 'versicolor':

load fisheriris % X = meas, Y = species
ens = fitensemble(meas,species,'AdaBoostM2',100,'Tree');
flower = mean(meas);
predict(ens,flower)

ans =
    'versicolor'

margin(ens,mean(meas),'versicolor')

ans =
    3.2140