E = edge(ens,X,Y)
E = edge(ens,X,Y,Name,Value)
A matrix where each row represents an observation, and each column represents a predictor. The number of columns in X must equal the number of predictors in ens.
Class labels, with the same data type as the labels used to train ens. The number of elements of Y must equal the number of rows of X.
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
Indices of weak learners in the ensemble ranging from 1 to ens.NumTrained. edge uses only these learners for calculating the edge.
String representing the meaning of the output E:
'ensemble' — E is a scalar value, the edge for the entire ensemble.
'individual' — E is a vector with one element per trained learner.
'cumulative' — E is a vector in which element J is obtained by using learners 1:J from the input list of learners.
A logical matrix of size N-by-T, where:
N is the number of rows (observations) in X.
T is the number of weak learners in ens.
When UseObsForLearner(i,j) is true, learner j is used in predicting the class of row i of X.
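As a sketch of how this matrix might be built (assuming a trained ensemble ens and predictor matrix X are available; the selection pattern here is purely illustrative):

```matlab
% Hypothetical sketch: use only the first half of the learners for the
% odd-numbered observations, and all learners for the rest.
N = size(X,1);                        % number of observations
T = ens.NumTrained;                   % number of weak learners
U = true(N,T);                        % start with every learner enabled
U(1:2:end, ceil(T/2)+1:end) = false;  % disable later learners on odd rows
E = edge(ens,X,Y,'UseObsForLearner',U);
```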
Observation weights, a numeric vector of length size(X,1). If you supply weights, edge computes weighted classification edge.
The classification edge, a vector or scalar depending on the setting of the mode name-value pair. The classification edge is the weighted average of the classification margins.
The classification margin is the difference between the classification score for the true class and the maximal classification score for the false classes. Margin is a column vector with the same number of rows as the matrix X.
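This per-observation quantity can be obtained directly with the ensemble's margin method; a minimal sketch, assuming a trained ensemble ens and data X, Y are in the workspace:

```matlab
% m(i) = score of the true class minus the largest false-class score
% for observation i.
m = margin(ens,X,Y);   % column vector, one margin per row of X
size(m)                % [size(X,1) 1]
```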
For ensembles, a classification score represents the confidence of a classification into a class. The higher the score, the higher the confidence.
Different ensemble algorithms have different definitions for their scores. Furthermore, the range of scores depends on ensemble type. For example:
AdaBoostM1 scores range from –∞ to ∞.
Bag scores range from 0 to 1.
The edge is the weighted mean value of the classification margin. The weights are the class probabilities in ens.Prior. If you supply weights in the weights name-value pair, those weights are used instead of class probabilities.
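The relationship between edge and margin can be sketched as follows, assuming a trained ensemble ens and a hypothetical weight vector w (here uniform for illustration):

```matlab
% Sketch: the edge is the weighted mean of the margins.
m = margin(ens,X,Y);             % per-observation classification margins
w = ones(size(X,1),1);           % uniform example observation weights
E = edge(ens,X,Y,'weights',w);   % weighted classification edge
% Up to normalization of the weights, E corresponds to:
Emanual = sum(w.*m)/sum(w);
```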
Create a boosted ensemble classifier for the ionosphere data, and find the classification edge for the last few rows:

load ionosphere
ens = fitensemble(X,Y,'AdaBoostM1',100,'Tree');
E = edge(ens,X(end-10:end,:),Y(end-10:end))

E =
    8.3310
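To see how the edge evolves as learners are added, the mode name-value pair can request one value per cumulative subset of learners. A hedged sketch, assuming the ensemble ens from the example above and that edge accepts 'mode','cumulative' as other ensemble methods do:

```matlab
% Cumulative edge over learners 1:J, for J = 1..ens.NumTrained.
Ecum = edge(ens,X(end-10:end,:),Y(end-10:end),'mode','cumulative');
plot(Ecum)
xlabel('Number of weak learners')
ylabel('Classification edge')
```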