fitPosterior

Class: CompactClassificationSVM

Fit posterior probabilities

Syntax

  • ScoreSVMModel = fitPosterior(SVMModel,X,Y)
  • [ScoreSVMModel,ScoreTransform] = fitPosterior(SVMModel,X,Y)

Description

ScoreSVMModel = fitPosterior(SVMModel,X,Y) returns a trained support vector machine (SVM) classifier ScoreSVMModel containing the optimal score-to-posterior-probability transformation function for two-class learning.

The software fits the appropriate score-to-posterior-probability transformation function using the SVM classifier SVMModel and the supplied predictor data X and class labels Y, as outlined in [1]. The transformation function computes the posterior probability that an observation is classified into the positive class (SVMModel.ClassNames(2)).

  • If the classes are inseparable, then the transformation function is the sigmoid function.

  • If the classes are perfectly separable, then the transformation function is the step function.

  • In two-class learning, if one of the two classes has a relative frequency of 0, then the transformation function is the constant function. fitPosterior is not appropriate for one-class learning.

  • The software stores the optimal score-to-posterior-probability transformation function in ScoreSVMModel.ScoreTransform.

[ScoreSVMModel,ScoreTransform] = fitPosterior(SVMModel,X,Y) additionally returns the optimal score-to-posterior-probability transformation function parameters (ScoreTransform).
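
A minimal sketch of the two-output syntax, using the ionosphere data that also appears in the examples below; the 300-observation training split is only a hypothetical illustration:

load ionosphere                                   % predictor matrix X and class labels Y
trainRows = 1:300;                                % hypothetical split: rows used for training
newRows   = 301:size(X,1);                        % held-out rows used to fit the transformation
CSVMModel = compact(fitcsvm(X(trainRows,:),Y(trainRows),'Standardize',true));
[ScoreSVMModel,ScoreTransform] = fitPosterior(CSVMModel,X(newRows,:),Y(newRows));
ScoreTransform.Type                               % 'sigmoid', 'step', or 'constant'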

Tips

Here is one way to predict positive class posterior probabilities; a code sketch of this workflow follows the steps.

  1. Train an SVM classifier by passing the data to fitcsvm. The result is a trained SVM classifier, such as SVMModel, that stores the data. The software sets the score transformation function property (SVMModel.ScoreTransform) to 'none'.

  2. Pass the trained SVM classifier SVMModel to fitSVMPosterior or fitPosterior. The result, for example ScoreSVMModel, is the same trained SVM classifier as SVMModel, except that the software sets ScoreSVMModel.ScoreTransform to the optimal score transformation function.

    If you skip step 2, then predict returns the positive class score rather than the positive class posterior probability.

  3. Pass the trained SVM classifier containing the optimal score transformation function (ScoreSVMModel) and predictor data matrix to predict. The second column of the second output argument stores the positive class posterior probabilities corresponding to each row of the predictor data matrix.
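
The following sketch strings the three steps together, using the ionosphere data for illustration; the variable names mirror those used in the steps above.

load ionosphere                                   % predictor matrix X and class labels Y ('b' and 'g')

% Step 1: train an SVM classifier; SVMModel.ScoreTransform is 'none' at this point
SVMModel = fitcsvm(X,Y,'ClassNames',{'b','g'},'Standardize',true);

% Step 2: replace the score transform with the fitted posterior transformation
ScoreSVMModel = fitPosterior(SVMModel);           % full classifier, so no extra data is required

% Step 3: column 2 of the second predict output holds the positive class ('g') posteriors
[~,postProbs] = predict(ScoreSVMModel,X);
postProbs(1:5,2)                                  % posterior probabilities for the first 5 rows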

Input Arguments

SVMModel — Trained, compact SVM classifier
CompactClassificationSVM classifier

Trained, compact SVM classifier, specified as a CompactClassificationSVM classifier.

X — Predictor data
matrix

Predictor data used to estimate the score-to-posterior-probability transformation function, specified as a matrix.

Each row of X corresponds to one observation (also known as an instance or example), and each column corresponds to one variable (also known as a feature).

The length of Y and the number of rows of X must be equal.

If you set 'Standardize',true in fitcsvm to train SVMModel, then the software standardizes the columns of X using the corresponding means in SVMModel.Mu and standard deviations in SVMModel.Sigma. If the software fits the transformation function using standardized data, then the parameter estimates can differ from those obtained using unstandardized data.

Data Types: double | single

Y — Class labels
categorical array | character array | logical vector | vector of numeric values | cell array of strings

Class labels used to estimate the score-to-posterior-probability transformation function, specified as a categorical or character array, logical or numeric vector, or cell array of strings.

If Y is a character array, then each row must correspond to one class label.

The length of Y and the number of rows of X must be equal.

Output Arguments

ScoreSVMModel — Trained, compact SVM classifier
CompactClassificationSVM classifier

Trained, compact SVM classifier containing the estimated score-to-posterior-probability transformation function, returned as a CompactClassificationSVM classifier.

To estimate posterior probabilities, pass ScoreSVMModel and predictor data to predict. If you set 'Standardize',true in fitcsvm to train SVMModel, then predict standardizes the columns of X using the corresponding means in SVMModel.Mu and standard deviations in SVMModel.Sigma.

ScoreTransform — Optimal score transformation function parameters
structure array

Optimal score-to-posterior-probability transformation function parameters, returned as a structure array. The fields depend on the type of transformation that the software fits, as described in the following list; a short sketch of reading these fields appears after the list.

  • If field Type is sigmoid, then ScoreTransform has the following other fields:

    • Slope: The value of A in the sigmoid function

    • Intercept: The value of B in the sigmoid function

  • If field Type is step, then ScoreTransform has the following other fields:

    • PositiveClassProbability: The value of π in the step function. It represents the probability that an observation is in the positive class, and also the posterior probability that an observation is in the positive class given that its score is in the interval (LowerBound,UpperBound).

    • LowerBound: The value $\max_{y_n = -1} s_n$ in the step function. It is the lower bound of the score interval within which observations are assigned the positive class posterior probability PositiveClassProbability. Any observation with a score less than LowerBound has a positive class posterior probability of 0.

    • UpperBound: The value $\min_{y_n = +1} s_n$ in the step function. It is the upper bound of the score interval within which observations are assigned the positive class posterior probability PositiveClassProbability. Any observation with a score greater than UpperBound has a positive class posterior probability of 1.

  • If field Type is constant, then ScoreTransform.PredictedClass contains the name of the class prediction.

    This result is the same as SVMModel.ClassNames. The posterior probability of an observation being in ScoreTransform.PredictedClass is always 1.
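
    As an illustration, this sketch branches on the Type field to read the parameters described above; ScoreParameters stands for the second output of fitPosterior.

    switch ScoreParameters.Type
        case 'sigmoid'    % inseparable classes
            A = ScoreParameters.Slope;
            B = ScoreParameters.Intercept;
        case 'step'       % perfectly separable classes
            lb = ScoreParameters.LowerBound;
            ub = ScoreParameters.UpperBound;
            p  = ScoreParameters.PositiveClassProbability;
        case 'constant'   % one class has relative frequency 0
            predictedClass = ScoreParameters.PredictedClass;
    end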

    Definitions

    Sigmoid Function

    The sigmoid function that maps the score $s_j$ corresponding to observation j to the positive class posterior probability is

    P(s_j) = \frac{1}{1 + \exp(A s_j + B)}.

    If the output argument ScoreTransform.Type is sigmoid, then parameters A and B correspond to the fields Slope and Intercept of ScoreTransform, respectively.
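
    For illustration, a minimal sketch that applies this sigmoid to a raw positive-class score; predict performs this transformation for you, and the score value here is hypothetical.

    s = 0.5;                                  % hypothetical raw positive-class score
    A = ScoreParameters.Slope;                % slope A estimated by fitPosterior
    B = ScoreParameters.Intercept;            % intercept B estimated by fitPosterior
    posterior = 1./(1 + exp(A*s + B))         % positive class posterior probability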

    Step Function

    The step function that maps the score $s_j$ corresponding to observation j to the positive class posterior probability is

    P(s_j) =
    \begin{cases}
      0,   & s_j < \max_{y_k = -1} s_k \\
      \pi, & \max_{y_k = -1} s_k \le s_j \le \min_{y_k = +1} s_k \\
      1,   & s_j > \min_{y_k = +1} s_k,
    \end{cases}

    where:

    • $s_j$ is the score of observation j.

    • +1 and –1 denote the positive and negative classes, respectively.

    • π is the prior probability that an observation is in the positive class.

    If the output argument ScoreTransform.Type is step, then the quantities $\max_{y_k = -1} s_k$ and $\min_{y_k = +1} s_k$ correspond to the fields LowerBound and UpperBound of ScoreTransform, respectively.
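
    For illustration, a minimal sketch that applies this step function to a raw positive-class score; predict performs this transformation for you, and the score value here is hypothetical.

    s  = 0.5;                                 % hypothetical raw positive-class score
    lb = ScoreParameters.LowerBound;          % maximum score over the negative class
    ub = ScoreParameters.UpperBound;          % minimum score over the positive class
    if s < lb
        posterior = 0;
    elseif s > ub
        posterior = 1;
    else
        posterior = ScoreParameters.PositiveClassProbability;
    end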

    Constant Function

    The constant function maps all scores in a sample to posterior probabilities 1 or 0.

    If all observations have posterior probability 1, then they are expected to come from the positive class.

    If all observations have posterior probability 0, then they are not expected to come from the positive class.

    Examples

    Estimate Posterior Probabilities for New Data When Classes Are Inseparable

    Load the ionosphere data set. Reserve 20 random observations of the data, and consider this set new data.

    load ionosphere
    n = size(X,1);
    rng(1);  % For reproducibility
    
    indx = ~ismember([1:n],randsample(n,20)); % Indices for the training data
    

    The classes of this data set are inseparable.

    Train an SVM classifier using the training data. It is good practice to specify the class order and standardize the data.

    SVMModel = fitcsvm(X(indx,:),Y(indx),'ClassNames',{'b','g'},...
        'Standardize',true);
    

    SVMModel is a ClassificationSVM classifier.

    Use the new data set to estimate the optimal score-to-posterior-probability transformation function for mapping scores to the posterior probability of an observation being classified as g. For efficiency, make a compact version of the SVM classifier SVMModel, and pass it and the new data to fitPosterior.

    CompactSVMModel = compact(SVMModel);
    [ScoreCSVMModel,ScoreParameters] = fitPosterior(CompactSVMModel,...
        X(~indx,:),Y(~indx));
    
    ScoreTransform = ScoreCSVMModel.ScoreTransform
    ScoreParameters
    
    ScoreTransform =
    
    @(S)sigmoid(S,-1.098769e+00,4.518289e-01)
    
    
    ScoreParameters = 
    
             Type: 'sigmoid'
            Slope: -1.0988
        Intercept: 0.4518
    
    

    ScoreTransform is the optimal score transformation function. ScoreParameters is a structure array with three fields: the score transformation function type (Type), the sigmoid slope estimate (Slope), and the sigmoid intercept estimate (Intercept).

    Alternatively, you can pass SVMModel and the new data to fitSVMPosterior, but this approach is less efficient.

    Estimate the posterior probabilities that the observations in the new data are in class g.

    [labels,postProbs] = predict(ScoreCSVMModel,X(~indx,:));
    table(Y(~indx),labels,postProbs(:,2),...
        'VariableNames',{'TrueLabel','PredictedLabel','PosteriorProbability'})
    
    ans = 
    
        TrueLabel    PredictedLabel    PosteriorProbability
        _________    ______________    ____________________
    
        'g'          'g'                  0.78438          
        'b'          'b'                 0.024604          
        'g'          'g'                  0.82398          
        'b'          'b'                0.0061732          
        'b'          'b'               3.6273e-06          
        'b'          'b'                  0.15692          
        'b'          'g'                  0.96221          
        'b'          'b'               6.1526e-09          
        'b'          'b'                0.0019667          
        'g'          'g'                  0.72507          
        'g'          'g'                  0.70261          
        'b'          'b'                 0.075319          
        'g'          'g'                  0.90688          
        'g'          'g'                  0.82845          
        'b'          'b'                 0.051213          
        'g'          'g'                  0.95329          
        'b'          'b'                0.0031914          
        'b'          'b'                  0.16019          
        'g'          'g'                  0.92007          
        'g'          'g'                  0.91348          
    
    

    Estimate Posterior Probabilities for New Data When Classes Are Separable

    Load Fisher's iris data set. Use the petal lengths and widths, and remove the virginica species from the data. Reserve 10 random observations of the data, and consider this set new data.

    load fisheriris
    classKeep = ~strcmp(species,'virginica');
    X = meas(classKeep,3:4);
    Y = species(classKeep);
    
    rng(1);  % For reproducibility
    indx1 = 1:numel(species);
    indx2 = indx1(classKeep);
    indx = ~ismember(indx2,randsample(indx2,10)); % Indices for the training data
    
    gscatter(X(indx,1),X(indx,2),Y(indx));
    title('Scatter Diagram of Iris Measurements')
    xlabel('Petal length')
    ylabel('Petal width')
    legend('Setosa','Versicolor')
    

    The classes are perfectly separable. Therefore, the score-to-posterior-probability transformation function is a step function.

    Train an SVM classifier. It is good practice to specify the class order and standardize the data.

    SVMModel = fitcsvm(X(indx,:),Y(indx),...
        'ClassNames',{'setosa','versicolor'},'Standardize',true);
    

    SVMModel is a ClassificationSVM classifier.

    Use the new data set to estimate the optimal score-to-posterior-probability transformation function for mapping scores to the posterior probability of an observation being classified as versicolor. For efficiency, make a compact version of the SVM classifier SVMModel, and pass it and the new data to fitPosterior.

    CompactSVMModel = compact(SVMModel);
    [ScoreCSVMModel,ScoreParameters] = fitPosterior(CompactSVMModel,...
        X(~indx,:),Y(~indx));
    
    ScoreTransform = ScoreCSVMModel.ScoreTransform
    
    Warning: Classes are perfectly separated. The optimal score-to-posterior
    transformation is a step function. 
    
    ScoreTransform =
    
    @(S)step(S,-1.338450e+00,2.012495e+00,5.333333e-01)
    
    

    fitPosterior displays a warning whenever the classes are separable and stores the step function in ScoreCSVMModel.ScoreTransform.

    Display the score function type and its estimated values.

    ScoreParameters
    
    ScoreParameters = 
    
                            Type: 'step'
                      LowerBound: -1.3385
                      UpperBound: 2.0125
        PositiveClassProbability: 0.5333
    
    

    ScoreParameters is a structure array having four fields:

    • The score transformation function type (Type)

    • The score corresponding to the negative class boundary (LowerBound)

    • The score corresponding to the positive class boundary (UpperBound)

    • The positive class probability (PositiveClassProbability)

    Alternatively, you can pass SVMModel and the new data to fitSVMPosterior, but this approach is less efficient.

    Estimate the posterior probabilities that the observations in the new data are versicolor irises.

    [labels,postProbs] = predict(ScoreCSVMModel,X(~indx,:));
    table(Y(~indx),labels,postProbs(:,2),...
        'VariableNames',{'TrueLabel','PredictedLabel','PosteriorProbability'})
    
    ans = 
    
         TrueLabel      PredictedLabel    PosteriorProbability
        ____________    ______________    ____________________
    
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'setosa'        'setosa'          0                   
        'versicolor'    'versicolor'      1                   
        'versicolor'    'versicolor'      1                   
    
    

    Since the classes are separable, the step function transforms the positive-class score to:

    • 0, if the score is less than ScoreParameters.LowerBound

    • 1, if the score is greater than ScoreParameters.UpperBound

    • ScoreParameters.PositiveClassProbability, if the score is in the interval [ScoreParameters.LowerBound, ScoreParameters.UpperBound]

    Algorithms

    If you reestimate the score-to-posterior-probability transformation function, that is, if you pass to fitPosterior or fitSVMPosterior an SVM classifier whose ScoreTransform property is not 'none', then the software does the following (as shown in the sketch after this list):

    • Displays a warning

    • Resets the original transformation function to 'none' before estimating the new one
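
    A minimal sketch of this behavior, reusing the ionosphere data: the second call to fitPosterior issues a warning, resets ScoreTransform to 'none', and then estimates a new transformation.

    load ionosphere
    SVMModel      = fitcsvm(X,Y,'Standardize',true);
    ScoreSVMModel = fitPosterior(SVMModel);      % sets ScoreTransform (a sigmoid for this data)
    ScoreSVMModel = fitPosterior(ScoreSVMModel); % warns, resets to 'none', then refits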

    Alternatives

    You can also estimate the optimal score-to-posterior-probability transformation function using fitSVMPosterior. This function is similar to fitPosterior, except that it is broader in scope because it accepts a wider range of SVM classifier types.
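
    For example, using the variables from the first example above, the following call passes the full classifier SVMModel and the held-out data directly to fitSVMPosterior; the result is equivalent to the compact-model workflow in that example, though less efficient.

    [ScoreSVMModel,ScoreParameters] = fitSVMPosterior(SVMModel,X(~indx,:),Y(~indx));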

    References

    [1] Platt, J. "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods." In: Advances in Large Margin Classifiers. Cambridge, MA: The MIT Press, 2000, pp. 61–74.
