CompactClassificationSVM class

Compact support vector machine for binary classification

Description

CompactClassificationSVM is a compact support vector machine (SVM) classifier.

The compact classifier does not include the data used to train the SVM classifier. Therefore, you cannot use the compact classifier to perform tasks that require the training data, such as cross validation.

Use a compact SVM classifier for labeling new data (i.e., predicting the class of new data).

Construction

CompactSVMModel = compact(SVMModel) returns a compact SVM classifier (CompactSVMModel) from a full, trained support vector machine classifier (SVMModel).

Input Arguments

SVMModel

A full, trained ClassificationSVM classifier trained by fitcsvm.
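
For example, this minimal sketch (assuming a predictor matrix X and class labels Y are in the workspace) trains a full classifier and then compacts it:

SVMModel = fitcsvm(X,Y);              % full classifier; retains the training data
CompactSVMModel = compact(SVMModel);  % compact classifier; training data removed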

Properties

Alpha

Numeric vector of trained classifier coefficients from the dual problem (i.e., the estimated Lagrange multipliers). Alpha has length equal to the number of support vectors in the trained classifier (i.e., sum(SVMModel.IsSupportVector)).

Beta

Numeric vector of trained classifier coefficients from the primal linear problem. Beta has length equal to the number of predictors (i.e., size(SVMModel.X,2)).

Beta = [] for one-class learning or two-class learning using a nonlinear kernel function.
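
For a linear kernel, Beta supports a primal-form score computation. The following is an illustrative sketch, not the predict implementation; xNew is a hypothetical 1-by-p observation, and if you trained with 'Standardize',true, standardize xNew using Mu and Sigma first.

score = xNew*SVMModel.Beta + SVMModel.Bias;  % a positive score indicates the second class in ClassNames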

Bias

Scalar corresponding to the trained classifier bias term.

CategoricalPredictors

List of categorical predictors, which is always empty ([]) for SVM and discriminant analysis classifiers.

ClassNames

List of elements in Y with duplicates removed. ClassNames has the same data type as the data in the argument Y, and therefore can be a categorical or character array, logical or numeric vector, or cell array of strings.

Cost

Square matrix, where Cost(i,j) is the cost of classifying a point into class j if its true class is i.

During training, the software updates the prior probabilities by incorporating the penalties described in the cost matrix. Therefore,

  • For two-class learning, Cost always has this form: Cost(i,j) = 1 if i ~= j, and Cost(i,j) = 0 if i = j.

  • For one-class learning, Cost = 0.

For more details, see Algorithms.
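
As a hedged sketch of how a cost matrix enters training (X and Y are assumed to hold predictors and labels, as in the examples below):

CostMat = [0 2; 1 0];  % classifying a true first class as the second costs twice as much
Mdl = fitcsvm(X,Y,'ClassNames',{'b','g'},'Cost',CostMat);
Mdl.Cost   % reverts to the default form; the penalty is folded into Mdl.Prior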

KernelParameters

Structure array containing the kernel name and parameter values.

To display the values of KernelParameters, use dot notation, e.g., SVMModel.KernelParameters.Scale displays the scale parameter value.

The software accepts KernelParameters as inputs, and does not modify them. Alter KernelParameters by setting the appropriate name-value pair arguments when you train the SVM classifier using fitcsvm.
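
For example, a minimal sketch of setting the kernel and its scale at training time (assuming X and Y as above):

Mdl = fitcsvm(X,Y,'KernelFunction','gaussian','KernelScale',2);
Mdl.KernelParameters.Scale   % displays 2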

Mu

Numeric vector of predictor means.

If you specify 'Standardize',1 or 'Standardize',true when you train an SVM classifier using fitcsvm, then Mu has length equal to the number of predictors (i.e., size(SVMModel.X,2)). Otherwise, Mu is an empty vector ([]).

PredictorNames

Cell array of strings containing the predictor names, in the order that they appear in X.

Prior

Numeric vector of prior probabilities for each class. The order of the elements of Prior corresponds to the elements of SVMModel.ClassNames.

For two-class learning, if you specify a cost matrix, then the software updates the prior probabilities by incorporating the penalties described in the cost matrix. For more details, see Algorithms.
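
For example, a minimal sketch of specifying prior probabilities at training time (X and Y as above):

Mdl = fitcsvm(X,Y,'ClassNames',{'b','g'},'Prior',[0.4 0.6]);
Mdl.Prior   % ordered as in Mdl.ClassNames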

ScoreTransform

String representing a built-in transformation function, or a function handle for transforming predicted classification scores.

To change the score transformation function to function, for example, use dot notation.

  • For a built-in function, enter a string.

    SVMModel.ScoreTransform = 'function';

    This table contains the available, built-in functions.

    String              Formula
    'doublelogit'       1/(1 + e^(-2x))
    'invlogit'          log(x / (1 - x))
    'ismax'             Set the score for the class with the largest score to 1, and scores for all other classes to 0.
    'logit'             1/(1 + e^(-x))
    'none'              x (no transformation)
    'sign'              -1 for x < 0; 0 for x = 0; 1 for x > 0
    'symmetric'         2x - 1
    'symmetriclogit'    2/(1 + e^(-x)) - 1
    'symmetricismax'    Set the score for the class with the largest score to 1, and scores for all other classes to -1.

  • For a MATLAB® function, or a function that you define, enter its function handle.

    SVMModel.ScoreTransform = @function;

    function should accept a matrix (the original scores) and return a matrix of the same size (the transformed scores).
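
For example, this hedged sketch assigns an anonymous function that maps scores through a logistic function:

SVMModel.ScoreTransform = @(x) 1./(1 + exp(-x));  % squashes scores into (0,1)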

Sigma

Numeric vector of predictor standard deviations.

If you specify 'Standardize',1 or 'Standardize',true when you train the SVM classifier, then Sigma has length equal to the number of predictors (i.e., size(SVMModel.X,2)). Otherwise, Sigma is an empty vector ([]).
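
As an illustrative sketch (not the internal implementation) of how Mu and Sigma relate to prediction, where XNew is a hypothetical n-by-p matrix of new observations:

% Standardize new observations the same way the training data were standardized
XNewStd = bsxfun(@rdivide, bsxfun(@minus, XNew, SVMModel.Mu), SVMModel.Sigma);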

SupportVectors

Matrix containing rows of X that the software considers the support vectors.

If you specify 'Standardize',1 or 'Standardize',true, then SupportVectors are the standardized rows of X.

SupportVectorLabels

Numeric vector of support vector class labels. SupportVectorLabels has length equal to the number of support vectors (i.e., sum(SVMModel.IsSupportVector)).

+1 indicates that the corresponding support vector is in the positive class (SVMModel.ClassNames{2}). -1 indicates that the corresponding support vector is in the negative class (SVMModel.ClassNames{1}).
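
Together, Alpha, SupportVectors, SupportVectorLabels, and Bias determine the classification score. The following hedged sketch mirrors, but is not, the predict implementation; it assumes a Gaussian kernel, training with 'Standardize',true, and a hypothetical 1-by-p observation xNew.

s  = SVMModel.KernelParameters.Scale;
xs = (xNew - SVMModel.Mu)./SVMModel.Sigma;      % standardize as in training
SV = SVMModel.SupportVectors;                   % already standardized rows
d2 = sum(bsxfun(@minus,SV,xs).^2, 2);           % squared distances to each support vector
K  = exp(-d2/s^2);                              % Gaussian kernel evaluations
score = sum(SVMModel.Alpha.*SVMModel.SupportVectorLabels.*K) + SVMModel.Bias;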

Methods

edge            Classification edge
fitPosterior    Fit posterior probabilities
loss            Classification error
margin          Classification margins
predict         Predict classification

Copy Semantics

Value. To learn how value classes affect copy operations, see Copying Objects in the MATLAB documentation.

Examples

Reduce the Size of Support Vector Machine Classifiers

Full SVM classifiers (i.e., ClassificationSVM classifiers) hold the training data, so they can consume substantial memory. For efficiency, you might not want to use such a large classifier just to predict new labels. This example shows how to reduce the size of a full SVM classifier.

Load the ionosphere data set.

load ionosphere

Train an SVM classifier. It is good practice to standardize the predictors and specify the order of the classes.

SVMModel = fitcsvm(X,Y,'Standardize',true,...
    'ClassNames',{'b','g'})
SVMModel = 

  ClassificationSVM
      PredictorNames: {1x34 cell}
        ResponseName: 'Y'
          ClassNames: {'b'  'g'}
      ScoreTransform: 'none'
     NumObservations: 351
               Alpha: [89x1 double]
                Bias: -0.1341
    KernelParameters: [1x1 struct]
                  Mu: [1x34 double]
               Sigma: [1x34 double]
      BoxConstraints: [351x1 double]
     ConvergenceInfo: [1x1 struct]
     IsSupportVector: [351x1 logical]
              Solver: 'SMO'


SVMModel is a ClassificationSVM classifier.

Reduce the size of the SVM classifier.

CompactSVMModel = compact(SVMModel)
CompactSVMModel = 

  classreg.learning.classif.CompactClassificationSVM
         PredictorNames: {1x34 cell}
           ResponseName: 'Y'
             ClassNames: {'b'  'g'}
         ScoreTransform: 'none'
                  Alpha: [89x1 double]
                   Bias: -0.1341
       KernelParameters: [1x1 struct]
                     Mu: [1x34 double]
                  Sigma: [1x34 double]
         SupportVectors: [89x34 double]
    SupportVectorLabels: [89x1 double]


CompactSVMModel is a CompactClassificationSVM classifier.

Display how much memory each classifier uses.

whos('SVMModel','CompactSVMModel')
  Name                 Size             Bytes  Class                                                 Attributes

  CompactSVMModel      1x1              29850  classreg.learning.classif.CompactClassificationSVM              
  SVMModel             1x1             140827  ClassificationSVM                                               

The full SVM classifier (SVMModel) is more than four times the size of the compact SVM classifier (CompactSVMModel).

You can remove SVMModel from the MATLAB® Workspace, and pass CompactSVMModel and new predictor values to predict to efficiently label new observations.
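
For example, a minimal sketch that reuses a few rows of X as stand-in new observations:

[labels,scores] = predict(CompactSVMModel,X(1:5,:));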

Train and Cross Validate Support Vector Machine Classifiers

Load the ionosphere data set.

load ionosphere

Train and cross validate an SVM classifier. It is good practice to standardize the predictors and specify the order of the classes.

rng(1);  % For reproducibility
CVSVMModel = fitcsvm(X,Y,'Standardize',true,...
    'ClassNames',{'b','g'},'CrossVal','on')
CVSVMModel = 

  classreg.learning.partition.ClassificationPartitionedModel
      CrossValidatedModel: 'SVM'
           PredictorNames: {1x34 cell}
    CategoricalPredictors: []
             ResponseName: 'Y'
          NumObservations: 351
                    KFold: 10
                Partition: [1x1 cvpartition]
               ClassNames: {'b'  'g'}
           ScoreTransform: 'none'


CVSVMModel is not a ClassificationSVM classifier, but a ClassificationPartitionedModel cross-validated SVM classifier. By default, the software implements 10-fold cross validation.

Alternatively, you can cross validate a trained ClassificationSVM classifier by passing it to the crossval method.
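
For example, a minimal sketch of that alternative workflow:

SVMModel = fitcsvm(X,Y,'Standardize',true,'ClassNames',{'b','g'});
CVSVMModel = crossval(SVMModel);   % 10-fold cross validation by default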

Inspect one of the trained folds using dot notation.

CVSVMModel.Trained{1}
ans = 

  classreg.learning.classif.CompactClassificationSVM
         PredictorNames: {1x34 cell}
           ResponseName: 'Y'
             ClassNames: {'b'  'g'}
         ScoreTransform: 'none'
                  Alpha: [78x1 double]
                   Bias: -0.2209
       KernelParameters: [1x1 struct]
                     Mu: [1x34 double]
                  Sigma: [1x34 double]
         SupportVectors: [78x34 double]
    SupportVectorLabels: [78x1 double]


Each fold is a CompactClassificationSVM classifier trained on 90% of the data.

Estimate the generalization error.

genError = kfoldLoss(CVSVMModel)
genError =

    0.1168

On average, the generalization error is approximately 12%.

