resume

Class: ClassificationSVM

Resume training support vector machine classifier

Syntax

  • UpdatedSVMModel = resume(SVMModel,numIter)
  • UpdatedSVMModel = resume(SVMModel,numIter,Name,Value)

Description

UpdatedSVMModel = resume(SVMModel,numIter) returns an updated support vector machine (SVM) classifier (UpdatedSVMModel) by training the support vector machine classifier SVMModel for numIter more iterations.

resume continues applying the training options that you set for fitcsvm to train SVMModel.

UpdatedSVMModel = resume(SVMModel,numIter,Name,Value) returns an updated support vector machine classifier (UpdatedSVMModel) with additional options specified by one or more Name,Value pair arguments.

Tips

If optimization has not converged and the solver is 'SMO' or 'ISDA', then try to resume training the SVM classifier.
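As a sketch of this tip, one way to resume training until the solver converges is to check the Converged field of the model's ConvergenceInfo property in a loop. This example assumes the ionosphere data set that ships with Statistics and Machine Learning Toolbox; the iteration counts are illustrative, not recommended values.

```matlab
% Train with a deliberately small iteration limit, then resume
% until the solver converges (capped at a few resume calls).
load ionosphere                                  % predictors X, labels Y
SVMModel = fitcsvm(X,Y,'IterationLimit',100);    % may stop before converging

maxResumes = 10;                                 % safety cap (illustrative)
for k = 1:maxResumes
    if SVMModel.ConvergenceInfo.Converged
        break                                    % solver reached a solution
    end
    SVMModel = resume(SVMModel,500);             % run 500 more iterations
end
```

Because resume reuses the training options you passed to fitcsvm, no options other than the extra iteration count are needed here.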

Input Arguments

SVMModel — Full, trained SVM classifier
ClassificationSVM classifier

Full, trained SVM classifier, specified as a ClassificationSVM model trained using fitcsvm.

numIter — Number of iterations
Positive integer

Number of iterations to continue training the SVM classifier, specified as a positive integer.

Data Types: double

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'Verbose' — Verbosity level
0 | 1 | 2

Verbosity level, specified as the comma-separated pair consisting of 'Verbose' and either 0, 1, or 2. Verbose controls the amount of optimization information that the software displays to the Command Window and is saved as a structure to SVMModel.ConvergenceInfo.History.

This table summarizes the available verbosity level options.

Value | Description
------|------------
0     | The software does not display or save convergence information.
1     | The software displays diagnostic messages and saves convergence criteria every numprint iterations, where numprint is the value of the name-value pair argument 'NumPrint'.
2     | The software displays diagnostic messages and saves convergence criteria at every iteration.

By default, Verbose is the value that fitcsvm used to train SVMModel.

Example: 'Verbose',1

Data Types: double

'NumPrint' — Number of iterations between diagnostic message printouts
Nonnegative integer

Number of iterations between diagnostic message printouts, specified as the comma-separated pair consisting of 'NumPrint' and a nonnegative integer.

If you set 'Verbose',1 and 'NumPrint',numprint, then the software displays all optimization diagnostic messages from SMO [1] and ISDA [2] every numprint iterations to the Command Window.

By default, NumPrint is the value that fitcsvm used to train SVMModel.

Example: 'NumPrint',500

Data Types: double

Output Arguments

UpdatedSVMModel — Updated SVM classifier
ClassificationSVM classifier

Updated SVM classifier, returned as a ClassificationSVM classifier.

Examples

Resume Training an SVM Classifier

If you trained an SVM classifier, and the solver failed to converge onto a solution, then you can resume training the classifier without having to restart the entire learning process.

Load the ionosphere data set.

load ionosphere
rng(1); % For reproducibility

Train an SVM classifier. For illustration, specify that the optimization routine uses at most 50 iterations.

SVMModel = fitcsvm(X,Y,'IterationLimit',50);
DidConverge = SVMModel.ConvergenceInfo.Converged
Reason = SVMModel.ConvergenceInfo.ReasonForConvergence
DidConverge =

     0


Reason =

NoConvergence

DidConverge = 0 indicates that the optimization routine did not converge onto a solution. Reason states the reason why the routine did not converge. Therefore, SVMModel is a partially trained SVM classifier.

Resume training the SVM classifier for another 1500 iterations.

UpdatedSVMModel = resume(SVMModel,1500);
DidConverge = UpdatedSVMModel.ConvergenceInfo.Converged
Reason = UpdatedSVMModel.ConvergenceInfo.ReasonForConvergence
DidConverge =

     1


Reason =

DeltaGradient

DidConverge = 1 indicates that the optimization routine converged onto a solution. Reason indicates that the gradient difference (DeltaGradient) reached its tolerance level (DeltaGradientTolerance). Therefore, UpdatedSVMModel is a fully trained SVM classifier.

Monitor Training of an SVM Classifier

Load the ionosphere data set.

load ionosphere

Train an SVM classifier. For illustration, specify that the optimization routine uses at most 100 iterations. Monitor the algorithm by specifying that the software print diagnostic information every 50 iterations.

SVMModel = fitcsvm(X,Y,'IterationLimit',100,'Verbose',1,'NumPrint',50);
|===================================================================================================================|
|   Iteration  | Set  |   Set Size   |  Feasibility  |     Delta     |      KKT      |  Number of   |   Objective   |
|              |      |              |      Gap      |    Gradient   |   Violation   |  Supp. Vec.  |               |
|===================================================================================================================|
|            0 |active|          351 |  9.971591e-01 |  2.000000e+00 |  1.000000e+00 |            0 |  0.000000e+00 |
|           50 |active|          351 |  8.214794e-01 |  3.736929e+00 |  3.801461e+00 |           60 | -3.628863e+01 |

SVM optimization did not converge to the required tolerance.

The software prints an iterative display to the Command Window. The printout indicates that the optimization routine has not converged onto a solution.

Estimate the resubstitution loss of the partially trained SVM classifier.

partialLoss = resubLoss(SVMModel)
partialLoss =

    0.1054

The training sample misclassification error is approximately 11%.

Resume training the classifier for another 1500 iterations. Specify that the software prints diagnostic information every 250 iterations.

UpdatedSVMModel = resume(SVMModel,1500,'NumPrint',250)
|===================================================================================================================|
|   Iteration  | Set  |   Set Size   |  Feasibility  |     Delta     |      KKT      |  Number of   |   Objective   |
|              |      |              |      Gap      |    Gradient   |   Violation   |  Supp. Vec.  |               |
|===================================================================================================================|
|          250 |active|          351 |  7.916242e-01 |  1.688486e+00 |  4.751099e+00 |          100 | -7.654307e+01 |
|          500 |active|          351 |  7.984468e-01 |  8.478520e-02 |  3.906963e+00 |          102 | -7.819410e+01 |
|          750 |active|          351 |  7.990499e-01 |  3.149635e-02 |  3.892920e+00 |          103 | -7.820919e+01 |
|         1000 |active|          351 |  7.995528e-01 |  2.504696e-03 |  3.883938e+00 |          103 | -7.820958e+01 |
|         1100 |active|          351 |  7.995659e-01 |  9.773632e-04 |  3.883189e+00 |          103 | -7.820959e+01 |

 Exiting Active Set upon convergence due to DeltaGradient.

UpdatedSVMModel = 

  ClassificationSVM
      PredictorNames: {1x34 cell}
        ResponseName: 'Y'
          ClassNames: {'b'  'g'}
      ScoreTransform: 'none'
     NumObservations: 351
               Alpha: [103x1 double]
                Bias: -3.8827
    KernelParameters: [1x1 struct]
      BoxConstraints: [351x1 double]
     ConvergenceInfo: [1x1 struct]
     IsSupportVector: [351x1 logical]
              Solver: 'SMO'


The software resumes training, prints diagnostic information every 250 iterations, and uses the same verbosity level as you set when you trained the model using fitcsvm. The printout indicates that the algorithm converged. Therefore, UpdatedSVMModel is a fully trained ClassificationSVM classifier.

updatedLoss = resubLoss(UpdatedSVMModel)
updatedLoss =

    0.0769

The training sample misclassification error of the fully trained classifier is approximately 8%.

References

[1] Fan, R.-E., P.-H. Chen, and C.-J. Lin. "Working set selection using second order information for training support vector machines." Journal of Machine Learning Research, Vol 6, 2005, pp. 1889–1918.

[2] Kecman V., T. -M. Huang, and M. Vogt. "Iterative Single Data Algorithm for Training Kernel Machines from Huge Data Sets: Theory and Performance." In Support Vector Machines: Theory and Applications. Edited by Lipo Wang, 255–274. Berlin: Springer-Verlag, 2005.
