svmclassify - (Removed) Classify using support vector machine (SVM)
Group = svmclassify(SVMStruct,Sample)
Group = svmclassify(SVMStruct,Sample,'Showplot',true)
Group = svmclassify(SVMStruct,Sample) classifies each row of the data in
Sample, a matrix of data, using the
information in a support vector machine classifier structure SVMStruct,
created using the
svmtrain function. Like the training data used to create SVMStruct,
Sample is a matrix where each row
corresponds to an observation or replicate, and each column corresponds to a feature or
variable. Therefore, Sample must have the same number of columns as the
training data, because the number of columns defines the number of features.
Group indicates the group to which each row of Sample
has been assigned.
Group = svmclassify(SVMStruct,Sample,'Showplot',true) plots the
Sample data in the figure created using the
Showplot property with the
svmtrain function. This
plot appears only when the data is two-dimensional.
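For illustration, a minimal sketch of this syntax on two-dimensional data; the fisheriris data set and the feature selection below are assumptions for the example, not part of this page, and the code runs only in releases before R2018a, where these functions still exist:

```matlab
% Sketch of the removed svmtrain/svmclassify workflow on 2-D data.
load fisheriris                      % example data set; an assumption here
X = meas(51:end,1:2);                % two features -> 2-D, so the plot displays
Y = species(51:end);                 % two classes: versicolor, virginica
SVMStruct = svmtrain(X,Y,'Showplot',true);                    % training plot
Group = svmclassify(SVMStruct,[5 2; 6.5 3],'Showplot',true);  % overlay samples
```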
SVMStruct - Support vector machine classifier structure created using the
svmtrain function.
Sample - A matrix where each row corresponds to an observation or replicate, and each column
corresponds to a feature or variable. Therefore, Sample must have the same
number of columns as the training data.
Showplot - Describes whether to display a plot of the classification. Displays only for 2-D
problems. Follow with a Boolean argument: true to display the plot, or
false (the default) to suppress it.
Group - Column vector with the same number of rows as Sample, where each
entry specifies the group to which the corresponding row of Sample has been
assigned.
The svmclassify function uses results from
svmtrain to classify vectors x according to the following equation:

    c = Σᵢ αᵢ k(sᵢ, x) + b

where sᵢ are the support vectors, αᵢ are the weights, b is the bias, and k is a kernel function. In the case of a linear kernel, k is the dot product. If c ≥ 0, then x is classified as a member of the first group; otherwise it is classified as a member of the second group.
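The classification rule above can be sketched directly for a linear kernel; the variable names below (supVecs, alpha, bias) are hypothetical stand-ins for the support vectors, weights, and bias stored in a trained structure:

```matlab
% Sketch of c = sum_i alpha_i * k(s_i, x) + b with a linear kernel
% (k is the dot product). supVecs, alpha, and bias are hypothetical
% stand-ins for the corresponding quantities of a trained SVM.
supVecs = [1 2; -1 0; 0 -2];   % one support vector per row
alpha   = [0.5; -0.3; -0.2];   % signed weights
bias    = 0.1;
x       = [2 1];               % observation to classify

c = sum(alpha .* (supVecs * x(:))) + bias;
if c >= 0
    group = 1;   % member of the first group
else
    group = 2;   % member of the second group
end
```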
Errors starting in R2018a
The svmtrain and svmclassify functions have been removed.
Instead, use the
fitcsvm function to train a binary SVM
classifier, and use the predict object function of the resulting
ClassificationSVM object to predict labels. Several
differences between these functions require updates to your code.
The fitcsvm function was introduced in R2014a as a new way to train
an SVM classifier for one-class or two-class learning. fitcsvm returns
a trained SVM classifier as a ClassificationSVM object.
ClassificationSVM is an object for accessing and performing operations on
the training data and storing configurations of trained models. Together they
provide several advantages over the removed functions, as described here.
The new functionality:
Supports computation of soft classification scores
Supports fitting posterior probabilities
Has improved training speed, especially on big data with well-separated classes, by providing shrinkage
Allows a warm restart by accepting an initial α value
Allows training to resume after the maximum number of iterations is exceeded
Supports robust learning in the presence of outliers
ClassificationSVM is built on the same framework as
ClassificationKNN. Therefore, the syntax, options, and object
functions resemble those in the existing classification objects.
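As a sketch of this newer workflow, the following uses the predict and fitPosterior object functions of ClassificationSVM; the fisheriris data and feature selection are illustrative assumptions:

```matlab
% Sketch of the fitcsvm/ClassificationSVM workflow; the data set and
% features chosen here are illustrative assumptions.
load fisheriris
inds = ~strcmp(species,'setosa');     % keep two classes for a binary SVM
X = meas(inds,3:4);
Y = species(inds);

Mdl = fitcsvm(X,Y);                   % trained ClassificationSVM object
[label,score] = predict(Mdl,X);       % labels plus soft classification scores
PMdl = fitPosterior(Mdl);             % fit posterior probabilities
[~,postProbs] = predict(PMdl,X);      % posterior probability estimates
```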
This table shows a typical usage of
svmclassify and how to
update your code to use the predict object function instead.

|Removed Functionality|Recommended Replacement|
|---|---|
|Group = svmclassify(SVMStruct,Sample)|Group = predict(Mdl,Sample), where Mdl is a ClassificationSVM object returned by fitcsvm|
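A minimal before/after sketch of the replacement in the table; X, Y, and Sample are placeholder names for your training data, class labels, and new observations:

```matlab
% Hypothetical migration from the removed functions to fitcsvm/predict.
% X = training data, Y = class labels, Sample = observations to classify.

% Before (removed in R2018a):
%   SVMStruct = svmtrain(X,Y);
%   Group = svmclassify(SVMStruct,Sample);

% After:
Mdl = fitcsvm(X,Y);            % returns a ClassificationSVM object
Group = predict(Mdl,Sample);   % predicted class labels for Sample
```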
For details, see Train SVM Classifier.
Kecman, V. Learning and Soft Computing. Cambridge, MA: MIT Press, 2001.
Suykens, J.A.K., T. Van Gestel, J. De Brabanter, B. De Moor, and J. Vandewalle. Least Squares Support Vector Machines. Singapore: World Scientific, 2002.
Scholkopf, B., and A.J. Smola. Learning with Kernels. Cambridge, MA: MIT Press, 2002.
Cristianini, N., and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge: Cambridge University Press, 2000.