## Support Vector Machine (SVM)

A support vector machine (SVM) is a supervised learning algorithm that can be used for binary classification or regression. Support vector machines are popular in applications such as natural language processing, speech and image recognition, and computer vision.

A support vector machine constructs an optimal hyperplane as a decision surface such that the margin of separation between the two classes in the data is maximized. Support vectors refer to a small subset of the training observations that are used as support for the optimal location of the decision surface.
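To make the hyperplane and margin concrete, here is a minimal NumPy sketch with an assumed (hypothetical) normal vector $w$ and bias $b$: points are classified by the sign of $w^{T}x + b$, and the margin of separation has width $2/\lVert w \rVert$.

```python
import numpy as np

# Hypothetical linear decision surface: f(x) = w.x + b = 0 (values assumed
# for illustration). Points are classified by the sign of f(x).
w = np.array([2.0, -1.0])  # normal vector of the hyperplane
b = -1.0

def classify(x):
    """Return +1 or -1 depending on which side of the hyperplane x falls."""
    return 1 if np.dot(w, x) + b >= 0 else -1

# The margin of separation between the two classes has width 2 / ||w||.
margin_width = 2.0 / np.linalg.norm(w)

print(classify(np.array([3.0, 1.0])))   # w.x + b = 4  -> +1
print(classify(np.array([0.0, 2.0])))   # w.x + b = -3 -> -1
print(round(margin_width, 4))
```

Maximizing the margin is equivalent to minimizing $\lVert w \rVert$ subject to every training point being classified correctly, which is the optimization the next section describes.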

Support vector machines fall under a class of machine learning algorithms called kernel methods and are also referred to as kernel machines.

Training for a support vector machine has two phases:

- Transform predictors (input data) to a high-dimensional feature space. It is sufficient to specify the kernel for this step; the data are never explicitly transformed to the feature space. This process is commonly known as the kernel trick.
- Solve a quadratic optimization problem to fit an optimal hyperplane that classifies the transformed features into two classes. The number of terms in the resulting decision function is determined by the number of support vectors.

Only the support vectors chosen from the training data are required to construct the decision surface. Once trained, the rest of the training data are irrelevant.
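The two training phases can be sketched with scikit-learn's `SVC`, used here as an assumed stand-in for the MATLAB `fitcsvm` workflow the text describes:

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated 2-D clusters as toy training data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)),   # class -1 cluster
               rng.normal(+2, 1, (50, 2))])  # class +1 cluster
y = np.array([-1] * 50 + [1] * 50)

# Phase 1: the RBF kernel implicitly maps inputs to a high-dimensional
# feature space (the kernel trick -- no explicit transformation happens).
# Phase 2: fit() solves the quadratic program for the optimal hyperplane.
clf = SVC(kernel="rbf", gamma=0.5).fit(X, y)

# Only the support vectors are retained for prediction; the rest of the
# training data could be discarded after fitting.
print(clf.support_vectors_.shape[0], "support vectors out of", len(X))
print(clf.predict([[2.0, 2.0], [-2.0, -2.0]]))
```

Note that `support_vectors_` holds only the small subset of training observations that define the decision surface, matching the point above.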

Popular kernels used with SVMs include:

| Type of SVM | Mercer Kernel | Description |
|---|---|---|
| Gaussian or Radial Basis Function (RBF) | $K(x_1, x_2) = \exp\left(-\dfrac{\lVert x_1 - x_2 \rVert^2}{2\sigma^2}\right)$ | One-class learning; $\sigma$ is the width of the kernel |
| Linear | $K(x_1, x_2) = x_1^{T} x_2$ | Two-class learning |
| Polynomial | $K(x_1, x_2) = (x_1^{T} x_2 + 1)^{p}$ | $p$ is the order of the polynomial |
| Sigmoid | $K(x_1, x_2) = \tanh(\beta_0\, x_1^{T} x_2 + \beta_1)$ | A Mercer kernel only for certain values of $\beta_0$ and $\beta_1$ |
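Each kernel in the table is a simple function of a pair of input vectors. The sketch below evaluates all four in NumPy for two sample points, with illustrative (assumed) values for the parameters $\sigma$, $p$, $\beta_0$, and $\beta_1$:

```python
import numpy as np

x1 = np.array([1.0, 2.0])
x2 = np.array([2.0, 0.5])

# Gaussian / RBF kernel; sigma is the kernel width (illustrative value).
sigma = 1.0
k_rbf = np.exp(-np.linalg.norm(x1 - x2) ** 2 / (2 * sigma ** 2))

# Linear kernel: plain inner product.
k_lin = x1 @ x2

# Polynomial kernel of order p.
p = 3
k_poly = (x1 @ x2 + 1) ** p

# Sigmoid kernel (a Mercer kernel only for certain beta0, beta1 values).
beta0, beta1 = 0.5, -1.0
k_sig = np.tanh(beta0 * (x1 @ x2) + beta1)

print(k_rbf, k_lin, k_poly, k_sig)
```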

For more on how to fit support vector machine classifiers, see Statistics Toolbox™ for use with MATLAB®.

- Machine Learning with MATLAB: Getting Started with Classification 34:31 (Webinar)
- Machine Learning with MATLAB 41:25 (Webinar)
- An Introduction to Classification 9:00 (Video)
- Machine Learning with MATLAB 3:02 (Video)
- Train and Cross-Validate SVM Classifiers (Example)

- Support Vector Machines (SVM) (Documentation)
- fitcsvm: Train binary support vector machine classifier (Function)
- fitSVMPosterior: Fit posterior probabilities (Function)

*See also*: *Statistics Toolbox*, *Neural Network Toolbox*, *machine learning*, *unsupervised learning*, *AdaBoost*, *supervised learning*