SVMs can be a bit tricky to build intuition for. This demo shows a linear SVM and illustrates its behaviour on 2D data. It should be useful for getting to grips with maximising the geometric margin, support vectors, and the optimisation involved in computing an optimal separating hyperplane.
Data can be generated randomly over the 2D space (uniformly, or from separate Gaussians), and an SVM or perceptron can be trained to find a separating line. Data points can be dragged around with the mouse, and the chosen model (perceptron or SVM) retrains in real time as the point is dragged; note that dragging a point which is not a support vector will not change the SVM decision boundary. The weights and bias can also be adjusted by dragging: drag the weight-vector arrow to change the weights, or the decision boundary itself to change the bias. It should become clear that no configuration of the weights gives a larger minimum margin than the one computed by the SVM. Finally, the weights and bias can be randomised to illustrate the effect of random initial weights on the solution the perceptron algorithm converges to.
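For concreteness, the minimum geometric margin that the SVM maximises can be computed in a couple of lines of MATLAB. This is just an illustrative sketch — the data and candidate weights below are made up for the example and are not names from the demo's code:

```matlab
% Minimum geometric margin of a linear classifier (w, b) on labelled 2D data.
% X, y, w and b are illustrative placeholders, not names used in the demo.
X = [1 2; 2 3; -1 -1; -2 -2];    % four toy 2D points, one per row
y = [1; 1; -1; -1];              % labels in {-1, +1}
w = [1; 1];                      % a candidate weight vector
b = 0;                           % a candidate bias

% Signed distance of each point to the line w'*x + b = 0, scaled by its label;
% the SVM chooses (w, b) to maximise the smallest of these distances.
margin = min(y .* (X * w + b)) / norm(w);
```

Dragging the weight arrow or decision boundary in the demo is effectively changing `w` and `b` in this computation, and no choice of them will beat the SVM's margin.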
The program requires an implementation of a QP solver or an SVM training algorithm, so you will need one of:
- The Bioinformatics Toolbox, which includes an svmtrain function
- The Optimization Toolbox, which includes a quadprog function
- The third-party library "libsvm", which includes an svmtrain function
If one or more of these is on the MATLAB path, the program should just work. To plug in a custom SVM solver, refer to the code commentary in LinearClassifier.
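As a rough illustration of what such a solver does underneath, here is a minimal sketch of training a hard-margin linear SVM by solving the dual QP with quadprog. The variable names and the tolerance are assumptions made for the example; this is not the demo's actual implementation:

```matlab
% Hard-margin linear SVM via the dual QP, solved with quadprog.
% Assumed inputs: X is n-by-2 (one point per row), y is n-by-1 in {-1, +1}.
n   = size(X, 1);
H   = (y * y') .* (X * X');      % Gram matrix scaled by label products
f   = -ones(n, 1);               % maximise sum(alpha) => minimise -sum(alpha)
Aeq = y'; beq = 0;               % constraint: sum(alpha_i * y_i) = 0
lb  = zeros(n, 1);               % alpha_i >= 0 (hard margin, no upper bound)
alpha = quadprog(H, f, [], [], Aeq, beq, lb, []);

w  = X' * (alpha .* y);          % recover the weight vector from the duals
sv = alpha > 1e-6;               % support vectors have nonzero alpha
b  = mean(y(sv) - X(sv, :) * w); % bias averaged over the support vectors
```

This also shows why dragging a non-support-vector point leaves the boundary alone: its alpha is zero, so it contributes nothing to `w` or `b`.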
The code is extensively commented and documented. There are a number of outrageously obfuscated uses of arrayfun that may interest people who enjoy incomprehensible code. The user-interface code doesn't follow the preferred design pattern for MATLAB GUI code because I didn't know of one when I wrote this; hence, please don't treat the GUI code as a template for a pleasant and sensible MATLAB GUI-building experience.