
4.6 | 8 ratings · 207 downloads (last 30 days) · File size: 24.1 KB · File ID: #28302 · Version: 1.5

SVM Demo



26 Jul 2010 (Updated )

An interactive demo of how an SVM works, with comparison to a perceptron


File Information

SVMs are a bit tricky to build an intuition for. This demo shows a linear SVM and illustrates its behaviour on some 2D data. It should be great for getting to grips with maximising geometric margins, support vectors, and the optimisation involved in computing an optimal separating hyperplane.

Data can be generated randomly (uniformly or from separate Gaussians) over the 2D space, and an SVM or perceptron can be trained to find a separating line. Data points can be dragged around with the mouse, and the model (perceptron or SVM) will retrain in real time as a point is dragged; observe that dragging non-support-vector points will not change the SVM decision boundary.

The weights and bias can also be adjusted by dragging: drag the weight vector arrow to change the weights, or the decision boundary to change the bias. It should become clear that no configuration of the weights gives a larger minimum margin than the one computed by the SVM. The weights and bias can also be randomised to illustrate the effect of random initial weights on the solution the perceptron algorithm converges to.
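The perceptron half of the demo is simple enough to sketch. Below is an illustrative implementation (in Python/NumPy rather than the demo's MATLAB; the function name and toy data are invented for this example), showing the update rule and why different initial weights can converge to different separating lines:

```python
import numpy as np

def train_perceptron(X, y, w0=None, b0=0.0, max_epochs=100):
    """Classic perceptron: nudge the weights toward each misclassified
    point until the data is separated (or we give up). Different initial
    weights (w0, b0) can converge to different separating lines, which is
    what the demo's 'randomise weights' option illustrates."""
    w = np.zeros(X.shape[1]) if w0 is None else np.asarray(w0, float).copy()
    b = b0
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:      # misclassified (or on the line)
                w += yi * xi                # move the boundary toward xi
                b += yi
                errors += 1
        if errors == 0:                     # converged: data is separated
            break
    return w, b

# Tiny linearly separable 2D problem
X = np.array([[1.0, 2.0], [2.0, 1.5], [-1.0, -1.0], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
```

Note that the perceptron stops at the first separating line it finds; unlike the SVM, it makes no attempt to maximise the margin.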

The program requires an implementation of a QP solver or SVM algorithm, so you will need one of:
- the Bioinformatics Toolbox, which includes an svmtrain function
- the Optimization Toolbox, which includes a quadprog function
- the third-party library "libsvm", which also provides an svmtrain function
If one or more of these is on the MATLAB path, the program should just work. To add a custom SVM solver, refer to the code commentary in LinearClassifier.
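The auto-detection described above is just capability probing: check for each known solver in order of preference and use the first one found. In MATLAB this is done with calls like exist('quadprog', 'file'); the same pattern is sketched here in Python purely for illustration (the function name is made up):

```python
from importlib import util

def first_available(candidates):
    """Return the first importable module name from candidates, or None.
    Mirrors the demo's solver auto-detection, which (in MATLAB) probes
    the path for each solver function and uses the first one it finds."""
    for name in candidates:
        if util.find_spec(name) is not None:
            return name
    return None

# Probe a list of solver names in preference order; 'math' stands in
# for a solver that is actually installed.
solver = first_available(["no_such_solver_xyz", "math"])
```

The benefit of this design is that the same program works unchanged across installations with different toolboxes available.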
The code is extensively commented and documented. There are a number of outrageously obfuscated uses of arrayfun that may be of interest to people who enjoy incomprehensible code. The user interface code doesn't follow the preferred design pattern for MATLAB GUI code because I didn't know of one when I wrote this; please don't use the GUI code as a template for a pleasant and sensible MATLAB GUIing experience.

MATLAB release: MATLAB 7.5 (R2007b)
Other requirements: any one of:
- Bioinformatics Toolbox
- Optimization Toolbox
- libsvm (available from )
Comments and Ratings (12)
08 Jun 2016 Sumit Patil

13 Mar 2016 Ankur Rai

Dear Richard,
I have to classify a medical image into ROI (region of interest) and RONI (region of non-interest) for image watermarking, but I am not able to create predictor data or features for the SVM.
Please help me, sir; I need your help.

18 Jan 2016 Nur Sakinah

How do I use this demo, sir? I need help, and I am using MATLAB R2014b.

This is my email
It will be helpful if you can help me, sir. Thank you.

13 Jul 2015 Chunghee Kim

Hello, I was wondering why it doesn't work well with MATLAB 2015. I have an academic license; does that make a difference?

16 Jul 2014 SASTRA University

23 May 2013 godwin tgn

19 Feb 2012 Tulips

19 Nov 2011 Richard Stapenhurst

Hi Honza,
if you are referring to the case where data is not linearly separable, then it is true that the computation of the boundary and support vectors may seem strange.

Typically, we would construct an SVM with some tolerance for misclassified data; for example, we could compute a 'soft margin' (see Cortes and Vapnik, 1995), which optimises a trade-off between the maximum-margin decision boundary and a penalty for misclassification. In my program, I have set the misclassification penalty to be very large, since a finite penalty can affect the position of the decision boundary even when the data is linearly separable. I do this so that you can see the optimal separating hyperplane.
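For reference, the soft-margin objective being described here, with slack variables ξ and misclassification penalty C, is:

```latex
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_i
\qquad \text{subject to} \qquad y_i\,(w^{\top}x_i + b) \ \ge\ 1 - \xi_i,
\quad \xi_i \ge 0 .
```

Taking C very large, as the demo does, forces the slacks toward zero wherever possible, so on linearly separable data it recovers the hard-margin separating hyperplane.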

Obviously this is not representative of how you would solve a real non-linearly-separable problem: if you suspect the data is noisy, it would be appropriate to change the 'C' parameter of a soft-margin SVM. Alternatively, if you suspect the data is complex and non-linear, it may be better to use a non-linear kernel (e.g. RBF).

Regardless of whether the data is linearly separable, you should find that some correctly classified data points are support vectors. The support vectors are the data points that actually define where the decision boundary is. They comprise any misclassified data points (which correspond to penalty terms in the soft-margin problem), and all the correctly classified data points which lie exactly on the margin, i.e. at the minimum distance from the decision boundary.
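The demo solves this with a QP solver, but the same soft-margin trade-off can be illustrated with plain subgradient descent on the regularised hinge loss. A minimal sketch (in Python/NumPy, not the demo's actual MATLAB code; lam plays the role of 1/C, and the toy data is invented):

```python
import numpy as np

def svm_subgradient(X, y, lam=0.01, epochs=2000, lr0=1.0):
    """Soft-margin linear SVM trained by batch subgradient descent on
    lam/2 * ||w||^2 + mean(max(0, 1 - y*(X@w + b))).
    lam acts like 1/C: small lam means a large misclassification penalty."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for t in range(1, epochs + 1):
        lr = lr0 / t                       # decaying step size
        margins = y * (X @ w + b)
        viol = margins < 1                 # points at or inside the margin
        # subgradient: regulariser plus hinge terms from violating points
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        gb = -y[viol].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Tiny separable 2D problem
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = svm_subgradient(X, y)
```

Only points whose margin y_i(w·x_i + b) is at most 1 contribute hinge subgradients, which is exactly why dragging any other point in the demo leaves the SVM boundary untouched.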

I hope this answers your question.

18 Nov 2011 Honza

Thanks for the demo. However, I'm not sure that the margin is estimated correctly in the non-linear case. Is it all right that correctly classified points (on the right side of the margin) act as support vectors? Thanks for the clarification.

13 Oct 2011 Alex Frid

09 Oct 2011 Mr Smart

19 Apr 2011 Samad

Updates

29 Jul 2010 1.1

Added a link to the libSVM download page in the requirements section.

17 Aug 2010 1.2

Improved the auto-detection of the SVM algorithm for extra convenience, and made the initial position of the window depend on screen size.

23 Sep 2010 1.3

Updated for MATLAB R2010a. Soft-margin constraints are now large-but-not-too-large, so nobody gets upset when data is non-separable. Line smoothing/transparency removed. Bioinformatics Toolbox SVM training changed from least squares (LS) to SMO.

30 Sep 2010 1.4

Fixed a bug where the 'train' option was sometimes disabled inappropriately. Reduced the delay between perceptron epochs.

31 Mar 2011 1.5

Added a graphics mode menu and an SVM algorithm menu.
