File Exchange


Support Vector Machine

version 1.0.0.0 (204 KB) by Bhartendu
SVM (linearly separable data) using a linear kernel with gradient ascent

63 Downloads

Updated 28 May 2017


Reference: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods by Nello Cristianini and John Shawe-Taylor
This demo covers training and cross-validation of a support vector machine (SVM) model for two-class (binary) classification on a low-dimensional data set.

The training algorithm depends on the data only through dot products in H, i.e. through functions of the form Φ(x_i)·Φ(x_j). Now if there were a “kernel function” K such that
K(x_i,x_j) = Φ(x_i)·Φ(x_j),
we would only need to use K in the training algorithm and would never need to know Φ explicitly. One example is the radial basis function (RBF) or Gaussian kernel, where H is infinite-dimensional, so it would not be very easy to work with Φ explicitly.
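
As an illustration (not part of the original submission), both kernels can be written as one-line MATLAB function handles; the names linearKernel and rbfKernel below are placeholders:
------------------------------------------------
% Illustrative kernel functions (names are placeholders, not from the submission)
linearKernel = @(xi,xj) xi*xj';                                 % K(x_i,x_j) = x_i . x_j
rbfKernel    = @(xi,xj,sigma) exp(-norm(xi-xj)^2/(2*sigma^2));  % Gaussian / RBF kernel

k1 = linearKernel([1 2],[3 4]);     % 11
k2 = rbfKernel([1 2],[3 4],1.0);    % exp(-4), approx 0.0183
------------------------------------------------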

Training the model requires the choice of:
• the kernel function, which determines the shape of the decision surface
• the parameters of the kernel function (e.g., for a Gaussian kernel, the variance of the Gaussian; for a polynomial kernel, the degree of the polynomial)
• the regularization parameter λ (these choices appear in the training sketch below).
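
To make these choices concrete, here is a minimal sketch of projected gradient ascent on the SVM dual with a linear kernel. It assumes X is an N-by-d data matrix and Y an N-by-1 label vector with entries ±1; the variable names are illustrative and this is not the author's exact implementation (in particular, the equality constraint Σ α_i y_i = 0 is ignored for simplicity).
------------------------------------------------
% Minimal sketch: projected gradient ascent on the SVM dual, linear kernel.
% Assumes X (N-by-d) and Y (N-by-1, labels +/-1) are already loaded.
eta    = 1e-3;                           % learning rate
lambda = 1;                              % regularization parameter (box constraint)
K      = X*X';                           % linear-kernel Gram matrix
alpha  = zeros(size(X,1),1);
for iter = 1:1000
    grad  = 1 - Y.*(K*(alpha.*Y));       % gradient of the dual objective
    alpha = alpha + eta*grad;            % ascent step
    alpha = min(max(alpha,0), lambda);   % project back onto 0 <= alpha <= lambda
end
W    = (alpha.*Y)'*X;                    % w = sum_i alpha_i y_i x_i
sv   = find(alpha > 1e-6, 1);            % index of one support vector
bias = Y(sv) - X(sv,:)*W';               % b from y_i (w . x_i + b) = 1
------------------------------------------------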

Related Examples:
1. AdaBoost
https://in.mathworks.com/matlabcentral/fileexchange/63156-adaboost

2. SVM using various kernels
https://in.mathworks.com/matlabcentral/fileexchange/63033-svm-using-various-kernels

3. SVM for nonlinear classification
https://in.mathworks.com/matlabcentral/fileexchange/63024-svm-for-nonlinear-classification

4. SMO
https://in.mathworks.com/matlabcentral/fileexchange/63100-smo--sequential-minimal-optimization-

Comments and Ratings (18)

Bhartendu

@Matthys: Holdout is a method of CV (cross-validation) partitioning.
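
For illustration only (this is not from the submission), a holdout partition can be created with cvpartition from the Statistics and Machine Learning Toolbox, e.g. reserving 30% of the samples for testing:
------------------------------------------------
% Illustrative holdout split, assuming X (N-by-d) and Y (N-by-1) from the demo.
c      = cvpartition(size(X,1),'HoldOut',0.3);   % random 70/30 train/test split
Xtrain = X(training(c),:);  Ytrain = Y(training(c));
Xtest  = X(test(c),:);      Ytest  = Y(test(c));
------------------------------------------------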

Matthys

Can someone tell me what the holdout function does?

ammar noori

Dear Bhartendu,
Do you have documentation that describes your work? Your input is highly appreciated.

BR

Bhartendu

Thanks @Yeonjong, the two errors are probably due to a mismatch of MATLAB versions.

Yeonjong

I ran into two errors while running this code.
For me, the following changes work very well.

1. In grad-Ascend,
w1=(alp_old.*Y).*X; ==> w1=(alp_old.*Y)'*X;
w2=(alpha.*Y).*X; ==> w2=(alpha.*Y)'*X;

2. Plotting
------------------------------------------------
syms x
fn=vpa((-bias-W(1)*x)/W(2),4);
fplot(fn,'Linewidth',2);
fn1=vpa((1-bias-W(1)*x)/W(2),4);
fplot(fn1,'--');
fn2=vpa((-1-bias-W(1)*x)/W(2),4);
fplot(fn2,'--');
------------------------------------------------
I changed to the following and it works for me.
------------------------------------------------
xItv = linspace(-5,5,1000);
fn = @(x) vpa((-bias-W(1)*x)/W(2),4);
plot(xItv,fn(xItv),'Linewidth',2);
fn1 = @(x) vpa((1-bias-W(1)*x)/W(2),4);
plot(xItv,fn1(xItv),'--');
fn2 = @(x) vpa((-1-bias-W(1)*x)/W(2),4);
plot(xItv,fn2(xItv),'--');

John Martin

Bhartendu

What is the reason for your poor rating, nhat truong?

nhat truong

MATLAB Release Compatibility
Created with R2015a
Compatible with any release

Platform Compatibility
Windows, macOS, Linux