SVM Quadratic programming problem
Hello, I found these links on real SVM implementations very useful:
1. http://www.robots.ox.ac.uk/~az/lectures/ml/matlab2.pdf
2. http://cbio.ensmp.fr/~thocking/mines-course/2011-04-01-svm/svm-qp.pdf
However, when I wrote the script myself, classification performance was very poor whenever a kernel function was involved. I have tried my best to find the problem but cannot. Could you please help me? Thanks a lot!
Test data and all scripts are in the attachment.
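For reference, this restates the standard soft-margin formulation from the linked notes, which is what the script below builds with decision variables x = [b; w; epsilo]:

```latex
% Primal soft-margin SVM:
\min_{b,\,w,\,\epsilon}\ \tfrac{1}{2}\|w\|^{2} + C\sum_{i=1}^{m}\epsilon_i
\quad\text{s.t.}\quad
y_i\,\bigl(w^{\top}x_i + b\bigr) \ge 1 - \epsilon_i,
\qquad \epsilon_i \ge 0 .
```

In the kernelised branch, w is replaced by expansion coefficients (one per training point), the quadratic term becomes one half of the coefficients' quadratic form with the Gram matrix K, and the margin constraint reads y_i (sum_j K_ij a_j + b) >= 1 - eps_i.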
% SVM function --------------------------------------------------------
% C is the penalty parameter.
% epsilo holds the slack variables: the smaller the slack for a training
% point, the more confidently that point is classified.
function [b,w,epsilo] = C_svm(training_data, training_result, C, kernel_choice)
if nargin < 4
    kernel_choice = 'no';
end
if nargin < 3
    C = 1;
end
switch kernel_choice
    case 'linear'
        % Linear-kernel Gram matrix K(i,j) = x_i * x_j'.
        % (For an RBF kernel use
        %  exp(-(x_i - x_j)*(x_i - x_j)'/2) instead.)
        kernel_output = zeros(size(training_data,1));
        for i = 1:size(training_data,1)
            for j = 1:size(training_data,1)
                kernel_output(i,j) = training_data(i,:)*training_data(j,:)';
            end
        end
end
switch kernel_choice
    case 'linear'
        [m,n] = size(kernel_output);    % here n == m
    case 'no'
        [m,n] = size(training_data);
end

if strcmp(kernel_choice,'no')
    % Decision variables x = [b; w; epsilo], length 1+n+m.
    H = zeros(1+n+m);
    H(2:n+1, 2:n+1) = eye(n);           % quadratic term acts on w only
    f = [zeros(1,1+n), C*ones(1,m)]';
    V = zeros(m,n);
    for i = 1:m
        V(i,:) = training_data(i,:)*training_result(i);
    end
    A_11 = zeros(m,1+n);
    A_12 = eye(m);                      % epsilo >= 0
    A_21 = [training_result, V];
    A_22 = eye(m);                      % y_i*(w'*x_i + b) + epsilo_i >= 1
    A = -[A_11, A_12; A_21, A_22];
    % Stack the right-hand sides vertically (2m-by-1), not side by side:
    b = -[zeros(m,1); ones(m,1)];
    % quadprog solves min 1/2*x'*H*x + f'*x subject to A*x <= b;
    % the sign flip on A and b above encodes our ">=" constraints.
    x = quadprog(H,f,A,b);
    b = x(1);
    w = x(2:n+1);
    epsilo = x(n+2:end);
else
    % Kernelised version: the weight vector is replaced by expansion
    % coefficients (here n == m), and the quadratic term is the Gram matrix.
    H = zeros(1+n+m);
    H(2:n+1, 2:n+1) = kernel_output;
    f = [zeros(1,1+n), C*ones(1,m)]';
    V = zeros(m,n);
    for i = 1:m
        V(i,:) = kernel_output(i,:)*training_result(i);   % used for A_21
    end
    A_11 = zeros(m,1+n);
    A_12 = eye(m);
    A_21 = [training_result, V];
    A_22 = eye(m);
    A = -[A_11, A_12; A_21, A_22];
    % Stack vertically (2m-by-1): [zeros(m,1),ones(m,1)]' is 2-by-m and
    % silently corrupts the constraint right-hand side.
    b = -[zeros(m,1); ones(m,1)];
    % quadprog solves min 1/2*x'*H*x + f'*x subject to A*x <= b
    x = quadprog(H,f,A,b);
    b = x(1);
    w = x(2:n+1);
    epsilo = x(n+2:end);
end
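A likely cause of the poor kernel results is the constraint right-hand side: in the posted script, -[zeros(m,1),ones(m,1)]' is a 2-by-m matrix rather than the 2m-by-1 vector quadprog expects, and depending on the MATLAB version this either errors or gets flattened column-major, interleaving the zero and one bounds. A small pure-Python sketch of the difference (function names are mine, just for illustration):

```python
def stacked_rhs(m):
    """The intended 2m-long right-hand side for the negated constraints
    -A*x <= -b: first the m zeros for slack >= 0, then the m margin bounds."""
    return [0.0] * m + [-1.0] * m

def interleaved_rhs(m):
    """What column-major flattening of the 2-by-m matrix
    -[zeros(1,m); ones(1,m)] would produce: 0, -1, 0, -1, ..."""
    out = []
    for _ in range(m):
        out.extend([0.0, -1.0])
    return out

print(stacked_rhs(3))      # [0.0, 0.0, 0.0, -1.0, -1.0, -1.0]
print(interleaved_rhs(3))  # [0.0, -1.0, 0.0, -1.0, 0.0, -1.0]
```

With the interleaved version, half the margin constraints effectively become eps >= 0 bounds and vice versa, which would degrade the fitted classifier without raising an obvious error.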
%test------------------------------------------------------ clc; clear all; C = 0.5; load('data.mat'); X = A; y = X(:,4);% 2 class X = X(:,1:3);
shift = -mean(X);
stdVals = std(X);
scaleFactor = 1./stdVals;
% leave zero-variance columns unscaled:
scaleFactor(~isfinite(scaleFactor)) = 1;
% shift and scale the columns of the data matrix:
for c = 1:size(X,2)
    X(:,c) = scaleFactor(c) * (X(:,c) + shift(c));   % was X(c): wrong index
end
y(1:50) = -1;
[b,w,epsilo] = C_svm(X, y, C, 'linear');
kernel_output = zeros(size(X,1));
for i = 1:size(X,1)
    for j = 1:size(X,1)
        kernel_output(i,j) = X(i,:)*X(j,:)';
    end
end
y_test_linear = sign(kernel_output*w+b);
error_linear = y - y_test_linear;
errorrate_linear = nnz(error_linear)/size(X,1);
[b,w,epsilo] = C_svm(X, y, C, 'no');
y_test_no_kernel = sign(X*w + b);
error_no_kernel = y - y_test_no_kernel;
errorrate_no_kernel = nnz(error_no_kernel)/size(X,1);
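To sanity-check the pipeline independently of quadprog, here is a pure-Python sketch of the same post-processing steps: column-wise standardisation, the linear Gram matrix, and the error rate. All names are mine and nothing here depends on MATLAB; it is only a cross-check of the arithmetic, under the assumption that std uses the N-1 normalisation (MATLAB's default):

```python
import math

def standardize(X):
    """Column-wise z-scoring; zero-variance columns are left unscaled,
    mirroring the scaleFactor(~isfinite(...)) = 1 guard in the script."""
    m, n = len(X), len(X[0])
    out = [row[:] for row in X]
    for c in range(n):
        col = [row[c] for row in X]
        mu = sum(col) / m
        var = sum((v - mu) ** 2 for v in col) / (m - 1)  # MATLAB std default
        sd = math.sqrt(var)
        scale = 1.0 / sd if sd > 0 else 1.0
        for r in range(m):
            out[r][c] = scale * (X[r][c] - mu)
    return out

def linear_gram(X):
    """K(i,j) = <x_i, x_j>, the same quantity as the double loop above."""
    return [[sum(a * b for a, b in zip(xi, xj)) for xj in X] for xi in X]

def error_rate(y_true, y_pred):
    """Fraction of mismatched sign predictions."""
    return sum(1 for t, p in zip(y_true, y_pred) if t != p) / len(y_true)
```

For example, linear_gram([[1.0, 2.0], [3.0, 4.0]]) gives [[5.0, 11.0], [11.0, 25.0]], matching kernel_output for the same two rows. Note the MATLAB Gram loop can also be written in one vectorised line, kernel_output = X*X', which is much faster for large data sets.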
Answers (1)
najla slama
on 26 Apr 2015
Hi, can anyone help me with an RNN example in MATLAB code, please (a one-layer recurrent neural network if possible, and not a MATLAB toolbox), so I can see how it works and how I can use it for solving quadratic problems? Thanks.