Use of penalty multiplier C (boxconstraint) in svmtrain

I looked up the code of svmtrain. If the algorithm is QP, it uses
[alpha, ~, exitflag, output] = quadprog(H,-ones(nPoints,1),[],[],...
groupIndex',0,zeros(nPoints,1), Inf *ones(nPoints,1),...
X0, qp_opts)
So this means that there is no upper bound on alphas.
It takes into account the boxconstraint in this fashion:
kx = feval(kfun,training,training,kfunargs{:});
% ensure function is symmetric
kx = (kx+kx')/2 + diag(1./boxconstraint);
H =((groupIndex * groupIndex').*kx);
I am unable to understand how this enforces the box constraints:
0 <= alpha_i <= C
Thanks

Accepted Answer

Ilya on 3 Dec 2012
The magnitude of the margin slack vector can be penalized with either a 1-norm or a 2-norm penalty. The 1-norm version seems more popular, and it gives rise to the dual formulation in which the alphas are capped by C. The QP solution in the svmtrain function, however, uses the 2-norm penalty. In that case the dual problem has no upper bound on alpha, and a 1/C term is added to the diagonal of the Hessian instead. I would agree that referring to C as "box constraint" in this case is somewhat confusing.
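To see the mechanism concretely, here is a minimal numerical sketch of the 2-norm dual. It uses Python/NumPy rather than MATLAB, made-up toy data, and projected gradient ascent standing in for quadprog; the bias term is dropped so the dual has no equality constraint. The only constraint enforced is alpha >= 0 — the softness comes entirely from the 1/C term on the kernel diagonal, mirroring the diag(1./boxconstraint) line quoted in the question:

```python
import numpy as np

# Hypothetical toy problem: 1-D points, linear kernel, bias term dropped.
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])
C = 1.0

K = np.outer(x, x)                                  # linear kernel
K = (K + K.T) / 2 + np.diag(np.full(4, 1.0 / C))    # svmtrain-style: add 1/C to the diagonal
Q = np.outer(y, y) * K                              # dual Hessian

# Projected gradient ascent on the dual: max 1'a - 0.5 a'Qa, a >= 0.
# The only projection is max(a, 0): there is NO upper bound on alpha.
alpha = np.zeros(4)
eta = 0.05
for _ in range(5000):
    alpha += eta * (1.0 - Q @ alpha)    # gradient step
    alpha = np.maximum(alpha, 0.0)      # enforce alpha_i >= 0 only

print(np.round(alpha, 3))   # alpha converges to [0, 1/3, 1/3, 0]
```

With a 1-norm penalty the same solver would instead clip alpha to [0, C] and leave the kernel diagonal alone; the 1/C diagonal term here plays the role that the upper bound plays there.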
  1 Comment
Rupesh Gupta on 3 Dec 2012
Thank you for your answer. Just one clarifying question: is there a specific technical reason why this is done? Put another way, will I see problems if I modify the code to keep the Hessian matrix as it is and pass the box constraint as the upper bound to quadprog?
Thanks again


More Answers (1)

Ilya on 3 Dec 2012
Offhand, I cannot think of a reason why you couldn't use quadprog to solve the 1-norm problem. I don't know why the 2-norm solution was chosen for svmtrain. The svmtrain implementation is relatively old. The fast 'interior-point-convex' algorithm for quadprog was introduced in R2011a, years after svmtrain was coded. Whatever reasons motivated this choice for svmtrain may not hold anymore.
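For comparison, the 1-norm variant asked about in the comment can be sketched the same way: keep the Hessian as plain (groupIndex*groupIndex').*K, with no 1/C on the diagonal, and enforce the box 0 <= alpha <= C directly in the projection step. This is again toy data and a hypothetical projected-gradient stand-in for quadprog, not svmtrain's actual code:

```python
import numpy as np

# Same hypothetical toy problem, now as a 1-norm (standard C-SVM) dual sketch.
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])
C = 0.3                                  # small C so the cap actually binds

Q = np.outer(y, y) * np.outer(x, x)      # plain Hessian: no 1/C on the diagonal

# Projected gradient ascent with the box constraint 0 <= alpha <= C.
alpha = np.zeros(4)
eta = 0.05
for _ in range(5000):
    alpha += eta * (1.0 - Q @ alpha)
    alpha = np.clip(alpha, 0.0, C)       # upper bound C enforced here instead

print(np.round(alpha, 3))   # converges to [0, 0.3, 0.3, 0]: support alphas sit at the cap C
```

Both sketches produce a separating function for this data; the difference is only where the C penalty lives — on the Hessian diagonal (2-norm) versus as the upper bound in the box (1-norm).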
