How can I get svmtrain to use multiple cores when run on a cloud?

My lab has set up a small cloud on which I was hoping to train SVMs. On my regular desktop, svmtrain() uses multiple cores. However, on a virtual machine in the cloud, svmtrain() uses only 1 of the 8 cores. inv() uses all 8 cores, so the virtual machine does not seem inherently incapable of using multiple cores.
Edit: Here is some example code:
nTraining = 6000;
L = 3780;
w0 = [0.5; 0.5; zeros(L-2,1)];   % true separating hyperplane
b0 = -0.5;
trainingData = rand(nTraining,L);
trainingClass = sign(trainingData*w0 + b0);   % labels generated from the hyperplane
opts = optimset('MaxIter',1e6);
svmStruct = svmtrain(trainingData,trainingClass,'kernel_function','linear','options',opts);
Using the profiler, I found that by far the most time-consuming step is MATLAB/R2013a/toolbox/stats/stats/private/linear_kernel.m, which simply computes
K = (u*v');
Doing something similar in the command window,
u = rand(5e3,3780);
v = rand(5e3,3780);
K = (u*v');
I found that this uses all 8 cores. Is there some flag in svmtrain.m that tells it to use only one core?
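For completeness, here is a quick way to check how many computational threads MATLAB will use and which BLAS it is linked against inside the VM (the matrix sizes below are just illustrative, matching the shape used by linear_kernel.m):
maxNumCompThreads          % maximum number of computational threads MATLAB will use
version('-blas')           % which BLAS library this MATLAB is linked against
u = rand(5e3,3780);
v = rand(5e3,3780);
tic; K = (u*v'); toc       % should engage all cores if multithreaded BLAS is working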

Accepted Answer

Tanmay on 20 May 2014
Edited: Tanmay on 20 May 2014
It turned out to be a problem with how we set up the virtual machine, rather than with MATLAB. We use Eucalyptus to manage the cloud, and libvirt generates the virtual machine. In libvirt.xsl there's a way to change the CPU model, which defaults to a model with only the most basic features. We changed the model to SandyBridge, and that seems to let svmtrain use many more cores.
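For anyone digging through a similar setup, the change ends up in the cpu element of the domain XML that libvirt.xsl produces. Roughly something like the following; the exact attributes depend on your libvirt and Eucalyptus versions, so treat this as a sketch rather than our exact template:
<cpu mode='custom' match='exact'>
  <model fallback='allow'>SandyBridge</model>
</cpu>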
As a side note, I also found that increasing the 'kernelcachelimit' option of svmtrain significantly improved our training times.
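kernelcachelimit is passed to svmtrain as a name-value pair; the value below is just illustrative (the documented default is 5000):
svmStruct = svmtrain(trainingData,trainingClass, ...
    'kernel_function','linear', ...
    'kernelcachelimit',20000, ...   % larger kernel matrix cache; default is 5000
    'options',opts);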

