Reducing computation time for large data set
I'm looping through a large data set (1,843,200 data points, ~30 MB total). I use a for loop to step through each data point and create a temporary vector that is a sliding-window subset of the original data vector. I then use the mean and RMS functions to compute a value, which I store in a third array. The code currently takes about 60 seconds to run, which is not awful by itself, but it becomes a problem when I have to run it on several thousand data sets. Below is the piece of code in question. Is there any way to speed up the process, e.g. looping differently, avoiding the temporary array, etc.?
function [rmsVector,timeVector] = rms_calc(colVector, timeSpan, sampleFreq)
%RMS_CALC Sliding-window, mean-removed RMS of a column vector.
% NOTE: timeSpan must be in seconds and sampleFreq must be in samples/sec.
vecLen    = length(colVector);      % vector length
timeLen   = sampleFreq*timeSpan;    % converts the time span to a window length in samples
actualLen = vecLen - timeLen;       % length of the output data
divRem    = rem(vecLen, timeLen);   % remainder of the division
if divRem ~= 0
    fprintf(['The length of the data is not evenly divisible by the chosen ' ...
             'time span. This may cause inaccuracies in the RMS calculations.\n'])
end
rmsArray  = zeros(actualLen,1);     % preallocate the outputs
timeArray = zeros(actualLen,1);
for i = 1:actualLen
    colStart  = i;
    colEnd    = colStart + timeLen - 1;
    tempArray = colVector(colStart:colEnd);   % current window
    meanVal   = mean(tempArray);
    rmsArray(i)  = rms(tempArray - meanVal);  % RMS of the mean-removed window
    timeArray(i) = i/sampleFreq + timeSpan;
    if rem(i, 20000) == 0
        fprintf('%3.0f%%\n', i/vecLen*100);   % progress indicator
    end
end
rmsVector = rmsArray;
timeVector = timeArray;
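One way to eliminate the loop entirely, assuming R2016a or later for movstd: the quantity rms(x - mean(x)) is exactly the population standard deviation of x (normalize by N rather than N-1), so the whole sliding-window computation collapses to a single movstd call. The sketch below is an assumed drop-in replacement (rms_calc_vec is a name chosen here, not from the original post) and has not been benchmarked against the original:

```matlab
function [rmsVector,timeVector] = rms_calc_vec(colVector, timeSpan, sampleFreq)
%RMS_CALC_VEC Vectorized sliding-window RMS (no explicit loop).
% Assumes R2016a+ for MOVSTD. Key identity: rms(x - mean(x)) == std(x, 1).
timeLen   = sampleFreq*timeSpan;            % window length in samples
actualLen = length(colVector) - timeLen;    % same output length as the loop version

% MOVSTD with window [0 timeLen-1] places each window at indices i..i+timeLen-1,
% matching colVector(colStart:colEnd) in the loop; the third argument 1 selects
% normalization by N, which is what rms(tempArray - meanVal) computes.
allStd    = movstd(colVector(:), [0 timeLen-1], 1);
rmsVector = allStd(1:actualLen);            % keep only full windows

timeVector = (1:actualLen).'/sampleFreq + timeSpan;  % same time stamps as the loop
end
```

Entries past actualLen come from truncated windows at the end of the vector, which is why only the first actualLen values are kept; this should return the same numbers as the loop up to floating-point round-off.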