Is computational time proportional to the matrix size?

I'm trying to measure the calculation time involved in computing the Euclidean distance between two vectors.
Say:
D = A - B;
U = norm(D);
where A and B are 1-by-N. As N changes from 2 to 19, I expect that the computational time also increases. However, the time is not exactly proportional to N.
Of course it's very fast and all the time values are in a similar range, but sometimes the time for N = 2 is higher than for N = 7, 8, 9, and so on.
Is it because I am running other programs on my PC, or is it that MATLAB is fast but its computational time jitters?
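For reference, a minimal sketch of the kind of single-shot measurement described above (the actual timing code is not shown in the question, so the loop structure and variable names here are assumptions):
t = zeros(1, 18);
for N = 2:19
    A = rand(1, N);
    B = rand(1, N);
    tic;                 % time one distance computation for this N
    D = A - B;
    U = norm(D);
    t(N - 1) = toc;
end
disp(t)                  % single-shot timings at this scale jitter heavily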

Accepted Answer

Jan on 19 Jun 2013
Edited: Jan on 19 Jun 2013
Your code calls the minus operator, creates a variable (which adds roughly 100 bytes of overhead), copies the result, calls the function norm, creates another variable, and copies the result again.
This means the actual calculation uses only a very small part of the total computational time. Allocating memory for the created variables takes much more time for such a tiny input, and this depends on other processes and on the operating system's memory management. Therefore you cannot draw any conclusions from calculating the norm for 2 or 19 elements. Try repeating the operation for 20,000 and 190,000 elements a sufficient number of times, e.g. until the total time is about 5 seconds. The mean time then allows you to estimate the relation between processing time and data size.
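For example, a minimal sketch of this repeated-timing approach (the vector size and repetition count below are illustrative, not taken from the answer):
N = 200000;                % large vector, so the arithmetic dominates the overhead
A = rand(1, N);
B = rand(1, N);
nRuns = 1000;              % repeat until the total time reaches a few seconds
tic;
for k = 1:nRuns
    U = norm(A - B);
end
meanTime = toc / nRuns;    % mean time per call for this N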
Note that this relation is fragile: tiny data sets fit into the processor's 1st, 2nd, or 3rd level cache, which influences the processing time substantially, while large data sets may need to be swapped to the hard disk as virtual memory, which can be 100 times slower than keeping the data in RAM.
