Asked by Nicholas Dinsmore
on 11 Apr 2013

I have been trying to do a major refactoring of my code and thought I would use the package system (i.e. +folders) to organize things better.

After noticing a slowdown in my application, I did some benchmarking and found a large difference in run time when using the same function in a package versus not. To give you a sense of the impact, using a package increased the run time of one of my functions by 50%. To put that in perspective, that function is 100 lines of code, uses 48 variables, and does 72 variable assignments, 76 additions, 36 multiplications, and 9 divisions (it is an incredibly fast algorithm to calculate the basis matrix for a spline of degree 4, of which I am very proud). So that is a lot of computation.

After doing some digging, it appears that the package system is implemented as a series of objects/classes (which makes sense), so I assume the slowdown is a result of that. I have the following questions/observations:

1.) I am using 2012b. Are there any improvements or better integration of the package system on the horizon, such that if I take the hit now, the performance impact will be negligible in the future?

2.) The performance impact seems to be the same regardless of package depth, but has more to do with the number of items in that package (i.e. more files = more impact).

3.) Use of a function handle to one of those packaged functions seems to incur that package-reference penalty every time it is called, instead of just at the creation of the function handle.
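The observation in 3.) can be checked with a sketch like the following, which assumes a packaged function `foopack.foo(x,a,b)` exists (as in the answer below, any such small function will do). If the penalty were paid only at handle creation, the handle loop would run noticeably faster than the direct-call loop:

```matlab
% Sketch: does a handle to a packaged function still pay the
% package lookup cost on every call? (foopack.foo is assumed.)
f = @foopack.foo;            % handle created once, outside the loop

tic
for ii = 1:1e5
    y = f(3, 1, 2);          % calls through the handle
end
tHandle = toc;

tic
for ii = 1:1e5
    y = foopack.foo(3, 1, 2);  % direct packaged call
end
tDirect = toc;

fprintf('handle: %fs  direct: %fs\n', tHandle, tDirect);
```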

Please let me know if you have any thoughts about what I might be doing wrong.

Answer by Sean de Wolski
on 18 Apr 2013

Edited by Sean de Wolski
on 18 Apr 2013

There is unfortunately a bit more overhead in the function call when calling packaged functions. Here is the timing I did:

With this function both in and not in a package (+foopack):

function y = foo(x,a,b)
% I create awesome lines!
y = a.*x+b;
end

And this timing function:

function timeit
%Time foo v. foopack.foo calls
%SCd - 735262

%Some values:
[t1, t2] = deal(0);
a = 1;
b = 2;
x = 3;

%Sum their times over 1000 function calls:
for ii = 1:1000
    tic
    y = foo(x,a,b);
    t1 = t1+toc;
    tic
    yp = foopack.foo(x,a,b);
    t2 = t2+toc;
end

%Display results:
fprintf(1,'\nfoo regular: %fs\nfoo package: %fs\n',t1,t2);
fprintf(2,'\nSlowdown of Package: %f\n\n', t2./t1);
end

It is my understanding that this is pretty much the worst-case scenario, since the overhead dominates the computation time. I really like packages and use them a fair amount, but for speed-critical applications, where a function will be called many times, it might pay to pull those computations outside the package. It's also important to realize that even though it's slower relative to the non-packaged call, in terms of total time it's still pretty quick.
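The "pull those computations outside" advice can be sketched like this, again assuming the `foopack.foo` example above (which computes `y = a.*x+b`): in a hot loop, replace the repeated packaged call with the underlying expression, or with a local non-packaged copy of the function.

```matlab
% Sketch: hoisting a tiny packaged computation out of a hot loop.
% foopack.foo(x,a,b) is assumed to compute y = a.*x + b.
a = 1; b = 2; x = 3;

tic
for ii = 1:1e5
    y = foopack.foo(x, a, b);  % pays the package lookup every iteration
end
tPackaged = toc;

tic
for ii = 1:1e5
    y = a.*x + b;              % same computation, no call overhead at all
end
tInline = toc;

fprintf('packaged: %fs  inlined: %fs\n', tPackaged, tInline);
```

The inlined loop measures the computation alone, so the difference between the two timings is roughly the per-call overhead.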

Nicholas Dinsmore
on 19 Apr 2013

Sean,

That begs the question: is there a commitment within MathWorks to reduce that overhead? I am trying to figure out whether that performance hit is a short-term thing or whether it should be reduced in future versions. I can think of many ways that performance could be improved, just from my own work building large OOP systems in MATLAB and then running them through an ODE solver (i.e. where speed is important).

## 2 Comments

per isakson


I fail to reproduce your results with R2012a 64-bit on Windows 7. With a package I see only a very small penalty. Could you provide example code?

Nicholas Dinsmore


I created dumbfunc1 through dumbfunc20, which are empty functions, and smartfunc1, which is a wrapper around nchoosek(100,10), which involves a lot of multiplications.

The test code I ran is:

My results are: