Matlab slowing down on linprog / parfor memory consumption

I am running some simulations using Matlab 2015b and have two problems. I want to run 10 simulations with 10 different random number seeds; since the workload is otherwise identical, each simulation should take approximately the same amount of time. The first problem is that the simulations slow down: the first one takes about an hour, whereas the fourth one takes 10 hours or so (the 10th one would probably take days or weeks, but I pressed Ctrl+C before it got that far). I have made sure this is not due to sleep mode or anything similar.
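For reference, the calling script looks roughly like this (a simplified sketch; runSimulation and the variable names are placeholders, not the real code):

seeds = 1:10;                              % one simulation per random number seed
results = cell(numel(seeds), 1);
for k = 1:numel(seeds)
    rng(seeds(k));                         % fix the seed so each run is reproducible
    tStart = tic;
    results{k} = runSimulation(seeds(k));  % placeholder for the actual main function
    fprintf('Simulation %d finished in %.1f s\n', k, toc(tStart));
end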
I used the profiler to find out what the reason might be. I let the profiler run overnight, during which only three simulations completed. The profiler claims the most computationally expensive step is the function optim\private\simplexphasetwo (I assume this is called by linprog). My linprog calls use the simplex algorithm. This usually throws a warning that the option will be removed in a future release, but I want to keep using this algorithm because the other algorithms do not pass some unit tests we have in our code. I deal with these warnings using:
pctRunOnAll warning('off','optim:linprog:AlgOptsConflict')
pctRunOnAll warning('off','optim:linprog:AlgOptsWillError')
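In case it matters, this is roughly how the simplex algorithm is requested (a minimal sketch with toy data; the option setup and the problem data below are simplified stand-ins, the real constraints come from the simulation):

opts = optimoptions('linprog', 'Algorithm', 'simplex', 'Display', 'off');
f  = [-1; -2];               % toy cost vector, not the real problem
A  = [1 1; 2 1];             % toy inequality constraints
b  = [4; 5];
lb = [0; 0];
[x, fval, exitflag] = linprog(f, A, b, [], [], lb, [], [], opts);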
linprog is called from within a parfor loop in some places and sequentially in others, depending on where in the code it is needed. The odd thing is that, as far as I can tell, the slowdown occurs in the sequential parts and not in the parallel parts. The profiler claims that the most expensive line is a "return" on line 409 of simplexphasetwo; I imagine this has to do with clearing the variables used by that function. It is worth noting that I do not see a large increase in memory usage: the total amount of used memory stays below 10%. Furthermore, I do not plot anything in my code and I pre-allocate all large arrays that I use, so these cannot be the reason for the slowdown either.
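To illustrate the two call patterns (again a simplified sketch with the same toy LP; the real problems are built inside the simulation):

opts = optimoptions('linprog', 'Algorithm', 'simplex', 'Display', 'off');
solveToyLP = @(o) linprog([-1; -2], [1 1; 2 1], [4; 5], [], [], [0; 0], [], [], o);
nProblems = 4;                      % placeholder problem count
xSeq = cell(nProblems, 1);
xPar = cell(nProblems, 1);
for i = 1:nProblems                 % sequential part (this is where it seems to slow down)
    xSeq{i} = solveToyLP(opts);
end
parfor i = 1:nProblems              % parallel part (does not seem affected)
    xPar{i} = solveToyLP(opts);
end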
If I press Ctrl+C and restart the current simulation, the program runs fast again and I get identical results. This is the most confusing part. I can improve the runtime simply by starting each simulation individually, but I want to be able to run many such simulations overnight or over the weekend without having to start each one manually. Can you please suggest how to solve this?
The second problem I have is related to the parfor loop. The parfor loop is inside some function x, which is called by the main function inside a while-loop over the simulation time; the main function is in turn called from the calling script with the for loop over the different random number seeds. What I saw some time ago is that the memory requirements increase significantly during a single simulation, to the point where Matlab crashes because it runs out of memory. I assume this is because the data needed for the parfor loop is passed to the 6 workers every time but is somehow not released after the parfor loop ends. Since this loop is called maybe 3000 times during one simulation, it eventually eats up the memory. To deal with this, I close and restart the parallel pool every 100 runs or so (sketched below). This has solved the memory problem, but it feels like a workaround, since restarting the parallel pool also takes time. Is there a better solution?
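The workaround looks roughly like this (a simplified sketch; functionX, t, tEnd, dt and state are placeholders for the real code):

callCount = 0;
while t < tEnd                       % main simulation loop (placeholder condition)
    callCount = callCount + 1;
    if mod(callCount, 100) == 0
        delete(gcp('nocreate'));     % shut down the current pool ...
        parpool(6);                  % ... and start a fresh one with 6 workers
    end
    state = functionX(state);        % functionX contains the parfor loop
    t = t + dt;
end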
  1 Comment
Susan Sun on 8 Feb 2018
I have the same problem as your problem #1 (though I use intlinprog, and the profiler says nearly all my time is spent in IntlinprogBranchAndCut>IntlinprogBranchAndCut.run). I feel I must be doing something daft, because I'm still learning to do linear programming, but if it's happening to you too... very puzzling!


Answers (0)
