MATLAB Answers


Do Matlab structures leak memory?

Asked by Klaus Förger on 18 Jun 2015
Latest activity Commented on by Klaus Förger on 19 Jun 2015
I am currently processing data files that are so large that I can fit only one of them into memory at a time. My problem is that if I use structures for storing the data, I keep getting out-of-memory errors. I managed to reproduce the problem with the following code:
%% This runs ok the first time, but gives an "Out of memory" error on the second run
mult = 200;
a = zeros(mult * 1000, 1000);

%% The following code seems to leak memory
clear a;
for i = 1:mult
    s(i).sa = rand(1000, 1000);
end
s_copy = s;
I was wondering whether anyone can reproduce this problem, or is my computer broken? You might need to adjust 'mult' to a larger value to get the code to reserve enough memory to cause the problem.

  1 Comment

I did some further testing, and it seems that I can replicate the problem only if I run the code as the first thing after starting Matlab. Also, if I comment out the line "s_copy = s", the problem does not appear. And after running the commented version, the original one seems to work fine, and all the memory can be released with the command 'clear'.
Therefore, it seems that creating a fake struct to reserve memory, and clearing it before doing any actual work, might solve my problem. I will need to try this during real work to see whether that is the case. However, it seems strange that the problem can appear at all.
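A minimal sketch of that warm-up workaround (the struct name 'warmup' and the sizes are my own placeholders; the total size would need to match the real data):

% Pre-allocate a throwaway struct array roughly as large as the real data,
% then clear it so the reserved block can be reused for the actual structs.
mult = 200;
for i = 1:mult
    warmup(i).sa = zeros(1000, 1000);   % same field shape as the real data
end
clear warmup;
% ...now load the actual data into structs as usual...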




2 Answers

Answer by B.k Sumedha on 18 Jun 2015

I suggest you try:
File > Preferences > General > Java Heap Memory.
Then you can increase the amount of memory beyond the default value of 196 MB. Hope this helps.


My default heap size was 384 MB; I tried the allowed maximum of 954 MB and restarted Matlab, but that did not help.
I am not getting the 'OutOfMemoryError: Java heap space' error, but a regular out-of-memory error. When I start Matlab, the System Monitor (from Ubuntu) says that it reserves 538 MB. After running the first section of my code, Matlab uses 2.1 GB. If I run the command 'clear' at this point, the reserved memory drops back to 583 MB. Running the second section of the code leaves Matlab again at 2.1 GB, but afterwards the command 'clear' does not lower the amount of reserved memory.
Do you really need that much data to be multiplied?
In the real code, I am not multiplying anything; instead I have the same amount of motion capture data that I load from files.
The point of my example code is to show that the same amount of data can be cleared from memory when it is in an array, but not when it is in a struct.
I just tried this on a second computer that has more memory, and the same problem happens there too. After running the code once, Matlab used 2.0 GB. Then I ran the command 'clear', and Matlab still used 2.0 GB. After this, when I ran the first section again, Matlab used 3.4 GB. That would have been too much for the other computer.


Answer by Philip Borghesani on 18 Jun 2015

This is not a leak, and it has nothing to do with the Java heap. This code is not using the Java heap.
The problem is that you are fragmenting the virtual address space on a 32 bit version of MATLAB. Use the memory command to view the available and largest free memory blocks, along with how much memory MATLAB is using.
The best solution is to use 64 bit MATLAB.
You might take a look at this: Is your memory fragmented?
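For reference, the memory command is Windows-only; a quick sketch of what it reports there (field names are from the documented userview struct):

% memory (Windows only) returns a per-process view and a system view.
[userview, systemview] = memory;
fprintf('Matlab is using %.2f GB\n', userview.MemUsedMATLAB / 2^30);
fprintf('Largest possible array: %.2f GB\n', userview.MaxPossibleArrayBytes / 2^30);

A large gap between total free memory and MaxPossibleArrayBytes is the usual sign of a fragmented address space.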


Thanks for the suggestion, but I do not think that having too few bits is the problem here. The command 'computer' gives me 'ans = GLNXA64', so I do have a 64 bit Matlab. The version is R2015a. I am also running 64 bit Ubuntu 14.10. The fragmentation link seems to be mainly for Windows machines.
I tried the command 'memory', but it replies: 'Function MEMORY is not available on this platform.'
The problem still feels like a fragmentation issue. The best solution should just be to increase the size of your swap partition (you do have one, and it is enabled?). These memory sizes seem small for a 64 bit machine; is this a VM? I ran this code hundreds of times on my machine with no visible leak.
I use a regular installation with no virtual machines. Having enough memory or swap space of course prevents the 'Out of memory' errors. However, it does not really solve the issue that I can end up with several GB of memory that cannot be cleared without restarting Matlab.
I generally try to avoid resorting to swap space, as that really slows everything down. The first time I encountered this issue was not an 'Out of memory' error; instead, the whole computer slowed almost to a halt as swapping started.
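As a side note, swap configuration on Ubuntu can be checked from within Matlab with a shell call (a sketch; the exact output format depends on the distribution):

% List configured swap areas and overall memory/swap usage (Linux).
[status, out] = system('cat /proc/swaps; free -m');
if status == 0
    disp(out);
end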
