
Thread Subject:
Memory Leak loading many *.mat files.

Subject: Memory Leak loading many *.mat files.

From: amplitude@gmail.com

Date: 22 Apr, 2009 20:51:37

Message: 1 of 6

Hi,

I'm currently trying to track down the source of an apparent memory
leak in some MATLAB code. The MATLAB 2008a (on Linux) process doesn't
increase its memory usage when observed from top, but the total memory
used on the system seems to increase until the script is stopped.
Exiting MATLAB doesn't return all the memory that appears to have
been used during the execution of the MATLAB process.

I'm certainly doing a lot of the sort of things that might be expected
to cause a memory leak (loading files and clearing variables). When I
originally wrote what I'm fairly sure is the offending code (snippet
at the end of this message) I explicitly cleared the variable to try
to prevent such exciting events occurring, but it doesn't seem to
work. To pre-empt the obvious question: all arrays involved are
preallocated to comfortable sizes - they're not being filled and then
growing wildly.

The data variable contains about 14 MB of data in each
<number>.mat file, so it's not huge, but there are a fair few of them
(400). data(n) is a structure containing two arrays. I realise it's
bad form to have a structure of arrays, but that's how the cookie has
crumbled here.

If there's no obvious solution that I've missed, how might I go about
looking for the leak? As I mentioned, I'm on Linux, so I'm not able to
use feature('memstats').

Thanks,

Alex Rea


========================
for fileid = 1:400

    load(sprintf('./%d.mat', fileid), 'data')  % function form; no eval needed

    for j = 1:size(data, 2)
        % Analyse each element of data and write result to array
    end

    clear data

end
=========================
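
As an aside, load can also return its contents as a struct rather than creating variables directly in the workspace, which makes it easier to confirm exactly what is held between iterations. A minimal variant of the loop above (untested against the original data; the field name data comes from the snippet):

========================
for fileid = 1:400
    S = load(sprintf('./%d.mat', fileid), 'data');  % S.data holds the file's contents
    for j = 1:size(S.data, 2)
        % Analyse each element of S.data and write the result to a preallocated array
    end
    clear S   % release the struct before the next file
end
=========================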

Subject: Memory Leak loading many *.mat files.

From: chairmanK

Date: 26 Mar, 2010 21:43:04

Message: 2 of 6

I have a similar problem. I run a script that sequentially loads, modifies, saves, and clears large struct arrays from *.mat files that are ~100 MB each. MATLAB steadily increases its memory use until the computer hangs, even though the total size of the variables in the workspace remains bounded. I would like to know how to diagnose the problem.

Subject: Memory Leak loading many *.mat files.

From: Eric

Date: 10 May, 2010 21:41:05

Message: 3 of 6

"chairmanK " <chairmanK_atgmaildotcom@foobar.org> wrote in message <hoj9p8$5o2$1@fred.mathworks.com>...
> I have a similar problem.

Another Bump! for this problem.

I read a large binary file, do some signal processing, save plots as .png and .fig, and save the results in a .mat for later inspection.

As I loop over files, the memory of the MATLAB process (as shown by top) increases apparently without bound, until exceptions start to occur.

It sure seems like a memory leak. I have tried to verify that file handles are all closed, etc., and they appear to be. Are there any tools with which to diagnose such a problem?

My version information is below.

Eric

>> ver
-------------------------------------------------------------------------------------
MATLAB Version 7.9.0.529 (R2009b)
MATLAB License Number: 541556
Operating System: Linux 2.6.24-27-generic #1 SMP Fri Mar 12 01:10:31 UTC 2010 i686
Java VM Version: Java 1.6.0_12-b04 with Sun Microsystems Inc. Java HotSpot(TM) Client VM mixed mode
-------------------------------------------------------------------------------------
MATLAB Version 7.9 (R2009b)
Simulink Version 7.4 (R2009b)
Communications Toolbox Version 4.4 (R2009b)
Fixed-Point Toolbox Version 3.0 (R2009b)
Signal Processing Blockset Version 6.10 (R2009b)
Signal Processing Toolbox Version 6.12 (R2009b)
Simulink Fixed Point
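
Since feature('memstats') is unavailable on Linux, one way to watch the process from inside MATLAB is to read /proc/self/status, which reports the resident set size. A minimal Linux-only sketch (the helper name here is just illustrative):

========================
function rss_kb = vmrss_kb()
% VMRSS_KB  Resident set size of this MATLAB process in kB (Linux only).
    txt = fileread('/proc/self/status');
    tok = regexp(txt, 'VmRSS:\s*(\d+)\s*kB', 'tokens', 'once');
    rss_kb = str2double(tok{1});
end
=========================

Called once per loop iteration, a steadily climbing value alongside a flat whos total would point at memory held outside the workspace.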

Subject: Memory Leak loading many *.mat files.

From: omegayen

Date: 11 May, 2010 15:54:05

Message: 4 of 6

"Eric" <eric.nospam@brnphoenix.nospam.com> wrote in message <hs9uhh$fhf$1@fred.mathworks.com>...
> Are there any tools with which to diagnose such a problem?

The only way I know to diagnose this problem would be to use feature('memstats'), as the original poster indicated, but you can't use that on Linux.

You could try loading and saving .txt files instead of .mat files to see if there is any difference.

To save and load to .txt, you can refer to my post here: http://www.mathworks.com/matlabcentral/newsreader/view_thread/274020#739916
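
For plain numeric arrays, the built-in -ascii option is enough to run that experiment; a minimal sketch (the variable names are illustrative, and note that -ascii only handles numeric matrices, not structs):

========================
x = rand(1000, 2);                 % some numeric data
save('data.txt', 'x', '-ascii')    % write as plain text
y = load('data.txt', '-ascii');    % read it back as a matrix
=========================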

Subject: Memory Leak loading many *.mat files.

From: Massimiliano Salsi

Date: 4 Jan, 2011 17:10:22

Message: 5 of 6

"omegayen " <omegayen@ameritech.net> wrote in message <hsbuit$6iu$1@fred.mathworks.com>...
> you could try loading and saving .txt files instead of .mat files to see if there is any difference


Hello everybody, I have a similar problem.
I load binary files, process them, and store some results in either a .txt or a .mat file at the end of the process.
My MATLAB process's memory usage keeps increasing with the number of processed files, and clear, clear all, clear functions, pack, close all, etc. cannot reduce it, so I end up with a >1 GB MATLAB process when it is "empty".
Since we plan to process a larger number of files with many parallel MATLAB instances on a multi-core system, this leakage will cause out-of-memory kills everywhere.
We have observed this behaviour on MATLAB R2008b and R2009a on 32-bit Linux and on 64-bit Linux (with 64-bit MATLAB); in particular, the leakage tends to be larger on 64-bit machines.
This is a big issue for us. Help needed!

Massimiliano

Subject: Memory Leak loading many *.mat files.

From: Daniel Garcia

Date: 18 Jan, 2011 05:44:05

Message: 6 of 6

Same problem here, and in my case I'm not even reading anything -
just trying to save my generated data in a loop.

A simplified version is:
for i = 1:n
    % Generate output
    % Save output to file
    clear output
end

At every iteration the output variable is cleared, so memory use should remain constant.
As a matter of fact, that's what happens if I remove the "save" command. Yet for some reason, if I try to save my work, MATLAB doesn't like it, and all the memory taken by the "save" command is never given back until I restart MATLAB.

Needless to say, my loop ends with an "out of memory" error no matter what I do.
I've tried everything suggested on as many forums as I could find, and added some ideas of my own, like moving the "save" command into a separate routine and then using "clear all", etc.
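
Two workarounds that are sometimes suggested for save-related memory growth - neither guaranteed to help here - are writing uncompressed v6-format files (the default v7 format compresses the data, which needs extra buffering during the save) and pushing the save into a function so that its temporaries go out of scope when it returns. A minimal sketch combining both:

========================
function save_output(fname, output)
% SAVE_OUTPUT  Save one result uncompressed; temporaries die with the function.
    save(fname, 'output', '-v6')   % -v6 skips the compression used by -v7
end
=========================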
