I have been trying to write code that saves thousands of Excel files containing data analysed from CSV files. I write the data to a base Excel file, run several macros to make charts etc., and then save the file. I then make a copy of the file (which I store away) and clear the data from the base file by running a macro called 'cleanup', so I'm ready to repeat these steps with the next set of data. I do it this way to avoid repeatedly opening and closing connections, which should make things run faster.

However, there appears to be a memory leak: things start off fast, but the code eventually slows to a crawl as Excel grows to hog well over a GB of memory. Is there a way to stop the memory build-up in Excel? There should only ever be the one connection, so I don't understand how it uses an increasing amount of memory. (Edit: the individual files made are fairly small, ~1 MB.)
The initial connection is set up as:
e = actxserver('Excel.Application');          % one Excel COM server for the whole run
set(e, 'DisplayAlerts', 0);                   % suppress Excel's save/overwrite prompts
set(e, 'Visible', 0);                         % keep the Excel window hidden
invoke(e.Workbooks, 'Open', file_template);   % open the base template workbook
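For reference, the matching teardown at the end of the whole run would be the standard actxserver cleanup (a minimal sketch, not part of my code above):

e.ActiveWorkbook.Close(false);   % close the template without saving changes
Quit(e);                         % shut down the Excel instance
delete(e);                       % release the COM handle on the MATLAB side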
and for every file the following is called:
function excelwrite(file, data, macro, macroName, e)
% Write data to the open workbook, optionally run macros, then save,
% copy the result to 'file', and clear the workbook for the next run.
[m, n] = size(data);
range = '';
range = calcrange(range, m, n);               % calcrange is borrowed from 'xlswrite'
Select(Range(e, sprintf('%s', range)));
set(e.Selection, 'Value', data);
% If macro is set to 1, run the macros listed in the char array 'macroName'.
if macro == 1
    for i = 1:size(macroName, 1)
        e.Run(macroName(i, :));
    end
end
% Save, make a copy, and delete the data from the active workbook.
e.ActiveWorkbook.Save;
copyfile(e.ActiveWorkbook.FullName, file);
e.Run('cleanup');
end
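For context, the per-file driver amounts to something like this (a sketch only; csv_files, analyse_csv and the macro names are placeholders, not my actual code):

macroNames = char('makeCharts', 'formatSheet');    % placeholder macro names as a padded char array
for k = 1:numel(csv_files)
    data = analyse_csv(csv_files{k});              % placeholder for the CSV analysis step
    file_out = sprintf('result_%04d.xlsx', k);     % where the finished copy is stored
    excelwrite(file_out, data, 1, macroNames, e);  % write, run macros, save, copy, cleanup
end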