Save "big" object to file fails

Vincent on 22 Nov 2011
Edited: Riccardo Scorretti on 23 Sep 2021
Hi,
I'm working on a project with OOP. There is an object called "database" containing a "big" cell array (nested, with mixed contents).
In this database, I store some file contents. Until now, with about 2000 files in the database, the object could be saved properly with "save", producing a 20 MB file. But after I added another 1000 files, the saving process stops after some time and produces a rudimentary 1 KB .mat file (no error message or anything else).
I tried the "pack" command, but then Matlab crashed; I can post the log here if desired. I'm using Windows XP SP3 and Matlab 7.5.0 (R2007b), and I have tried saving the file on several file systems (FAT/NTFS).
Is this a common issue? I couldn't find anything similar out there...
Greetings

Answers (5)

Andrea Gentilini on 7 May 2012
Try going to File -> Preferences -> General -> MAT-Files and selecting the option "MATLAB Version 7.3 or later". This allows you to save variables that are in excess of 2 GB.
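For a single call, the same effect can be achieved programmatically with the '-v7.3' flag; a minimal sketch, where the file and variable names are placeholders:

% One-off alternative to changing the preference globally
% ('database.mat' and 'database' are placeholder names):
save('database.mat', 'database', '-v7.3');

The v7.3 format is HDF5-based and is the only MAT-file format that supports variables larger than 2 GB.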
Vincent on 10 May 2012
I am sorry Andrea, but this does not help. Just for clarification: files > 20 MB can be stored as long as they do not contain an object.



Jan on 22 Nov 2011
If Matlab crashes inside the pack command, you have a serious problem with memory management. Do you use any user-defined MEX functions?
BTW: although you can create a database using Matlab, dedicated database programs will do this much better.
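Since the symptoms point at memory exhaustion, a quick check of how much contiguous memory Matlab can still allocate may help; a small sketch, assuming a Windows platform (the memory command is Windows-only):

% Windows-only: report the largest array MATLAB could allocate right now.
[userview, systemview] = memory;
fprintf('Largest possible array: %.0f MB\n', ...
    userview.MaxPossibleArrayBytes / 2^20);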

Vincent on 22 Nov 2011
No, I haven't used any user-defined MEX functions so far. I only have two instances of Matlab running at the same time, but I can't imagine that being a problem...
And yup, I know of Access, SQL and so on. I'd really like to use them, but the people around me prefer Matlab ;) Thanks anyway for your hint.

Vincent on 24 Nov 2011
Hi there again. I tried to run the same thing on a newer Matlab version (R2011b) and got the following error message: "Out of memory error during serialization of subsystem data" (or similar).
Does this help anyone find a solution for how I can save my object?
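One hedged workaround sketch, assuming R2011b or later: the matfile interface writes each assignment to a v7.3 file as it is made, which may lower the peak memory compared with serializing everything in one save call (file and variable names are placeholders):

% Hedged sketch (requires R2011b+): write through a matfile handle.
% 'database' stands in for the actual object variable.
m = matfile('database.mat', 'Writable', true);
m.database = database;   % the assignment is written to disk directly
clear m                  % release the file handle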
Peter O on 30 Nov 2011
Hi Vincent, I'm getting the same problem here today. A roughly 300 MB dataset won't save, but the 52 MB version _sometimes_ will. R2011a here. I think the issue, for me, is that we have 7 MB 'profile' spaces on the network for temporary program work, and saving is hitting that wall. I'll let you know if I find anything.
Peter O on 30 Nov 2011
Oh, and pack doesn't crash, but it gives the same out-of-memory error.



Martin Kahn on 1 Jul 2018
Hi guys,
Given that this question still gets some views: I just had an issue that sounds very similar (with Matlab R2018a and Windows 10). When trying to save with "save('filename.mat','myFile')", I just got a 1 KB file. I don't really know the details of why, but this fixed it: "save('filename.mat','myFile','-v7.3')". I guess this is what Andrea suggested? Sorry if it's not helpful...
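To catch this kind of silent failure, the contents of the resulting file can be checked after saving; a small sketch reusing Martin's placeholder names:

% Sanity check: list what actually ended up inside the .mat file
% instead of trusting its size on disk.
save('filename.mat', 'myFile', '-v7.3');
whos('-file', 'filename.mat')   % variable names, sizes and classes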
Riccardo Scorretti on 23 Sep 2021
Edited: Riccardo Scorretti on 23 Sep 2021
Hi there.
Unfortunately I'm experiencing the same problem (Matlab R2020b, Linux Fedora 34). As can be observed in the memory-usage plot attached to the post, as soon as serialization is triggered the amount of used memory nearly doubles.
It looks as if Matlab makes a temporary copy of the data to be saved (with the -v7.3 option, of course), and in some circumstances this ends in an out-of-memory error.
In my case, I was trying to save the whole workspace, which contains many huge variables. I suggest overcoming the problem by saving each huge variable separately in its own file, so as to lower the peak temporary memory usage that is apparently required to serialize the data; see the sketch below.
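A minimal sketch of that suggestion, run from the workspace to be saved; the output folder name is a placeholder:

% Save each workspace variable to its own -v7.3 file, so only one
% variable is serialized at a time ('split_mat' is a placeholder).
if ~exist('split_mat', 'dir'), mkdir('split_mat'); end
vars = who;
for k = 1:numel(vars)
    save(fullfile('split_mat', [vars{k} '.mat']), vars{k}, '-v7.3');
end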

