Save "big" object to file fails
Hi,
I'm working on an OOP project. There is an object called "database" that contains a big cell array (nested, with mixed contents).
In this database I store the contents of files. While it held about 2000 files, the object could be saved properly with "save", producing a 20 MB file. But since I added another 1000 files, the saving process stops after some time and leaves only a rudimentary 1 KB .mat file (with no error message or any other indication).
I tried the "pack" command, but then Matlab crashed; I can post the crash log here if desired. I'm using Windows XP SP3 and Matlab 7.5.0 (R2007b), and I have tried saving the file on several file systems (FAT/NTFS).
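For reference, this is roughly what I'm doing, followed by the check that shows the truncated result (file and variable names are placeholders):

% save the object, then check how much actually reached the disk
save('database.mat', 'database');
d = dir('database.mat');
fprintf('Wrote %d bytes\n', d.bytes);   % reports ~1 KB instead of >20 MB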
Is this a common issue? I couldn't find anything similar out there...
Greetings
Answers (5)
Andrea Gentilini
on 7 May 2012
Try going to File -> Preferences -> General -> MAT-Files and select the option "MATLAB Version 7.3 or later". This allows you to save variables that exceed 2 GB.
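If you'd rather not change the global preference, the same format can be requested per call with the '-v7.3' flag (file and variable names here are just examples):

% per-call equivalent of the v7.3 preference
save('database.mat', 'database', '-v7.3');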
Jan
on 22 Nov 2011
If Matlab crashes inside the pack command, you have a serious problem in memory management. Do you use user-defined MEX functions?
BTW: although you can create a database using Matlab, dedicated database programs will do this much better.
Vincent
on 24 Nov 2011
Peter O
on 30 Nov 2011
Hi Vincent, I'm getting the same problem here today. A dataset of around 300 MB won't save, but the 52 MB version _sometimes_ will. R2011a here. I think the issue, for me, is that we have 7 MB 'profile' spaces on the network for programs' temporary work, and the save is hitting that wall. I'll let you know if I find anything.
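In case it helps, here is a quick way to check how much space is actually free at the target location; this is a sketch using the Java runtime that ships with MATLAB, and the path is a placeholder:

% check usable space where the .mat file will be written
f = java.io.File('H:\profile\temp');                       % placeholder path
fprintf('Free: %.1f MB\n', double(f.getUsableSpace()) / 2^20);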
Martin Kahn
on 1 Jul 2018
Hi guys,
Given that this question still gets some views: I just had an issue that sounds very similar (with Matlab 2018a and Windows 10). When trying to save with "save('filename.mat','myFile')", I got only a 1 KB file. I don't really know the details of why, but this fixed it: "save('filename.mat','myFile','-v7.3')". I guess this is what Andrea suggested? Sorry if it's not helpful...
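Spelled out, with the same placeholder names, the fix plus a quick sanity check that the variable really made it into the file:

% force the HDF5-based v7.3 MAT format, which handles large variables
save('filename.mat', 'myFile', '-v7.3');

% verify the write before trusting the file
info = whos('-file', 'filename.mat');
fprintf('%s: %d bytes stored\n', info(1).name, info(1).bytes);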
Riccardo Scorretti
on 23 Sep 2021
Edited: Riccardo Scorretti
on 23 Sep 2021
Hi there.
Unfortunately I'm experiencing the same problem (Matlab 2020b, Linux Fedora F34). As can be observed in the picture below, as soon as serialization is triggered the amount of used memory nearly doubles:
[memory-usage plot: used memory roughly doubles once the save starts]
It looks as if Matlab makes a temporary copy of the data to be saved (with option -v7.3, of course), and in some circumstances this ends in an out-of-memory error.
In my case, I was trying to save the whole workspace, which contains many huge variables. To work around the problem, I suggest saving each huge variable to a separate file, so as to lower the peak temporary memory usage that is apparently required to serialize the data; see the sketch below.
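A minimal sketch of that workaround, assuming it is run as a script in the workspace that holds the variables (output file names are derived from the variable names):

% save every workspace variable to its own -v7.3 file, so the
% serialization overhead peaks at one variable's copy at a time
vars = who;
for k = 1:numel(vars)
    save([vars{k} '.mat'], vars{k}, '-v7.3');
end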