MATLAB Answers

Using load/save on a network drive at full speed.

Andrew Metcalf on 9 Sep 2016
Answered: Douglas on 3 Apr 2017
I have a network drive on which I store my data and .mat files. Using Windows Explorer, I can move and copy files at nearly the full 1 Gbps connection speed, typically getting about 950 Mbps throughput. In MATLAB R2016a, when I navigate to this same folder and double-click a .mat file to load it (essentially running load('filename.mat')), Task Manager shows that I am only getting about 130 Mbps throughput. Does anybody have ideas for how to get MATLAB to utilize the full speed of my connection?
Note that for some files I actually do get full speed, so I'm trying to figure out what might be different about those files (e.g., compressed vs. uncompressed), but I can't find any consistent correlation yet. Another thought was that Windows might be limiting network access for the MATLAB process, but when I look at advanced network stats with a program called NetBalancer, the file loading from the network drive is actually routed through the "System" process and is not accessed directly by MATLAB.
Thanks for any help and ideas.


James Tursa on 9 Sep 2016
What about a 2-step process ... copy file to local drive (e.g., C:) and then load? Is that combo faster overall?
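The two-step approach described here can be sketched as follows (the network path and file name are hypothetical examples):

```matlab
% Two-step load: copy the .mat file to a fast local disk first, then load.
src = '\\server\share\experiment1.mat';      % hypothetical network path
tmp = fullfile(tempdir, 'experiment1.mat');

copyfile(src, tmp);     % network transfer, at Explorer-like speed
S = load(tmp);          % local load, no network in the loop
delete(tmp);            % clean up the temporary copy
```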
Andrew Metcalf on 13 Sep 2016
OK, I've done some more investigating, including this suggestion of copying to a local drive and then loading. Overall there is little to no increase in speed (I didn't do a rigorous tic/toc comparison; I'm going by observation).
My current thinking is that the slowness is caused by the compression in the v7.3 .mat file format. Apparently, decompression is a single-threaded process. Because my network connection is fast enough to keep the CPU busy, the extra speed of a local SSD makes no difference in load time. If I happen to load a v6 .mat file across the network, I can see that the file is read at 900+ Mbps.
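One way to test this hypothesis (the file names and array size below are arbitrary) is to save the same data in both formats and time the loads:

```matlab
% Save identical data in v6 (uncompressed) and v7.3 (compressed, HDF5-based)
% formats, then time how long each takes to load back.
data = rand(5000);                        % ~200 MB of doubles

save('test_v6.mat',  'data', '-v6');
save('test_v73.mat', 'data', '-v7.3');

tic; load('test_v6.mat');  t6  = toc;
tic; load('test_v73.mat'); t73 = toc;
fprintf('v6 load: %.2f s   v7.3 load: %.2f s\n', t6, t73);
```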
So, any ideas from the community on how to deal with this? What are some best practices that some of you have come up with?
I have already written my own "save" function that attempts to save my variables as a v6, then v7, then v7.3 .mat file, so that it defaults to uncompressed data whenever it can. Beyond that, I would love to be able to deal with the v7.3 files in a much better way.
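For reference, such a fallback saver might look like this. This is a sketch, not the poster's actual code; the function name and the use of lastwarn to detect variables skipped by the older formats are my own assumptions:

```matlab
function save_fastest(filename, varargin)
% SAVE_FASTEST  Hypothetical fallback saver: try -v6 (uncompressed), then
% -v7, then -v7.3, keeping the first format that saves every variable
% without a "variable skipped" warning.
    vars = struct();
    for k = 1:numel(varargin)
        vars.(varargin{k}) = evalin('caller', varargin{k});
    end
    versions = {'-v6', '-v7', '-v7.3'};
    for v = 1:numel(versions)
        lastwarn('');                                 % clear warning state
        save(filename, '-struct', 'vars', versions{v});
        if isempty(lastwarn)                          % nothing was skipped
            return
        end
    end
end
```

Called as save_fastest('run1.mat', 'data', 'meta'). The lastwarn check matters because save with -v6 only warns (and skips the variable) when something exceeds that format's 2 GB limit, rather than erroring out.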
The best idea I can think of right now is to determine which of my variables are too big for an uncompressed file and save those separately, in the hope that I won't need to access them much in the future. Then (hopefully) the parts that I do continually load for further analysis can stay in an uncompressed format. The problem: how do I properly keep track of the growing number of .mat files per experiment? I would love to hear suggestions about this.
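One possible bookkeeping scheme (the function name, file suffixes, and size threshold below are all my own assumptions): give each experiment one base name and derive exactly two files from it, a small uncompressed "summary" file and a large v7.3 "raw" file, so the file count per experiment stays fixed:

```matlab
function save_split(basename, vars)
% SAVE_SPLIT  Hypothetical helper: small variables go to an uncompressed
% v6 summary file, large ones to a compressed v7.3 raw file.
% vars is a struct of variables, e.g. vars.signal = ..., vars.meta = ...
    thresholdBytes = 1e9;               % assumed cutoff; tune to taste
    names = fieldnames(vars);
    small = struct(); big = struct();
    for k = 1:numel(names)
        tmp = vars.(names{k});          %#ok<NASGU> (inspected via whos)
        info = whos('tmp');
        if info.bytes < thresholdBytes
            small.(names{k}) = vars.(names{k});
        else
            big.(names{k}) = vars.(names{k});
        end
    end
    save([basename '_summary.mat'], '-struct', 'small', '-v6');
    if ~isempty(fieldnames(big))
        save([basename '_raw.mat'], '-struct', 'big', '-v7.3');
    end
end
```

With this scheme, routine analysis only ever touches basename_summary.mat, and the slow v7.3 file is loaded just when the raw data is actually needed.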
Also, could MathWorks work on multi-threading the load command?


Answers (1)

Douglas on 3 Apr 2017
Hi everybody. I'm having the same problem. While saving ~2.4 GB .mat files with -v7.3 to a shared folder (using NFS to share the folder between machines), I achieve only 0.5 MB/s on a gigabit network. Does anybody know if the MATLAB team has released a workaround?
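One thing worth trying, assuming R2017a or newer: that release added a '-nocompression' option for v7.3 files, which skips the deflate step while keeping the HDF5-based format (the variable name and size below are just an example):

```matlab
% Requires R2017a+: write a v7.3 file without compression, trading
% larger files on disk for much less CPU time during save and load.
bigData = rand(10000, 5000);    % example variable, ~400 MB
save('results.mat', 'bigData', '-v7.3', '-nocompression');
```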

