Example code from the "Handling Large Data Sets Efficiently in MATLAB" webinar, which describes strategies for handling large amounts of data in MATLAB and avoiding "out of memory" errors. It explains the causes of memory limitations in MATLAB, presents a set of techniques to increase the memory available to MATLAB, and shows techniques for minimizing memory usage while accessing, storing, processing, and plotting data.
This information is also available (and kept up to date) in Technical Note 1107.
Specific topics include:
* Understanding the maximum size of an array and the workspace in MATLAB
* Using undocumented features to show the memory available to MATLAB
* Setting the /3GB switch under Windows XP to give MATLAB an extra 1 GB of memory
* Using textscan to read large text files, and the memory-mapping feature (memmapfile) to read large binary files
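The last bullet can be sketched in a few lines. This is a minimal illustration, not the webinar's own code: the file names, the three-column numeric format, and the chunk size are all assumptions made up for the example.

```matlab
% Sketch: read a large text file in fixed-size chunks with textscan,
% assuming a hypothetical file 'bigdata.txt' with three numeric columns.
fid = fopen('bigdata.txt', 'r');
chunkSize = 1e5;                      % rows per chunk (tune to your RAM)
while ~feof(fid)
    C = textscan(fid, '%f %f %f', chunkSize);
    % Process each chunk here (e.g. accumulate running sums) instead of
    % holding the whole file in memory at once.
end
fclose(fid);

% Sketch: memory-map a large binary file with memmapfile, assuming a
% hypothetical file 'bigdata.bin' containing double-precision values.
m = memmapfile('bigdata.bin', 'Format', 'double');
firstThousand = m.Data(1:1000);       % only this slice is paged into RAM
```

The point of both patterns is the same: the operating system or the loop, not MATLAB's workspace, holds the bulk of the data, so the largest contiguous array MATLAB must allocate stays small.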
Please watch the webinar that this code accompanies. The first 40 minutes, as you correctly say, focus on maximizing the memory available to MATLAB. The remaining 25 minutes focus on minimizing MATLAB's memory usage, processing a large data set in a few different ways. By the way, MATLAB's out-of-memory errors are not directly determined by the amount of available RAM.
Okay... the title says: "Handling Large Data Sets Efficiently in MATLAB".
Actually, it just shows how to manage system memory in order to provide more RAM for MATLAB to work with.
Not one word is spent on any actual data set.
Nevertheless, this guide is useful for that purpose.
Good work, wrong topic.
Replace the title with something like "Providing MATLAB with more RAM so it doesn't run out of memory", or similar.
A superb review of memory management for those really big problems. He explains what the limits are, why those limits exist, and what to do if it's possible to do anything. It's well worth your time to read.