I recently inherited a set of Matlab scripts that analyze some fairly large datasets. They were written on a Windows 10, Core i5, x64 system with 8GB of RAM using Matlab R2016b, and ran to completion without issue. After setting up a new system (Windows 10, Core i7, x64, 16GB RAM), I installed Matlab R2017a and ran into "out of memory" errors when running the same set of scripts that had worked fine on the older system.
Specifically, both systems cruise along for most of the script with Matlab using ~4GB of memory. At a specific point in the script that requires additional resources, the "old" R2016b system manages to allocate an additional ~2GB (thereby using ~6GB of its 8GB total), while the new R2017a system (with 16GB total, and a full 8GB remaining either free or in standby) throws an "out of memory" error instead. The dataset is identical in both cases.
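In case it helps with diagnosis, this is the sort of probe I can drop in just before the failing allocation (a minimal sketch; `memory` is a Windows-only function, and I'm only reading its documented fields):

```matlab
% Snapshot Matlab's view of memory immediately before the large allocation.
[userview, systemview] = memory;  % Windows-only diagnostic function
fprintf('Release:                  R%s\n', version('-release'));
fprintf('Memory used by Matlab:    %.2f GB\n', userview.MemUsedMATLAB / 2^30);
fprintf('Available for all arrays: %.2f GB\n', userview.MemAvailableAllArrays / 2^30);
fprintf('Max possible array:       %.2f GB\n', userview.MaxPossibleArrayBytes / 2^30);
fprintf('Physical memory free:     %.2f GB\n', systemview.PhysicalMemory.Available / 2^30);
```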
The behaviour is the same after a fresh Windows boot, with no other applications running. Importantly, I have now also installed R2016b on the new machine, and the scripts run to completion there without issue, so it appears to be the Matlab version, not the new hardware or some Windows misconfiguration, that is causing the problem.
While I am well aware of the benefits of optimizing memory management, particularly with large datasets, that is not the question I need answered at the moment (I will dig into optimization down the road regardless). The specific issue I would like to resolve is whether Matlab R2017a handles memory allocation differently from R2016b, and whether there is some configuration option I might be unaware of that could account for the difference. I could stay on R2016b for a while (or keep multiple versions installed), but I'd rather address the cause of the problem than work around it.
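Since I now have both versions on the same machine, one comparison I can run is to log the same memory stats from each install and diff the results; a minimal sketch of what I have in mind (the output file name is just a placeholder):

```matlab
% Run once in R2016b and once in R2017a on the same machine, then compare the files.
[userview, ~] = memory;
fid = fopen(sprintf('memprobe_R%s.txt', version('-release')), 'w');
fprintf(fid, 'Release:               R%s\n', version('-release'));
fprintf(fid, 'MaxPossibleArrayBytes: %d\n', userview.MaxPossibleArrayBytes);
fprintf(fid, 'MemAvailableAllArrays: %d\n', userview.MemAvailableAllArrays);
fprintf(fid, 'MemUsedMATLAB:         %d\n', userview.MemUsedMATLAB);
fclose(fid);
```

If MaxPossibleArrayBytes comes back noticeably smaller under R2017a, that would at least narrow this down to how the newer version sizes its allocations rather than anything about the dataset or the machine.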
Thank you to anyone who might have some ideas on this.