"I have a program that is memory intensive and I want to monitor how much memory MATLAB is using so that if it goes above a certain threshold, I can stop the program."
Try setting the MATLAB Workspace preferences: the array size limit might be suitable for use as a control.
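If you want to watch MATLAB's own memory use from within a script, the built-in `memory` function (Windows only) reports it. A minimal sketch, where the 8 GB threshold and the `myapp:memBudget` error identifier are hypothetical choices:

```matlab
% Check MATLAB's memory footprint before attempting something big.
% MEMORY is Windows-only; the threshold below is an arbitrary example.
threshold = 8e9;    % 8 GB budget (hypothetical)
m = memory;         % struct with MemUsedMATLAB, MaxPossibleArrayBytes, etc.
if m.MemUsedMATLAB > threshold
    error('myapp:memBudget', ...
        'MATLAB is using %d bytes, over the %d-byte budget.', ...
        m.MemUsedMATLAB, threshold);
end
```

You could call a check like this at the top of each memory-hungry stage of the program and bail out (or save state and stop) when the budget is exceeded.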
Note that this stops the script before the limit is actually violated. For example, if you tell a script to create an N-by-M array that, given its class, would require say 900 GB of RAM on a system with less than 900 GB (array size limit active, no tall arrays, etc.), MATLAB will throw an error and stop the script without actually creating the array. It will also let an existing array grow incrementally until the limit is threatened by the next instruction to be executed. It will, however, allow the total memory used by MATLAB to exceed that limit. The failed allocation can be wrapped in a try/catch block for elegant resolution, which makes this a functional workaround requiring minimal additional coding.
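A sketch of that try/catch wrapping. The error identifier shown is what recent releases appear to throw when the array size limit trips; treat it as an assumption and check `ME.identifier` on your own release:

```matlab
% With the array size limit enabled, an oversized allocation errors out
% before any memory is committed, so it can be caught cleanly.
try
    A = zeros(1e6, 1e6);   % ~8 TB of doubles -- refused, never allocated
catch ME
    % Identifier is an assumption; verify it on your MATLAB release.
    if strcmp(ME.identifier, 'MATLAB:array:SizeLimitExceeded')
        warning('Allocation refused by the array size limit: %s', ME.message);
        % ...reduce the problem size, save partial results, etc....
    else
        rethrow(ME);       % some other error -- don't swallow it
    end
end
```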
Of course you could code a test for this on your own and insert it before any step that you feel might be a problem.
if M*N <= 50e9 && M < Mlim && N < Nlim
This is not going to resolve a problem with intermediate calculations that would trip the limit,
but at least it would be a check on the final result...
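Fleshed out, such a pre-check might look like the following. `Mlim`, `Nlim`, and the 50e9-element budget are hypothetical limits you would pick for your own system:

```matlab
% Manual guard before a large allocation; all limits here are examples.
M = 2e4;  N = 1e4;          % requested dimensions
Mlim = 1e6;  Nlim = 1e6;    % per-dimension caps (hypothetical)
if M*N <= 50e9 && M < Mlim && N < Nlim
    A = zeros(M, N);        % within budget -- safe to attempt
else
    error('myapp:tooBig', ...
        'Requested %g-by-%g array exceeds the configured budget.', M, N);
end
```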
If you want a check on the intermediate calculations, wrap them in a try/catch ME block and examine the MException that comes back.
...this is enough to put my system in a confused state...it will take the available memory down to 30 MB or so...on a 16 GB i5...
Task Manager (and the processes that use the same counters) will never show the intermediate memory level that triggered the failure, because the RAM required to trigger the failure never actually gets used: MATLAB aborts before allocating it. But if the code would not violate the limit, MATLAB will happily execute it...even if the result is that 99.999% of available RAM gets used...
The command does not fail because the triggering memory never gets used, so it does not crash the system: MATLAB screens the code for the potential problem and aborts before actually executing it. The point is to keep that condition from happening in the first place, but that is not the only condition that is a problem. Any machine state with low responsiveness is a problem unless you are willing to go have a cup of coffee or something.

Before MATLAB was "improved", it would happily create the requested array even if that meant using "disk memory" (from a swap file, the opposite of a RAM disk), and you'd have to reboot your system to clear the RAM and get it to respond. MATLAB wouldn't actually crash the system as long as there was sufficient "disk memory" available to complete the command, but the system was so slow to do anything that you'd want it to crash. That is why the above limit is based on installed physical RAM, not "available RAM".

This still will not resolve the problem completely, since it tracks neither non-MATLAB use of system memory nor "available RAM" (with or without a swap file). And if you have a swap file (or, as Win10 likes to call it, a "paged pool"), then you're toast, because the OS will happily swap application and even OS memory out to disk in order to free up RAM for MATLAB. That is more "robust", but it only changes what has been swapped out by the OS, not the fact that the OS had to resort to a swap-out in the first place.

The ideal solution, of course, is to be aware of how much RAM (code and data) an instruction will require to complete before actually asking MATLAB to run it, and to simply adjust the parameters (and kill off any unnecessary background apps) beforehand so that it will run comfortably in the available memory space. Failing that, double the amount of RAM in your system until this is no longer a problem. If a job needs 6 GB and you only have 4, going to 8 will solve the problem.
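That "be aware of how much RAM an instruction will require" step can itself be coded. A sketch that estimates the bytes a double array would need and compares against what the OS reports as available (again Windows-only via `memory`; the 0.5 safety factor is an arbitrary choice):

```matlab
% Estimate the allocation cost up front instead of letting it fail.
M = 3e4;  N = 3e4;
bytesNeeded = M * N * 8;    % a double is 8 bytes per element
m = memory;                 % Windows-only; MemAvailableAllArrays field
if bytesNeeded < 0.5 * m.MemAvailableAllArrays   % 0.5 = safety margin
    A = zeros(M, N);
else
    fprintf('Need %.1f GB but only %.1f GB available -- reduce M or N.\n', ...
        bytesNeeded/2^30, m.MemAvailableAllArrays/2^30);
end
```

Keeping the request well under the reported available figure (rather than right at it) is what avoids the swap-file thrashing described above.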