Suggestions for dealing with large hyperspectral images in Matlab

Hello everyone,
I am currently facing a new batch of hyperspectral images in Matlab. Usually I have no problems, but this time images of around 2000x2000x324 (double) have flooded me with "Out of memory" messages for all the image processing and statistical analyses I was happily doing before.
My PC is relatively good (Intel Core i7 with 15.8 GB of RAM in total, though only about 7 GB is reported as available), so I am surprised.
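For context, a quick back-of-the-envelope check (using the array dimensions quoted above) shows why one of these cubes cannot fit in roughly 7 GB of free RAM:

```matlab
% Memory footprint of one 2000x2000x324 double-precision cube
sz = [2000 2000 324];
bytesPerDouble = 8;                          % double = 8 bytes per element
gib = prod(sz) * bytesPerDouble / 2^30;      % ~9.66 GiB for a single cube
fprintf('One cube needs about %.2f GiB\n', gib);
```

And that is before any processing: many operations make at least one temporary copy, pushing the peak requirement toward twice that figure.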
I was wondering if you have any suggestions or strategies for dealing with such files, or whether I am missing something here.
It has been suggested that I switch to Python for larger files, but Matlab and I have been in an intimate relationship for a while now.
Thank you very much for your support.
E

1 Comment

Depending on how you are processing the files, there is a chance that you might be able to use tall() arrays.
Each of your images appears to take about 9.6 GB, so you will need to be careful with memory. It would not be uncommon for processing to end up requiring as much memory again (since the data is already double), even if only temporarily. Some of the places where you are vectorizing might need to be rewritten as loops.
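If the loop suggestion fits your workflow, one sketch is to process the cube one band at a time via matfile(), which loads slices lazily instead of reading the whole array. This assumes the cube was saved in a -v7.3 MAT-file (required for partial loading); the file name cube.mat and variable name cube below are placeholders for your own data, and the per-band statistic is just an example:

```matlab
% Sketch: cap peak memory by streaming one band at a time.
% Assumes cube.mat was saved with -v7.3 and contains a variable 'cube'.
m = matfile('cube.mat');                 % lazy handle; nothing loaded yet
[nRows, nCols, nBands] = size(m, 'cube');

bandMeans = zeros(nBands, 1);
for k = 1:nBands
    band = m.cube(:, :, k);              % one 2000x2000 band (~30 MB double)
    band = single(band);                 % optional: halve memory per band
    bandMeans(k) = mean(band(:));        % replace with your own analysis
end
```

Converting to single as early as possible (or at save time) halves the footprint if your analyses tolerate the reduced precision.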


Answers (0)

Asked on 15 Apr 2018
Commented on 15 Apr 2018
