How to avoid memory problems while processing a huge table?

I have a huge observation table with around 3 million (30 lakh) rows and 12 columns. While training a KNN classifier in R2016a, I get memory-related errors. Is there any way to avoid this? I have tried reducing the number of rows, but it degrades the output quality.
Each row in the table is a pixel, and the columns hold its feature values. One set of MRI scans contains around 20 images of 512x512, and I load one set to create the observation table. Is there another way to pass a large amount of data to the KNN classifier?
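For reference, a rough size estimate of the table (a sketch, assuming double-precision features):

```matlab
% Rough memory estimate for the observation table (assumes doubles).
nRows = 3e6;                    % ~30 lakh pixels, one per row
nCols = 12;                     % feature columns
bytes = nRows * nCols * 8;      % 8 bytes per double
fprintf('Table data alone: ~%.2f GB\n', bytes / 2^30);
% fitcknn stores its own copy of the training data inside the model,
% so peak usage during training is at least double this, and prediction
% can allocate large distance matrices on top of it.
```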

Answers (1)

KSSV on 31 Aug 2016
See doc datastore, doc memmapfile, and doc mapreduce; all three let you work on data too large to hold in memory at once.
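For example, a minimal sketch of chunk-wise reading with a datastore, assuming the observation table has been exported to a hypothetical observations.csv:

```matlab
% Process a large table in fixed-size chunks instead of loading it whole.
ds = datastore('observations.csv');   % returns a TabularTextDatastore
ds.ReadSize = 200000;                 % rows returned per call to read()
while hasdata(ds)
    t = read(ds);                     % one chunk, as an ordinary table
    % ... compute features / statistics on this chunk here ...
end
```

For raw numeric data, memmapfile offers a similar pattern: map the binary file once, then index into its Data field to pull only the rows you need into RAM.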
  1 Comment
Nitinkumar Ambekar on 1 Sep 2016
Thanks @Dr. Siva. One small query: can I pass one of these to a function that takes a `table` or `matrix`?
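Not directly: a datastore is itself neither a table nor a matrix, but read (or readall, if the whole file fits in memory) returns an ordinary table that such functions accept. A sketch, reusing the hypothetical observations.csv above with a hypothetical Label column:

```matlab
ds = datastore('observations.csv');      % hypothetical CSV export
ds.ReadSize = 200000;
t = read(ds);                            % ordinary in-memory table
X = table2array(t(:, 1:end-1));          % numeric feature matrix
y = t.Label;                             % 'Label' is a hypothetical column name
mdl = fitcknn(X, y, 'NumNeighbors', 5);  % standard in-memory training
```

Note that fitcknn still needs its training data in memory, so chunked reading mainly helps with preprocessing and feature extraction; fitting on the full data out of memory would need mapreduce or a subsampling scheme.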

