Out of memory during neural network training

I know this is a common problem, but every solution I have tried has failed.
I want to train a large neural network and I keep getting an 'Out of memory' error.
My training set is a 729x3456 matrix of doubles, and the network is a so-called 'autoencoder' with layers of these sizes:
3456 - 4000 - 2000 - 1000 - 300 - 1000 - 2000 - 4000 - 3456
In my code, I first do
net = feedforwardnet([layer1, layer2, layer3, layer4, layer3, layer2, layer1], 'trainscg');
net = configure(net, Dtrain', Dtrain');
where I use 'trainscg' because I read that it is the training function that uses the least memory. Then I initialize the weights and biases to values I have already calculated, set the 'transferFcn', and start training.
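For completeness, the whole setup looks roughly like this ('tansig' is just a placeholder; I set my own transfer function and my precomputed weights and biases where indicated):

layer1 = 4000; layer2 = 2000; layer3 = 1000; layer4 = 300;
net = feedforwardnet([layer1 layer2 layer3 layer4 layer3 layer2 layer1], 'trainscg');
net = configure(net, Dtrain', Dtrain');      % Dtrain is 729x3456, samples in rows

% placeholder: here I assign my precomputed weights/biases and the transfer function
for i = 1:numel(net.layers)
    net.layers{i}.transferFcn = 'tansig';    % example; I use my own choice
end

net = train(net, Dtrain', Dtrain');          % this is where 'Out of memory' occurs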
I have tried cleaning the workspace as much as possible, and I also tried setting
net.efficiency.memoryReduction = 4;
before training, since I read it can help. I still get 'Out of memory', even when I increase the value to 60.
Here is the output of the command 'memory', executed when the workspace contains just the training set and four numbers (the layer sizes):
>> memory
Maximum possible array: 4508 MB (4.727e+09 bytes) *
Memory available for all arrays: 4508 MB (4.727e+09 bytes) *
Memory used by MATLAB: 1927 MB (2.020e+09 bytes)
Physical Memory (RAM): 8080 MB (8.472e+09 bytes)
* Limited by System Memory (physical + swap file) available.
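For scale, counting just the weights in this architecture (ignoring biases and the trainer's working copies):

sizes = [3456 4000 2000 1000 300 1000 2000 4000 3456];
nWeights = sum(sizes(1:end-1) .* sizes(2:end));   % 48,248,000 weights
megabytes = nWeights * 8 / 1e6;                   % ~386 MB per copy, as doubles

Since trainscg keeps several vectors of this size (weights, gradient, search direction, ...), plus the forward and backward signals for all 729 samples, the 4508 MB limit is reached quickly.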
What else can I do to solve the problem?

Accepted Answer

Greg Heath on 20 Apr 2015
You will never be able to solve a problem of that size. I suggest:
1. Using feature extraction to SUBSTANTIALLY reduce the input dimensionality.
2. Using no more than 1 or 2 hidden layers.
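For example, a rough sketch of suggestion 1 using PCA (this assumes the Statistics and Machine Learning Toolbox is available; 300 components is an arbitrary choice):

% Reduce the 3456 inputs to 300 principal components
% (Dtrain is the 729x3456 training matrix, samples in rows)
[coeff, score] = pca(Dtrain, 'NumComponents', 300);
Xtrain = score';                        % 300x729, inputs as columns

% One hidden layer on the reduced features instead of seven
net = feedforwardnet(100, 'trainscg');
net = configure(net, Xtrain, Xtrain);
net = train(net, Xtrain, Xtrain);

This cuts the weight count from tens of millions to tens of thousands (300*100 + 100*300 = 60,000), which fits comfortably in memory.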


