Is the validation set being used for training in a neural network?
I'm using the Neural Network Toolbox and the "divideind" function. I split the whole data set into training and validation sets in order to use the early-stopping criterion:
net.divideFcn = 'divideind';
[net.divideParam.trainInd,net.divideParam.valInd,net.divideParam.testInd] = divideind(10000,1:7000,7001:10000);
The thing is, I thought training was performed with the training set only, with the validation performance computed after each epoch, but I've realised the network seems to be influenced by the whole set (training + validation). I don't know how this happens, and I'd like to change it! I tried:

[net.divideParam.trainInd,net.divideParam.valInd,net.divideParam.testInd] = divideind(10000,1:7000,7001:10000);
[net.divideParam.trainInd,net.divideParam.valInd,net.divideParam.testInd] = divideind(7000,1:7000);

and they give different results (for the same number of epochs). Using the whole set for training, the results are more similar, but still different:

[net.divideParam.trainInd,net.divideParam.valInd,net.divideParam.testInd] = divideind(10000,1:10000);

Because of this, I'm computing the test error separately, to be sure the test set is not being used for training!
Do you know what is happening? Is it possible to do what I need, i.e. use 7000 samples just for training and 3000 just for validation? Thank you!
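In case it helps, this is a sketch of my full setup (feedforwardnet(10), x and t are just placeholders for my actual network and data). I added rng(0) because I suspect the random initial weights could also explain part of the run-to-run differences, but I'm not sure:

```matlab
rng(0);                      % fix the RNG so initial weights are reproducible
net = feedforwardnet(10);    % placeholder: 10-neuron hidden layer

% split by explicit indices: 1:7000 train, 7001:10000 validation, no test set
net.divideFcn = 'divideind';
[net.divideParam.trainInd, ...
 net.divideParam.valInd, ...
 net.divideParam.testInd] = divideind(10000, 1:7000, 7001:10000);

% x: 10000-column input matrix, t: matching targets (placeholders)
[net, tr] = train(net, x, t);

% tr records which samples were actually used in each subset
disp(numel(tr.trainInd));    % expecting 7000
disp(numel(tr.valInd));      % expecting 3000
```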