Why does the "trainbr" function not require a validation dataset?
I read that the "trainbr" function in MATLAB (Bayesian-regularization backpropagation for neural network training) does not require a validation dataset, and that in MATLAB's implementation of this algorithm validation stops are disabled by default. Could you explain in some detail why validation is not necessary when using this training function?
Greg Heath on 16 Jun 2018 (edited 16 Jun 2018)
OVERFITTING + OVERTRAINING combine to form an ugly monster that prevents nets from performing well on nontraining data: training becomes so precise that performance is excellent on the training data at the expense of poor performance on nontraining data. Typically, training runs long and the weights grow very large in order to achieve such precision.
Since the net is usually designed to work well on unseen data, several techniques have been invented to prevent overtraining an overfit net. These techniques fall under the term
GENERALIZATION
Minimize the training error subject to one of the following constraints:
1. NONOVERFITTING:
Minimize the number of weights used.
2. VALIDATION STOPPING (AKA EARLY STOPPING) MATLAB DEFAULT:
Minimize the training-subset error until it converges OR until the error on the validation subset has increased for a specified number of consecutive epochs.
3. BAYESIAN REGULARIZATION (MATLAB'S TRAINBR):
Minimize the sum of squared errors plus either
a. a weighted sum of squared weights, or
b. a weighted sum of absolute weight values.
Because the weight penalty itself keeps the weights small and suppresses overfitting during training, no validation subset is needed to detect overfitting and decide when to stop. That is why trainbr disables validation stopping by default.
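As a concrete illustration, here is a minimal MATLAB sketch (assuming the Deep Learning Toolbox is installed; simplefit_dataset is one of the toolbox's bundled demo datasets) showing trainbr trained on the full dataset with no validation split. Internally, trainbr minimizes a regularized performance index of roughly the form F = beta*SSE + alpha*SSW, where SSE is the sum of squared errors, SSW the sum of squared weights, and alpha/beta are estimated automatically during training:

```matlab
% Sketch: Bayesian-regularization training without a validation split.
[x, t] = simplefit_dataset;        % demo regression data from the toolbox

net = fitnet(10, 'trainbr');       % 10 hidden neurons, trainbr training
net.divideFcn = 'dividetrain';     % assign ALL samples to training:
                                   % no validation or test subsets needed

net = train(net, x, t);            % the weight penalty in trainbr's
                                   % objective controls overfitting

y    = net(x);                     % network outputs on the training data
perf = perform(net, t, y);         % mean squared error
```

With trainlm or other default training functions, removing the validation split this way would invite overtraining; with trainbr it is the intended usage.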
Hope this helps.
Thank you for formally accepting my answer
Greg