what is EPOCH in neural network

226 views (last 30 days)
Charu on 8 Feb 2013
Commented: Greg Heath on 12 Aug 2018
Hi, what is the definition of an epoch? Is it just an iteration? For each epoch, do all of the data sets go through training with assumed weights and biases? Or does an epoch also have a size, where after that size the weight adjustments happen and training runs again for another size?
Please clarify.
Thanks, Charu
Alex Newman Veloso dos Santos
There is no straightforward answer to this.
Not necessarily. Clearly, if you train for only a small number of epochs, the result would be poor and you would see the effects of underfitting. On the other hand, if you train the network too much, it would 'memorize' the desired outputs for the training inputs (assuming supervised learning), i.e., overfit.
One could also use an optimization method to estimate the weights prior to training and get excellent results with only two epochs.


Accepted Answer

Greg Heath
Greg Heath on 9 Feb 2013
Edited: Greg Heath on 10 Feb 2013
An epoch is one complete pass in which all of the training vectors are used once to update the weights.
For batch training, all of the training samples pass through the learning algorithm in one epoch before the weights are updated.
help/doc trainlm
For sequential training, the weights are updated after each training vector is passed through the training algorithm, so one epoch involves as many updates as there are training vectors.
help/doc adapt
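The distinction above can be sketched in a few lines of code. The following is a minimal illustration (in Python with a single-weight linear model, not the MATLAB implementation behind trainlm or adapt): batch training performs one weight update per epoch using the whole training set, while sequential training updates after every sample, yet in both cases one epoch means every training vector has been seen once.

```python
import numpy as np

# Hypothetical toy data: learn y = 2x with a single weight w.
X = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * X

def train_batch(epochs, lr=0.01):
    """Batch training: one weight update per epoch, from the whole set."""
    w = 0.0
    for _ in range(epochs):
        grad = np.mean((w * X - y) * X)   # gradient over ALL samples
        w -= lr * grad                    # a single update per epoch
    return w

def train_sequential(epochs, lr=0.01):
    """Sequential (incremental) training: one update per training vector."""
    w = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):          # each epoch still visits every sample once
            w -= lr * (w * xi - yi) * xi  # update after EACH sample
    return w
```

Either way, "number of epochs" counts complete passes through the data, not individual weight updates.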
Hope this helps.
Thank you for formally accepting my answer
P.S. The comp.ai.neural-nets FAQ can be very helpful for understanding NN terminology and techniques.
Greg Heath
Greg Heath on 12 Aug 2018
In general, training is a succession of "try then modify" steps. If unsure, start with defaults.


