What is an epoch in a neural network?
Hi, what is the definition of an epoch? Is it just an iteration? For each epoch, does the entire data set go through training with the assumed weights and biases? Or does an epoch also have a size, so that after that many samples the weight adjustments happen and training runs again on the next chunk?
Please clarify.
Thanks, Charu
2 Comments
Shubhankar Kapoor
on 6 Mar 2016
So if we use a larger number of epochs, do we get a better result?
Alex Newman Veloso dos Santos
on 28 Dec 2016
Edited: Alex Newman Veloso dos Santos
on 28 Dec 2016
There is no straightforward answer to this.
Not necessarily. If you train for too few epochs, the network will perform poorly and you will see the effects of underfitting. On the other hand, if you train it for too many, it will 'memorize' the desired outputs for the training inputs (assuming supervised learning), i.e. it will overfit.
One could also use an optimization method to estimate the weights prior to training and get excellent results with only two epochs.
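To make the epoch/batch distinction from the original question concrete, here is a minimal sketch (in Python pseudocode style, since no code appears in the thread; the function name `train` and its parameters are illustrative, not from any toolbox). One epoch is one full pass over the training set; the batch size is a separate hyperparameter that controls how often the weights are adjusted within each epoch.

```python
# Sketch: one epoch = one full pass over the training data.
# Weight adjustments happen once per mini-batch, so several updates
# can occur inside a single epoch.
def train(samples, num_epochs, batch_size):
    updates = 0
    for epoch in range(num_epochs):              # each epoch revisits all samples
        for start in range(0, len(samples), batch_size):
            batch = samples[start:start + batch_size]
            # ... compute gradients on `batch` and adjust weights here ...
            updates += 1                         # one weight update per batch
    return updates

# 100 samples, 3 epochs, batch size 10 -> 10 updates per epoch, 30 in total
print(train(list(range(100)), num_epochs=3, batch_size=10))
```

So the epoch count and the batch size answer two different questions: how many times the whole data set is seen, and how many samples are processed between weight adjustments.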