What is an epoch in a neural network?

Hi, what is the definition of an epoch? Is it just an iteration, where in each epoch the entire data set goes through training with the current weights and biases? Or does an epoch also have a size, so that after that many samples the weights are adjusted and training runs again on the next portion of the data?
Please clarify.
Thanks, Charu

2 Comments

So if we have a larger number of epochs, do we get better results?
There is no straightforward answer to this.
Not necessarily. If you train for too few epochs, the network will be undertrained and you will see the effects of underfitting. On the other hand, if you train the network for too long, it will 'memorize' the desired outputs for the training inputs (assuming supervised learning), i.e. it will overfit.
One could also use an optimization method to estimate the weights prior to training and get excellent results with only two epochs.
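The trade-off described above is often handled with early stopping: train epoch by epoch and stop once the validation error stops improving. A minimal Python sketch for illustration (the function names and the `patience` parameter are assumptions, not any toolbox API):

```python
def train_with_early_stopping(train_one_epoch, val_loss, max_epochs=100, patience=3):
    """Run training epoch by epoch; stop when the validation loss has not
    improved for `patience` consecutive epochs. Returns epochs actually run."""
    best = float("inf")
    bad_epochs = 0
    epochs_run = 0
    for _ in range(max_epochs):
        train_one_epoch()          # one full pass over the training set
        epochs_run += 1
        v = val_loss()             # error on held-out validation data
        if v < best:
            best, bad_epochs = v, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break              # further epochs would likely overfit
    return epochs_run
```

With a validation loss that bottoms out and then rises, training stops a fixed number of epochs past the minimum instead of running to `max_epochs`.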


 Accepted Answer

Greg Heath
Greg Heath on 9 Feb 2013
Edited: Greg Heath on 10 Feb 2013
An epoch is one complete pass in which all of the training vectors are each used once to update the weights.
For batch training, all of the training samples pass through the learning algorithm in one epoch before the weights are updated.
help/doc trainlm
For sequential (incremental) training, the weights are updated after each training vector passes through the training algorithm.
help/doc adapt
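The difference between the two update schemes can be illustrated outside MATLAB with a tiny Python sketch that fits y ≈ w·x by gradient descent (toy data and names of my choosing, not toolbox code):

```python
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]          # generated with true weight w = 2
lr = 0.05                     # learning rate

def batch_epoch(w):
    """Batch training: ONE weight update per epoch, using the gradient
    averaged over all training samples."""
    g = sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    return w - lr * g

def sequential_epoch(w):
    """Sequential (incremental) training: the weight is updated after
    EACH training vector within the epoch."""
    for x, y in zip(xs, ys):
        w -= lr * 2.0 * (w * x - y) * x
    return w

w_batch = w_seq = 0.0
for _ in range(100):          # 100 epochs
    w_batch = batch_epoch(w_batch)
    w_seq = sequential_epoch(w_seq)
# both converge toward the true weight 2, along different paths
```

In both cases one epoch means one pass over all three samples; they differ only in how many weight updates happen per pass (one vs. three).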
Hope this helps.
Thank you for formally accepting my answer
Greg
P.S. The comp.ai.neural-nets FAQ can be very helpful for understanding NN terminology and techniques.

4 Comments

Ayomi
Ayomi on 11 Aug 2018
Edited: Ayomi on 11 Aug 2018
Dear Greg, I would like to ask how the maximum number of iterations and the number of iterations per epoch are set for network training. The training options allow me to choose the maximum number of epochs and the batch size, but not the other two I mentioned.
In addition, just to double-check my understanding of 'epoch': if I set my batch size to the size of my entire training data set (i.e. use batch gradient descent), then 1 epoch is 1 iteration (1 GD step), correct?
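The arithmetic behind that check: the number of iterations (weight updates) per epoch is the data set size divided by the mini-batch size, rounded up. A small Python illustration of the relationship (this is just the arithmetic, not the trainingOptions API):

```python
import math

def iterations_per_epoch(n_samples, batch_size):
    """Number of weight updates in one pass over the data set."""
    return math.ceil(n_samples / batch_size)

# mini-batch gradient descent: several updates per epoch
small_batch = iterations_per_epoch(1000, 128)    # 8 iterations per epoch
# full-batch gradient descent: 1 epoch = 1 iteration
full_batch = iterations_per_epoch(1000, 1000)    # 1 iteration per epoch
```

So yes: when the batch size equals the data set size, one epoch is exactly one gradient-descent step.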
> Which of the matlab training functions are you using?
I just use as many defaults as possible.
Most of the time, the only things I change are
1. Number of hidden nodes, chosen by trial and error
j = 0;                        % counter over candidate values of H
for H = Hmin:dH:Hmax
    j = j + 1;
    . . .
2. For each H, 10 trials of random initial weights
    for trial = 1:10
        . . .
    end
end
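Greg's double loop amounts to a grid search over the hidden-layer size with several random restarts per candidate. The same idea as a runnable Python sketch, where `train_once` is a stand-in for whatever actually trains a net and returns its error (all names here are illustrative):

```python
import random

def grid_search(train_once, H_values, n_trials=10, seed=0):
    """For each candidate hidden-layer size H, train `n_trials` nets from
    random initial weights and keep the best (lowest-error) result."""
    rng = random.Random(seed)          # reproducible random restarts
    best_err, best_H = float("inf"), None
    for H in H_values:
        for trial in range(n_trials):
            err = train_once(H, rng)   # train one net, return its error
            if err < best_err:
                best_err, best_H = err, H
    return best_H, best_err
```

For example, with a stub error function that is minimized at H = 10, `grid_search(stub, range(2, 21, 2))` returns that hidden size; in practice `train_once` would build, train, and validate a network.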
I have many examples in comp.ai.soft-sys.matlab
I'm using the NN toolbox functions 'trainedNet = trainNetwork(X,Y,layers,options)' and 'options = trainingOptions(solverName)'. Options allow me to choose the maximum number of epochs and the batch size, but not the number of iterations per epoch, as presented in https://ww2.mathworks.cn/help/nnet/ref/trainingoptions.html
In general, training is a succession of "try then modify" steps. If unsure, start with defaults.


More Answers (0)


Asked: on 8 Feb 2013
Commented: on 12 Aug 2018
