Why does the neural network training end before reaching the specified maximum number of epochs?
MathWorks Support Team on 9 Nov 2017 (edited 7 Sep 2021)
This is how I am setting the training options:
options = trainingOptions('sgdm', 'MiniBatchSize', miniBatchSize, 'MaxEpochs', 4000);
But it looks like the training ended before reaching the maximum number of epochs. Is this normal? And what actually determines the total number of epochs during training?
Accepted Answer
MathWorks Support Team on 18 Aug 2021 (edited 7 Sep 2021)
Several settings can cause a neural network to stop training before the specified maximum number of epochs is reached.
As you may know, an epoch is one full pass of the training algorithm over the entire training set. In general, training stops before reaching the specified maximum number of epochs to avoid overfitting to the data, which improves the network's generalization. That is, training stops when the validation results are no longer improving (within some tolerance) over a set number of validation checks.
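For example, the 'ValidationPatience' option of trainingOptions controls how many consecutive validation checks without improvement are tolerated before training stops. The sketch below assumes you have a held-out validation set; XValidation and YValidation are placeholder names:
% Sketch: XValidation/YValidation are placeholders for a held-out validation set.
options = trainingOptions('sgdm', ...
    'MiniBatchSize', miniBatchSize, ...
    'MaxEpochs', 4000, ...
    'ValidationData', {XValidation, YValidation}, ...
    'ValidationFrequency', 50, ...  % validate every 50 iterations
    'ValidationPatience', 5);       % stop after 5 validations with no improvement
% To always run the full 4000 epochs, set 'ValidationPatience' to Inf (the default),
% which disables validation-based early stopping.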
Please refer to the following link for more information on early-stopping behavior and improving the generalization of the network: