Reduce learning rate after a certain number of epochs

Hi, I have a question about reducing the learning rate (or finding another way to improve accuracy) while training a deep learning model.
Suppose the loss metric does not improve after a defined number of epochs (for example, 8). Is there a way to reduce the learning rate?
Or, if the accuracy (or loss) metric fails to improve after a certain number of epochs, is there another approach to improve accuracy?
Thank you all for your time and consideration.

Answers (1)

Check out the example Specify Training Options to see how to reduce the learning rate after a certain number of epochs.
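A minimal sketch of what that example shows: `trainingOptions` with a piecewise schedule drops the learning rate at a fixed epoch interval (the exact values below are illustrative, not from the example). Note this drops the rate on a fixed schedule, not in response to a loss plateau.

```matlab
% Piecewise learn-rate schedule: multiply the learning rate by
% LearnRateDropFactor every LearnRateDropPeriod epochs.
options = trainingOptions("sgdm", ...
    InitialLearnRate=0.01, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=8, ...     % epochs between drops (e.g. your 8)
    LearnRateDropFactor=0.5, ...   % factor applied at each drop
    MaxEpochs=32, ...
    Plots="training-progress");
% Pass these options to the training function (e.g. trainnet) as usual.
```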

2 Comments

Hey @Sivylla Paraskevopoulou, what about reinforcement learning training?
train requires rlTrainingOptions, and that one doesn't have a decay option. Do you have any tips?
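One possible workaround, sketched below under assumptions: in Reinforcement Learning Toolbox the learning rate lives in the agent's optimizer options (`rlOptimizerOptions`), not in `rlTrainingOptions`, so there is no built-in decay there. You can train in segments and shrink the learning rate between calls to `train`. The property paths assume an actor-critic style agent (e.g. DDPG) and a recent release where `AgentOptions` can be modified in place; `agent` and `env` are assumed to be defined elsewhere.

```matlab
% Manual learn-rate decay for RL training: train in segments and
% halve the actor/critic learning rates after each segment.
for segment = 1:5
    trainOpts = rlTrainingOptions(MaxEpisodes=200);   % illustrative value
    trainingStats = train(agent, env, trainOpts);
    % Decay the learning rates stored in the agent's optimizer options.
    agent.AgentOptions.ActorOptimizerOptions.LearnRate = ...
        agent.AgentOptions.ActorOptimizerOptions.LearnRate * 0.5;
    agent.AgentOptions.CriticOptimizerOptions.LearnRate = ...
        agent.AgentOptions.CriticOptimizerOptions.LearnRate * 0.5;
end
```

Whether the optimizer picks up the new rate mid-training depends on the release and agent type, so check the doc page for your agent's options before relying on this.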



Asked: 11 Oct 2022
Commented: 29 Jul 2024
