
Can I tell trainNetwork() to return the net with the lowest validation or training loss, as opposed to the net at the last iteration?

Sometimes training diverges, and it is wasteful to re-run with a specific stopping epoch. I could use checkpoints and sift through the trainingInfo output to find the lowest loss, but that costs memory and time. I am also using Experiment Manager, so I would have to add special code to my custom metric function to load the correct checkpoint. A training option to return the network that minimizes the loss would make it easy to compare experiments in which divergence occurs at different epochs. Minimizing the loss is, after all, the definition of training, so such an option just makes sense.
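For reference, the checkpoint workaround described above can be sketched roughly as follows. This is a hedged sketch, not tested code: the variables XTrain, YTrain, layers, XVal, and YVal are placeholders, and it assumes validation and checkpointing happen at matching iterations (by default trainNetwork writes one checkpoint per epoch).

```matlab
% Sketch of the checkpoint workaround (placeholder variable names).
checkpointDir = tempdir;
opts = trainingOptions('adam', ...
    'CheckpointPath', checkpointDir, ...      % writes net_checkpoint__*.mat files
    'ValidationData', {XVal, YVal});
[net, info] = trainNetwork(XTrain, YTrain, layers, opts);

% Find the iteration with the lowest recorded validation loss
% (info.ValidationLoss is NaN at non-validation iterations; min skips NaN),
% then load the checkpoint whose filename contains that iteration number.
[~, bestIter] = min(info.ValidationLoss);
files = dir(fullfile(checkpointDir, sprintf('net_checkpoint__%d__*.mat', bestIter)));
best = load(fullfile(checkpointDir, files(1).name), 'net');
bestNet = best.net;
```

As noted in the comment below the question, this does not work cleanly when the network contains batch normalization layers.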
  1 Comment
Nethtrick on 22 Sep 2020
A similar question:
I have implemented the checkpoint approach, but I am running into the same issue: in the checkpointed networks, the batch normalization layers are not fully defined (their statistics are not finalized). So the checkpoint approach is NOT a workaround when your network contains batch normalization layers.


Answers (1)

Madhav Thakker on 25 Sep 2020
Hi Nethtrick,
As of now, it is not possible to have trainNetwork return the network with the lowest validation error for a DAGNetwork.
However, you might want to explore custom training loops:
  1. Custom training loop with dlnetwork (more flexible than trainNetwork/DAGNetwork)
  2. Custom training loop with the model defined as a function (most flexible)
Examples of these features are available from R2020a onward; to use these more flexible tools, upgrading to the newest release is suggested.
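With a custom training loop, keeping the best network becomes a one-line snapshot. The sketch below is illustrative only and assumes a user-defined modelLoss function and placeholder data (XVal, TVal, numEpochs, etc.); because the full dlnetwork state is copied in memory, the batch normalization issue with checkpoint files does not arise.

```matlab
% Hedged sketch: track the best network inside a custom training loop.
bestLoss = inf;
avg = []; avgSq = []; iteration = 0;
for epoch = 1:numEpochs
    for i = 1:numIterationsPerEpoch
        iteration = iteration + 1;
        % modelLoss is a user-defined function returning loss and gradients
        [loss, gradients] = dlfeval(@modelLoss, net, X, T);
        [net, avg, avgSq] = adamupdate(net, gradients, avg, avgSq, iteration);
    end
    valLoss = extractdata(dlfeval(@modelLoss, net, XVal, TVal));  % validation pass
    if valLoss < bestLoss
        bestLoss = valLoss;
        bestNet = net;        % snapshot the best network so far
    end
end
net = bestNet;                % return the net that minimized validation loss
```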
You can also have a look at https://www.mathworks.com/help/deeplearning/ug/customize-output-during-deep-learning-training.html, which shows how to define an output function that stops training if the best classification accuracy on the validation data does not improve for N network validations in a row.
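The linked example tracks validation accuracy; a variant that watches validation loss instead might look like the hedged sketch below (stopIfNotImproving and N are names of my own choosing, not from the documentation).

```matlab
% Hypothetical output function: stop when validation loss has not
% improved for N consecutive validations. Pass it to trainingOptions via
%   'OutputFcn', @(info) stopIfNotImproving(info, 5)
function stop = stopIfNotImproving(info, N)
    persistent bestLoss count
    stop = false;
    if info.State == "start"
        bestLoss = inf;
        count = 0;
    elseif ~isempty(info.ValidationLoss)
        if info.ValidationLoss < bestLoss
            bestLoss = info.ValidationLoss;
            count = 0;
        else
            count = count + 1;
            stop = count >= N;   % request early stopping
        end
    end
end
```

Note this only stops training early; combining it with checkpoints (or a custom loop) is still needed to recover the best network itself.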
Hope this helps.
