Why does MATLAB use the quasi-Newton L-BFGS method as the solver for fitcnet?

I want to train a neural network to perform image segmentation using fitcnet. I noticed that MATLAB uses the limited-memory BFGS (L-BFGS) quasi-Newton solver to optimize the parameters. Does anyone know why MATLAB selected this method over other optimizers such as Adam or SGDM?
I would appreciate any help you could provide.

Accepted Answer

Lucas García on 20 Feb 2023
L-BFGS is the solver used in both fitcnet and fitrnet. These functions ship with Statistics and Machine Learning Toolbox and let you get started solving machine learning problems with neural networks.
For more customization, more advanced architectures, and additional solvers, you can use Deep Learning Toolbox. It supports solvers such as SGDM, RMSProp, and Adam. See trainingOptions for more details.
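To make the contrast concrete, here is a minimal sketch. It assumes the built-in fisheriris example dataset; fitcnet uses L-BFGS internally and does not expose a solver choice, while in Deep Learning Toolbox the solver is selected through trainingOptions:

```matlab
% Statistics and Machine Learning Toolbox: fitcnet trains a shallow
% classification network with its built-in L-BFGS solver.
load fisheriris                          % example data shipped with MATLAB
Mdl = fitcnet(meas, species, ...
    'LayerSizes', [10 10]);              % two hidden layers of 10 neurons

% Deep Learning Toolbox: the solver is chosen explicitly, e.g. Adam
% instead of L-BFGS, via trainingOptions.
opts = trainingOptions('adam', ...
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 30);
```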
Memo Remo on 23 Feb 2023
Dear Lucas and Walter,
Thank you very much for your responses. They are very helpful.


