To test the software, run the included script, which trains a simple multi-layer perceptron.
The following optimization algorithms are implemented: AMSGrad, AdaMax, Adadelta, Adam, Delta-Bar-Delta, Nadam, and RMSprop.
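For reference, the update rule behind one of the listed optimizers, Adam, can be sketched as follows. The repository itself is MATLAB code; this standalone Python sketch is only an illustration of the standard Adam update (function and variable names here are hypothetical, not taken from the repository).

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter.
    theta: parameter, grad: gradient at theta,
    m, v: running first/second moment estimates, t: 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad          # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 (gradient 2x) starting from x = 1.0
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2.0 * x, m, v, t, lr=0.01)
```

The other listed methods differ mainly in how the moment estimates and the effective step size are formed (e.g., AMSGrad keeps the running maximum of `v_hat`, Nadam adds a Nesterov-style lookahead to the first moment).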
John Malik (2020). Gradient Descent Optimization (https://github.com/jrvmalik/gradient-descent), GitHub.