Neural Network training with Adam optimizer from scratch

Full code for training and testing of a simple neural network on the MNIST data set for digit recognition.
Updated 14 Apr 2021


Full code for training and testing of a simple neural network on the MNIST data set for recognition of single digits between 0 and 9 (accuracy around 98%). Everything is implemented from scratch, including the Adam optimizer. Make sure all the files are in your current folder and run "train.m".

Check out http://neuralnetworksanddeeplearning.com/index.html to learn about the theory of neural networks and https://arxiv.org/abs/1412.6980 to understand the Adam optimizer!
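For reference, the Adam update rule from the paper linked above can be sketched as follows. This is an illustrative Python/NumPy version, not the package's MATLAB code; the function and variable names here are hypothetical, but the update follows the bias-corrected moment estimates described by Kingma and Ba:

```python
import numpy as np

def adam_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. `state` holds the running first/second moment
    estimates (m, v) and the step counter t, all initialized to zero."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad       # first moment
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad**2    # second moment
    m_hat = state["m"] / (1 - beta1 ** state["t"])             # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([5.0])
state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0}
for _ in range(2000):
    theta = adam_step(theta, 2 * theta, state, lr=0.1)
```

In a network like this one, `theta` and `grad` would be each layer's weight matrix and its backpropagated gradient, with one `state` per parameter array.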

Cite As

Johannes Langelaar (2024). Neural Network training with Adam optimizer from scratch (https://www.mathworks.com/matlabcentral/fileexchange/90461-neural-network-training-with-adam-optimizer-from-scratch), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2019a
Compatible with any release
Platform Compatibility
Windows macOS Linux


neural_network_mninst

Version 1.0.0