Understanding the SGD optimization from scratch
Version 1.0.0 (23.8 KB) by Mohammad Jamhuri
In this program, we train a single-layer neural network to classify the Iris dataset using Stochastic Gradient Descent (SGD) implemented from scratch. The code walks step by step through using SGD to optimize the network's loss function, and the network itself is built from scratch, giving a clear view of its inner workings and implementation.
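The submission's own code is not reproduced here. As a rough illustration of the idea it describes, the sketch below trains a single-layer softmax classifier on Iris with per-sample SGD updates on a cross-entropy loss. It assumes the fisheriris dataset shipped with the Statistics and Machine Learning Toolbox; all variable names and hyperparameters (learning rate eta, epoch count) are illustrative and not taken from the submission.

```matlab
% Minimal SGD sketch (assumptions noted above), not the submission's code.
load fisheriris                          % meas (150x4), species (150x1 cellstr)
X = (meas - mean(meas)) ./ std(meas);    % standardize features
[classes, ~, y] = unique(species);       % map class labels to integers 1..3
N = size(X, 1);  d = size(X, 2);  K = numel(classes);
I = eye(K);  Y = I(y, :);                % one-hot targets, N x K

rng(1);                                  % reproducibility
W = 0.01 * randn(d, K);                  % weights of the single layer
b = zeros(1, K);                         % biases
eta = 0.1;                               % illustrative learning rate
epochs = 100;

for epoch = 1:epochs
    idx = randperm(N);                   % shuffle samples each epoch
    for i = idx                          % one sample at a time (stochastic)
        x = X(i, :);                     % 1 x d input
        z = x*W + b;                     % 1 x K logits
        p = exp(z - max(z));             % numerically stable softmax
        p = p / sum(p);
        g = p - Y(i, :);                 % gradient of cross-entropy w.r.t. logits
        W = W - eta * (x' * g);          % SGD update for weights
        b = b - eta * g;                 % SGD update for biases
    end
end

[~, pred] = max(X*W + b, [], 2);
fprintf('Training accuracy: %.2f%%\n', 100 * mean(pred == y));
```

The per-sample update (rather than a full-batch gradient step) is what makes the procedure "stochastic"; shuffling the sample order each epoch keeps the updates from following a fixed cycle.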
Cite As
Mohammad Jamhuri (2026). Understanding the SGD optimization from scratch (https://www.mathworks.com/matlabcentral/fileexchange/128043-understanding-the-sgd-optimization-from-scratch), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with R2023a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
| Version | Published | Release Notes |
|---|---|---|
| 1.0.0 | | |
