Understanding the SGD optimization from scratch

In this program, we train a single-layer neural network to classify the Iris dataset using Stochastic Gradient Descent (SGD) from scratch.


In this code, we demonstrate, step by step, how Stochastic Gradient Descent (SGD) optimizes the loss function of a single-layer neural network. We also build the network from scratch, giving a clear view of its inner workings and implementation.
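The submission's code is not shown on this page, and the original is written in MATLAB. As an illustration only, here is a minimal Python sketch of the same idea: a single-layer (softmax) network trained on Iris with per-sample SGD updates. All variable names and hyperparameters (learning rate, epoch count) are assumptions, not taken from the submission.

```python
import numpy as np
from sklearn.datasets import load_iris

# Load Iris: 150 samples, 4 features, 3 classes.
X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize features

rng = np.random.default_rng(0)
n_features, n_classes = X.shape[1], 3
W = rng.normal(0.0, 0.01, size=(n_features, n_classes))
b = np.zeros(n_classes)
lr, epochs = 0.1, 50                         # assumed hyperparameters

def softmax(z):
    z = z - z.max()                          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

for epoch in range(epochs):
    # "Stochastic" = update on one randomly ordered sample at a time.
    for i in rng.permutation(len(X)):
        p = softmax(X[i] @ W + b)            # forward pass: class probabilities
        grad = p.copy()
        grad[y[i]] -= 1.0                    # d(cross-entropy)/d(logits)
        W -= lr * np.outer(X[i], grad)       # SGD weight update
        b -= lr * grad                       # SGD bias update

pred = np.argmax(X @ W + b, axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A single linear layer followed by softmax is sufficient here because Iris is close to linearly separable; the same SGD loop generalizes to deeper networks once the gradient is backpropagated through additional layers.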

Cite As

Mohammad Jamhuri (2026). Understanding the SGD optimization from scratch (https://www.mathworks.com/matlabcentral/fileexchange/128043-understanding-the-sgd-optimization-from-scratch), MATLAB Central File Exchange. Retrieved .


General Information

MATLAB Release Compatibility

  • Compatible with any release

Platform Compatibility

  • Windows
  • macOS
  • Linux
Version History

  • 1.0.0