Perceptron Learning
Version 1.0.0.0 (22.2 KB) by Bhartendu
Perceptron Learning rule (Artificial Neural Networks)
When the network output is compared with the desired output, any error is used to correct (adjust) the weight vector w(k) associated with the ith processing unit at time instant k as
w(k+1) = w(k) + Δw(k)
where Δw(k) is the change in the weight vector, given explicitly by each learning rule.
The Perceptron Learning rule is given by:
w(k+1) = w(k) + eta*[ y(k) - sgn(w'(k)*x(k)) ]*x(k)
where eta is the learning rate, x(k) is the input vector, y(k) is the desired output, and sgn(·) is the signum function.
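For illustration, here is a minimal MATLAB sketch of this update rule on a toy linearly separable data set. This is not the submitted code; the data, learning rate, epoch count, and variable names are assumptions made for the example.

```matlab
% Minimal sketch of the perceptron learning rule (illustrative only).
% Targets y are in {-1, +1}; the bias is absorbed as a constant input.
X = [0 0; 0 1; 1 0; 1 1];          % toy inputs, one sample per row
y = [-1; -1; -1; 1];               % desired outputs (AND-like labels)
X = [X, ones(size(X,1), 1)];       % append bias input of 1
w = zeros(size(X,2), 1);           % initial weight vector w(0)
eta = 0.5;                         % learning rate (assumed value)

for epoch = 1:20
    for k = 1:size(X,1)
        x_k  = X(k, :)';                       % current input vector x(k)
        yhat = sign(w' * x_k);                 % sgn(w'(k)*x(k))
        if yhat == 0, yhat = -1; end           % treat sign(0) as -1
        w = w + eta * (y(k) - yhat) * x_k;     % perceptron weight update
    end
end

disp(w')                            % learned weights (last entry is the bias)
```

Note that the weights change only when the predicted and desired outputs disagree, which is why the rule converges in a finite number of updates for linearly separable data.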
Cite As
Bhartendu (2026). Perceptron Learning (https://www.mathworks.com/matlabcentral/fileexchange/63046-perceptron-learning), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with R2016a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Categories
- Define Shallow Neural Network Architectures
| Version | Published | Release Notes |
|---|---|---|
| 1.0.0.0 | | |
