The training input vectors and target vectors are read from the files data1in and data1out, respectively. The number of nodes in the input and output layers is determined by the number of rows in these datasets.
The number of hidden layers, the number of nodes in each hidden layer, and the target error (use 0.1) are entered by the user.
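A minimal sketch of this setup, assuming data1in and data1out are plain ASCII matrices with one row per node and one column per training sample (the file format and the variable names nIn, nOut, nHidden, and targetErr are assumptions, not taken from the submission):

```matlab
% Load the training data (file format is an assumption: plain ASCII matrices,
% one row per node, one column per training sample).
in  = load('data1in',  '-ascii');   % training input vectors
out = load('data1out', '-ascii');   % target vectors

nIn  = size(in, 1);                 % input-layer nodes  = rows of data1in
nOut = size(out, 1);                % output-layer nodes = rows of data1out

% Network shape and stopping criterion supplied by the user.
nHiddenLayers = input('Number of hidden layers: ');
nHidden = zeros(1, nHiddenLayers);
for k = 1:nHiddenLayers
    nHidden(k) = input(sprintf('Nodes in hidden layer %d: ', k));
end
targetErr = input('Target error (e.g. 0.1): ');
```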
The learning curve is plotted after every 100 epochs.
The learning factor can be varied using the slider at the bottom. This idea was taken from an algorithm created by AliReza KashaniPour & Phil Brierley.
The activation function is logsig for the hidden layers and linear for the output layer. A sketch combining these pieces follows.
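Continuing from the variables loaded above, here is a hedged end-to-end sketch of such a training loop: logsig hidden layer, linear output, learning factor read from a slider, learning curve redrawn every 100 epochs, and training stopped at the target error. A single hidden layer with no bias terms is used for brevity, and all variable names are illustrative rather than the submission's own:

```matlab
% Illustrative training loop (not the submission's code).
logsig = @(x) 1 ./ (1 + exp(-x));

nSamples = size(in, 2);
W1 = 0.1 * randn(nHidden(1), nIn);     % input  -> hidden weights
W2 = 0.1 * randn(nOut, nHidden(1));    % hidden -> output weights

% Slider at the bottom of the figure controls the learning factor.
fig = figure;
etaSlider = uicontrol(fig, 'Style', 'slider', 'Min', 0.01, 'Max', 1, ...
    'Value', 0.05, 'Units', 'normalized', 'Position', [0.15 0.01 0.7 0.04]);
ax = axes('Parent', fig, 'Position', [0.13 0.20 0.80 0.72]);

mseHist = zeros(1, 0);
epoch = 0;
mse = Inf;
while mse > targetErr
    epoch = epoch + 1;
    eta = get(etaSlider, 'Value');     % current learning factor from the slider

    % Forward pass (bias terms omitted for brevity).
    h = logsig(W1 * in);               % hidden activations, logsig
    y = W2 * h;                        % linear output layer
    e = out - y;                       % output error
    mse = mean(e(:).^2);
    mseHist(epoch) = mse;              %#ok<AGROW>

    % Backpropagation: the delta of a linear output layer is the error itself;
    % the logsig derivative is h .* (1 - h).
    dOut = e;
    dHid = (W2' * dOut) .* h .* (1 - h);
    W2 = W2 + eta * (dOut * h')  / nSamples;
    W1 = W1 + eta * (dHid * in') / nSamples;

    if mod(epoch, 100) == 0            % redraw the learning curve every 100 epochs
        plot(ax, 1:epoch, mseHist);
        xlabel(ax, 'Epoch'); ylabel(ax, 'Mean squared error');
        title(ax, 'Learning curve');
        drawnow;
    end
end
```

For deeper networks the same delta rule is applied layer by layer, multiplying each hidden-layer delta by that layer's logsig derivative before updating its weights.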
Just press F5 and have fun!
anshuman0387[at]yahoo[dot]com :)
Cite As
Anshuman Gupta (2026). Back Propogation Algorithm (https://www.mathworks.com/matlabcentral/fileexchange/23528-back-propogation-algorithm), MATLAB Central File Exchange. Retrieved .
Acknowledgements
Inspired by: Function Approximation Using Neural Network Without using Toolbox
Categories
- Define Shallow Neural Network Architectures
General Information
- Version 1.0.0.0 (5.03 KB)
MATLAB Release Compatibility
- Compatible with any release
Platform Compatibility
- Windows
- macOS
- Linux
