This code implements the basic backpropagation-of-error learning algorithm. The network has tanh hidden neurons and a linear output neuron, and is applied to approximating y = sin(2*pi*x1) * sin(2*pi*x2).
No functions from the Neural Network Toolbox are used.
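The approach described above can be sketched as follows. The original submission is MATLAB code; this is a minimal NumPy illustration of the same idea (tanh hidden layer, linear output, plain gradient-descent backpropagation on the same target function). The hidden-layer size, learning rate, and epoch count are assumptions for the sketch, not values from the submission.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: y = sin(2*pi*x1) * sin(2*pi*x2) on random points in [0, 1]^2
X = rng.random((500, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.sin(2 * np.pi * X[:, 1])

H = 20       # hidden units (assumed; not stated in the submission)
lr = 0.05    # learning rate (assumed)

W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, H);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)   # tanh hidden layer
    return h @ W2 + b2, h      # linear output neuron

mse0 = np.mean((forward(X)[0] - y) ** 2)   # error before training

for epoch in range(2000):
    out, h = forward(X)
    err = out - y                            # dE/d(output) for mean-squared error
    # Backpropagate the error through the linear output and tanh layer
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)  # tanh'(a) = 1 - tanh(a)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    # Gradient-descent weight updates
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((forward(X)[0] - y) ** 2)      # error after training
```

Batch gradient descent is the simplest variant of backpropagation training; the training error `mse` should fall well below its initial value `mse0` as the network fits the product-of-sines surface.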
Cite As
Alireza (2026). Function Approximation Using Neural Network Without using Toolbox (https://www.mathworks.com/matlabcentral/fileexchange/17355-function-approximation-using-neural-network-without-using-toolbox), MATLAB Central File Exchange. Retrieved .
Acknowledgements
Inspired: Orthogonal Least Squares Algorithm for RBF Networks, Back Propogation Algorithm
Categories
- Define Shallow Neural Network Architectures
General Information
- Version 1.0.0.0 (118 KB)
MATLAB Release Compatibility
- Compatible with any release
Platform Compatibility
- Windows
- macOS
- Linux
| Version | Published | Release Notes | Action |
|---|---|---|---|
| 1.0.0.0 |  | BSD License |  |
