File Exchange


MLP Neural Network trained by backpropagation

version (2.15 KB) by Mo Chen
Multilayer Perceptron (MLP) Neural Network (NN) for regression problem trained by backpropagation (backprop)


Updated 21 Nov 2018


A very compact implementation of backpropagation for MLP regression, meant to be read and learned from.

This package is part of the PRML toolbox.

Comments and Ratings (17)

Hi all,
I used this code to train on a sample with 8 inputs and one output, and it worked. However, I am wondering how to use the trained model to make predictions on test data.
Can anybody help, please?
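Not the author, but at prediction time you only need the forward pass: push the test input through the trained weights exactly as in the training code's line `Z{t+1} = tanh(W{t}'*Z{t}+b{t})`. Below is a minimal pure-Python sketch of that idea (the function and variable names here are hypothetical illustrations, not part of the toolbox; the toolbox itself is MATLAB):

```python
import math

def mlp_predict(W, b, x):
    """Forward pass through a trained MLP for regression, mirroring
    Z{t+1} = tanh(W{t}'*Z{t} + b{t}) from mlpReg.
    W: list of weight matrices, laid out so W[t][i][j] connects input
       unit i of layer t to output unit j (like MATLAB's W{t}').
    b: list of bias vectors; x: input vector (list of floats)."""
    z = x
    for t in range(len(W)):
        # linear step: a_j = sum_i W[t][i][j] * z[i] + b[t][j]
        a = [sum(W[t][i][j] * z[i] for i in range(len(z))) + b[t][j]
             for j in range(len(b[t]))]
        # hidden layers apply tanh; the last (output) layer stays
        # linear, as is usual for regression
        z = [math.tanh(v) for v in a] if t < len(W) - 1 else a
    return z
```

For example, a net with one hidden unit, `W = [[[1.0]], [[2.0]]]`, `b = [[0.0], [0.5]]`, maps input `[0.5]` to `2*tanh(0.5) + 0.5`.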

Nice, but it is not compatible with R2013b, so it is not truly compatible with any release.

I ran the demo, but I got
>> mlp_demo
Error using +
Matrix dimensions must agree.

Error in mlpReg (line 33)
Z{t+1} = tanh(W{t}'*Z{t}+b{t});

Error in mlp_demo (line 8)
[model, L] = mlpReg(x,y,k);

Excellent and simple

Mo Chen

@dsmalenb, [4,5] means two hidden layers: one with 4 nodes and one with 5 nodes.
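For concreteness, here is a small Python sketch (illustrative only; the helper name is made up, not part of the toolbox) of the weight-matrix shapes that a hidden-layer spec like [4,5] implies:

```python
def layer_shapes(d_in, hidden, d_out):
    """Weight-matrix shapes for an MLP, where e.g. hidden = [4, 5]
    means two hidden layers with 4 and 5 units respectively."""
    sizes = [d_in] + list(hidden) + [d_out]
    # one weight matrix per consecutive pair of layer sizes
    return [(sizes[t], sizes[t + 1]) for t in range(len(sizes) - 1)]

# scalar input, hidden layers [4, 5], scalar output:
print(layer_shapes(1, [4, 5], 1))  # → [(1, 4), (4, 5), (5, 1)]
```

So three weight matrices are created, and the two middle dimensions (4 and 5) are the hidden-layer widths.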


Dumb question: does "h = [4,5]" mean "4 neurons in 5 layers" or "4 layers with 5 neurons"? I can't deduce this from reading the code. It is nice and compact, but that point is not clear to me.

Hi everybody, why not do the following instead? Please explain it, thanks:
E = W{l}*dG;


hana razak

nima safari

Ignas A.


Amazing - works really well and is super compact in terms of code. Great work!



change title

rewrite for regression

MATLAB Release Compatibility
Created with R2018b
Compatible with any release
Platform Compatibility
Windows macOS Linux

Inspired by: Pattern Recognition and Machine Learning Toolbox
