How to generate initial weights for a back-propagation neural network (BPNN) using MATLAB R2012a?

I am trying to predict future values using a BPNN with 1 input neuron, 1 output neuron, and 1 hidden layer, and I vary the number of hidden neurons (N) from 2 to 10. For example, I first set N = 10. I then want to generate the weights (10 weights from input to hidden and 10 from hidden to output), so I write:
s = rng;         % saves the current RNG state (it does not set or restore it)
m = rand(10, 2)  % 10 input-to-hidden and 10 hidden-to-output weights
Is this the correct way? However, when I change N to 7, it gives me the same predicted values as N = 10.

Accepted Answer

Greg Heath on 27 Mar 2015
The answers differ because the state of the random number generator changes each time it is called.
Therefore, if you want reproducibility, initialize the state of the RNG before the first call to configure or train.
Most of my serious designs involve searching over multiple candidate designs for the best combination of random initial weights, random trn/val/tst data divisions, and number of hidden nodes.
I have posted scores of examples in the NEWSGROUP and ANSWERS. Search on
greg rng('default')
or
greg rng(0)
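For example, a minimal sketch of seeding before the design-search loop (the simplefit_dataset data, the seed values, and the R^2 measure here are illustrative assumptions, not Greg's exact code):

 [x, t] = simplefit_dataset;           % example data shipped with the toolbox
 bestR2 = -Inf;
 for h = 2:10                          % candidate numbers of hidden nodes
     for trial = 1:5                   % candidate random initializations
         rng(trial)                    % seed BEFORE configure/train so the
                                       % initial weights and the random
                                       % trn/val/tst division are reproducible
         net = fitnet(h);
         net = configure(net, x, t);   % weights drawn from the seeded RNG
         net = train(net, x, t);
         y   = net(x);
         R2  = 1 - mean((t - y).^2) / var(t, 1);  % normalized performance
         if R2 > bestR2
             bestR2 = R2; bestH = h; bestSeed = trial;
         end
     end
 end
 % Re-running with rng(bestSeed) and fitnet(bestH) reproduces the winner.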
Hope this helps.
Thank you for formally accepting my answer.
Greg

More Answers (1)

Greg Heath on 19 Mar 2015
Most of the neural network functions create their own initial weights.
See the documentation and examples:
help fitnet
doc fitnet
Or, if by prediction you mean predicting future values of a time series,
help narxnet
doc narxnet
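For instance, a short sketch of where those self-created initial weights live and how they can be inspected or overwritten (the simplefit_dataset data and N = 10 are assumptions for illustration):

 N = 10;
 net = fitnet(N);               % the function creates its own initial weights
 [x, t] = simplefit_dataset;
 net = configure(net, x, t);    % random initial weights are assigned here
 IW = net.IW{1,1};              % N-by-1 input-to-hidden weight matrix
 LW = net.LW{2,1};              % 1-by-N hidden-to-output weight matrix
 b  = net.b;                    % bias vectors for both layers
 % These properties can be overwritten before train if you want to
 % start from fixed, hand-chosen weights.
 net = train(net, x, t);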
Hope this helps.
Thank you for formally accepting my answer.
Greg
  1 Comment
soo rachael on 19 Mar 2015
But I get very different predicted values even when I keep all the parameters the same; the results are not stable. So I would like to fix my initial weights. How can I do this?

