Can I create a 3- or 4-layer network using nntool?

20 views (last 30 days)
Daud
Daud on 4 Sep 2012
Answered: Jordan on 20 Nov 2013
I am new to neural networks; I want to create a structure with more than 3 or 4 hidden layers. I have used nntool, but it allows only 2 layers and only lets me change the number of neurons in layer(1); the layer(2) neurons box is greyed out. I need some explanation about this.
To create a multilayer network, is it necessary to use the command-line approach rather than a GUI approach (nntool, nprtool, or nftool)?
  2 Comments
Walter Roberson
Walter Roberson on 4 Sep 2012
To clarify: do you want 3 (or 4) layers in total, or do you want 3 (or 4) hidden layers in addition to the non-hidden layers?
SMIT
SMIT on 8 Oct 2012
Daud, have you found a solution for including more than one hidden layer? If you have, please share it; I am having the same problem.


Accepted Answer

Greg Heath
Greg Heath on 4 Sep 2012
The neural net with one hidden layer is a universal approximator.
It will fit any bounded piecewise continuous function with a finite number of hidden nodes.
To prevent memorization of contamination from noise, measurement error, etc., and to improve prediction on unseen data, it is wise to use as few hidden nodes as possible.
One reason to use more than 1 hidden layer is when the training is customized to a special kind of problem in which the role of each layer has a particular physical or mathematical significance.
Another reason to use more than 1 hidden layer is that the total number of unknown weights to be estimated can be smaller. Consequently, fewer training I/O pairs would be needed for a good design.
A good rule of thumb for I-H-O and I-H1-H2-O designs is that the number of training equations, Neq = Ntrn*O, be much larger than the number of unknown weights (including biases) to be estimated:
Nw1 = (I+1)*H + (H+1)*O = O + (I+O+1)*H for 1 hidden layer
Nw2 = (I+1)*H1 + (H1+1)*H2 + (H2+1)*O
= O + (I+1)*H1 + H1*H2 + (O+1)*H2 for 2 hidden layers
For 1 hidden layer the rule of thumb yields
H << (Neq-O)/(I+O+1)
Otherwise, early stopping using a validation set, or regularization using trainbr, can be used.
If, instead, an extra layer is used, I don't know of any commonly accepted rules for choosing H1 and H2 except Nw2 << Neq.
Hope this helps.
Greg
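As a rough illustration of the rule of thumb above, here is a minimal MATLAB sketch; the values of I, O, Ntrn, and H are placeholders chosen for illustration, not taken from this thread:
% Placeholder problem dimensions (assumed for illustration only)
I = 9; O = 1; Ntrn = 200; H = 10;
Neq = Ntrn*O                   % number of training equations
Nw1 = (I+1)*H + (H+1)*O        % unknown weights (incl. biases), 1 hidden layer
Hub = (Neq - O)/(I + O + 1)    % rough upper bound implied by H << (Neq-O)/(I+O+1)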

More Answers (4)

SMIT
SMIT on 8 Oct 2012
Daud, have you found a solution for including more than one hidden layer? If you have, please share it; I am having the same problem.
  3 Comments
SMIT
SMIT on 16 Oct 2012
My input dataset is something like p = [0 0 0 0 0 2324 0 0 0 0; 8486 0 0 0 0 0 0 0 0 0] and so on; t is of the same type. What will my minmax(p) be, given that I have a huge dataset to work with? Is it necessary to use minmax?
regards
Greg Heath
Greg Heath on 19 Oct 2012
The syntax NEWFF(minmax(p),[H O]) is very obsolete. So is its replacement NEWFF(p,t,H). Use the current functions FITNET(H) for regression or curve fitting and PATTERNNET(H) for classification and pattern recognition.
help fitnet
doc fitnet
Again, 1 hidden layer is sufficient. However, the optimal number of hidden nodes has to be found by trial and error.
Hope this helps.
Greg
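For reference, a minimal sketch of the current calling syntax mentioned above; the dataset (simplefit_dataset, a toy regression set shipped with the toolbox) and H = 10 are only placeholders:
[x, t] = simplefit_dataset;    % toy regression data (placeholder)
H = 10;                        % number of hidden nodes (found by trial and error)
net = fitnet(H);               % one hidden layer, as recommended above
net = train(net, x, t);
y = net(x);
perf = mse(net, t, y)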



SMIT
SMIT on 31 Oct 2012
Dear Greg,
I am training my network and it shows good results for a few input vectors.
When I tried to increase the input dataset, training stops with "maximum mu reached". I took this to be a good sign that my algorithm has truly converged, but I have to increase my dataset for further work.
Can you tell me how to handle this problem? I have gone through "help trainbr" but did not find an understandable solution.
Please help.
  1 Comment
Greg Heath
Greg Heath on 31 Oct 2012
I don't think you can blindly accept maximum mu reached as a sign of convergence. What is the resulting normalized MSE:
NMSE = mse(y-t)/mean(var(t')).
Why do you think performance will significantly decrease if you increase the size of the data?
If the additional data can be assumed to be randomly drawn from the same parent population, the design should improve.
If it can't then it should not be used with this net.
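As a sketch, the NMSE check described above can be computed like this; y and t are assumed to be O-by-N matrices of network outputs and targets (placeholder data shown):
t = rand(1, 100);                  % targets (placeholder)
y = t + 0.1*randn(size(t));        % network outputs (placeholder)
NMSE = mean((y(:) - t(:)).^2) / mean(var(t'))   % same as mse(y-t)/mean(var(t')); << 1 indicates a useful fit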



Chris
Chris on 28 Oct 2013
I hope this will help people who still have this problem: it is possible to create any number of layers using the nntool GUI.
After specifying the number of layers in the text input field, press the ENTER key and the drop-down menu will update to the number you desire.
If you only enter the number of layers and then click elsewhere with the mouse, it won't get updated.

Jordan
Jordan on 20 Nov 2013
I'm working with the R2012b version of MATLAB. The MathWorks documentation shows how to include multiple layers using feedforwardnet. There is a particular form of feedforwardnet called patternnet that works well for classification problems. For example, feedforwardnet([10,8]) builds a network with two hidden layers; the first hidden layer will be of size 10 and the second of size 8.
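A minimal sketch of that two-hidden-layer example; the dataset (simplefit_dataset) is only a placeholder:
[x, t] = simplefit_dataset;     % toy regression data (placeholder)
net = feedforwardnet([10 8]);   % two hidden layers of sizes 10 and 8
net = train(net, x, t);
y = net(x);
perf = perform(net, t, y)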
