some clarifications on ANN

Ampi on 30 Oct 2012
Mr Greg, thanks for your answer, but I need to make one clarification. I had asked the following question:

Hello, could you please help me out with the answer to a question? Say I am performing face recognition using PCA, and I have found, say, 100 vectors, i.e. eigenvectors of a few classes. I have also set up the target matrix to train those vectors. My question is: when setting up the training system I have written the MATLAB command as net=newff(final,target,9), where 9 is the no. of layers of perceptrons and final is the training samples. Since I have 100 sample vectors and may increase the number of vectors, should I increase the layers of perceptrons, or how should I choose the 3rd argument of the newff function? For training 100 vectors, are 9 layers of perceptrons OK? I shall be grateful to you if you kindly answer my question.

The answer given to me was:

Design an I-H-O MLP for classification of O = c classes:
Use newpr (calls newff) or patternnet (calls feedforward net)
Input matrix x contains N I-dimensional column vectors
Target matrix t contains N O-dimensional unit column vectors with the row of the "1" indicating the class of the corresponding input vector.
Ntrn = 0.7*N % Default number of training examples
Ntrneq = Ntrn*O % Number of training equations
Nw = (I+1)*H + (H+1)*O % Number of unknown weights to estimate
H << (Ntrneq-O)/(I+O+1) % Ntrneq >> Nw is desired
rng(0)                       % reset the random number generator for reproducibility
j = 0;                       % dH, Hmax and Ntrials must be chosen beforehand
for h = 1:dH:Hmax            % candidate numbers of hidden nodes
    j = j+1;
    for i = 1:Ntrials        % several random weight initializations per candidate h
        net = newpr(x,t,h);
        [net, tr] = train(net,x,t);
        % tr = tr            % Important diagnostic info when needed
        y = net(x);
        classes = vec2ind(y);
        % fill this in; one possible completion (an assumption, not part of
        % the original answer) is the percentage of misclassified examples:
        PctErr(i,j) = 100*mean(classes ~= vec2ind(t));
    end
end
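(To make the x and t layout above concrete, here is a minimal MATLAB sketch, assuming 10 classes, 100 PCA coefficients per face and 200 training faces; the variable names final and labels, and all the numbers, are made up for illustration, with random data standing in for real projected face images.)

% Minimal illustrative sketch (all sizes and names are assumptions)
I = 100;                          % input dimension: PCA coefficients per face
O = 10;                           % number of classes (people)
N = 200;                          % number of training faces
final  = randn(I,N);              % stand-in for the real projected face data
labels = randi(O,1,N);            % stand-in for the true class of each column
t = full(ind2vec(labels));        % target matrix of unit column vectors (row of the 1 = class)
net = newpr(final,t,10);          % 10 hidden nodes, chosen only for the example
[net,tr] = train(net,final,t);
classes = vec2ind(net(final));    % predicted class of each training face

The key point is that t has one column per training face, all zeros except for a 1 in the row of that face's class.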
Could you please clarify the answer once again? I did not understand it. For training, say, 52 vectors (it could also be 100 vectors), how do I decide how many layers of perceptrons I should use for effective training in the following function: net=newff(final,target,9)? Please give me a clarification: is there any ratio to be maintained between the no. of perceptrons and the no. of training vectors? Thanks in advance for your help.

Accepted Answer

Greg Heath on 31 Oct 2012
PLEASE DO NOT START A NEW THREAD WITH FOLLOWUP QUESTIONS TO ANOTHER THREAD.
> Could you please clarify the answer once again. I did not understand it.
OK:
1. There are only 2 layers: 1 hidden and 1 output.
2. The challenge is to choose the smallest number of hidden nodes that will yield satisfactory results.
3. For straightforward training you would like many more output training equations than unknown weights in order to mitigate errors caused by noise, measurement error, etc.
4. If this is not possible, validation stopping and regularized training can be used. The former is the NNTBX default.
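For reference, here is a hedged sketch of the two options in point 4, using newpr as in the quoted answer; x, t and H are assumed to be defined already, and exact property names can vary between toolbox releases:

net = newpr(x,t,H);

% (a) Validation stopping (the toolbox default): a validation subset is
% held out and training stops when its error stops improving.
net.divideFcn              = 'dividerand';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

% (b) Regularized training, e.g. Bayesian regularization instead of the
% default training function (uncomment to use):
% net.trainFcn = 'trainbr';

[net,tr] = train(net,x,t);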
> For training, say, 52 vectors (it could also be 100 vectors), how do I decide how many layers of perceptrons I should use for effective training in the following function: net=newff(final,target,9)?
CORRECTION OF TERMINOLOGY AND MISUNDERSTANDING:
1. Perceptron is the name of a TYPE of network.
2. The standard multilayer perceptron (MLP) has two layers: 1 hidden and 1 output
3. Given the dimensions of the input and output vectors (I and O), the main differences between standard MLPs are H, the number of hidden nodes, and the values of the corresponding Nw = (I+1)*H + (H+1)*O connection weights.
4. With Ntrn training vector pairs, there are Ntrneq = Ntrn*O training equations.
5. If you do not use validation stopping or regularized training (MSEREG), it is desirable to have Ntrneq >> Nw; otherwise, H can be larger. (A numerical sketch of this bound is given after item 6 below.)
> Please give me a clarification: is there any ratio to be maintained between the no. of perceptrons (CORRECTION: HIDDEN NODES) and the no. of training vectors?
6. See 5.
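As a numerical illustration of point 5 and of the H << (Ntrneq-O)/(I+O+1) bound quoted in the question (the values of I, O and N below are assumptions, not the poster's actual data):

I = 100;                       % input dimension (assumed)
O = 10;                        % number of classes (assumed)
N = 100;                       % number of input/target pairs (assumed)
Ntrn   = floor(0.7*N);         % default training subset size = 70
Ntrneq = Ntrn*O;               % training equations = 700
H  = 9;                        % candidate number of hidden nodes
Nw = (I+1)*H + (H+1)*O;        % unknown weights = 909 + 100 = 1009 > Ntrneq
Hub = (Ntrneq-O)/(I+O+1)       % upper bound on H, about 6.2, so H = 9 already
                               % calls for validation stopping or regularization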
> Thanks in advance for your help.
Hope this helps.
Thank you for formally accepting my answer.
Greg
  2 Comments
Ampi on 1 Nov 2012
Thank you, Mr Greg.
Ampi on 1 Nov 2012
Thank you Mr Greg


