Thanks a lot, Mr. Greg.
I have one more query.
I am confused about how to use the train, validation, and test matrices in the patternnet function. I wrote code for 10-fold cross-validation of my input and target matrices, so it creates 10 sets of train, validation, and test matrices. I have to use these as the inputs (p) and targets (t) in the patternnet code I got from the advanced script of nprtool. In that code there is only the option to pass p and t for training and data division, but I want to pass the train, validation, and test matrices of inputs and targets separately to the train function. How can I modify the advanced script to do that?
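In case it helps clarify the question, this is what I am guessing for one fold (trainInd/valInd/testInd would be the index vectors from my own fold split; I am not sure this is right):

```matlab
% Sketch for one fold: make train use my own index split instead of dividerand
% (trainInd, valInd, testInd are placeholders for one fold's column indices)
net = patternnet(hiddenLayerSize);
net.divideFcn = 'divideind';            % divide by explicit indices
net.divideParam.trainInd = trainInd;    % columns of p/t used for training
net.divideParam.valInd   = valInd;      % columns used for validation
net.divideParam.testInd  = testInd;     % columns used for testing
[net, tr] = train(net, p, t);           % p and t still hold ALL samples
```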
Thanks a lot.
"Md" wrote in message <l1v3jd$dsb$1@newscl01ah.mathworks.com>...
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <l1tfem$pss$1@newscl01ah.mathworks.com>...
> > "Md" wrote in message <l1o3qo$6f9$1@newscl01ah.mathworks.com>...
> > >
> > > I am doing classification of 2 class using MLP NN in MATLAB.
> > > I am worried about the following items:
> > > a) In the program, how can the initial weights and biases of the input and hidden layers be assigned manually? I am getting some abnormal/inconsistent results.
> >
> > net.IW = IW0
> > net.LW = LW0
> > net.b = b0
> >
> > However, I do not recommend this. The automatic random assignments usually suffice.
> > I always design 10 nets with different random initial weights to mitigate the
> > occasional bad random start. As the number of hidden nodes increases, the percentage of bad starting points should diminish.
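If I follow the 10-net suggestion, I imagine a loop something like this (my own sketch; numNets, p, t, and hiddenLayerSize are assumed, and I pick the net with the best validation performance):

```matlab
% Design several nets with different random initial weights and keep the best
% (numNets = 10 and using best_vperf as the selection criterion are my assumptions)
numNets  = 10;
bestPerf = Inf;
for k = 1:numNets
    net = patternnet(hiddenLayerSize);  % each new net gets new random weights
    net = configure(net, p, t);         % size the net for p,t and re-randomize
    [net, tr] = train(net, p, t);
    if tr.best_vperf < bestPerf         % compare best validation performance
        bestPerf = tr.best_vperf;
        bestNet  = net;
    end
end
```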
> >
> > > b) How can I plot MSE vs. number of iterations up to 1000? I want to see that the MSE stays constant even after it converges at a certain iteration.
> >
> > [net, tr] = train(net,x,t);
> > tr = tr % No semicolon, just to see what training-record goodies are available
> > M = tr.num_epochs; % epochs run 0:M
> > figure(plotnumber)
> > subplot(311)
> > plot(0:M, tr.perf(1:M+1)) % training
> > subplot(312)
> > plot(0:M, tr.vperf(1:M+1)) % validation
> > subplot(313)
> > plot(0:M, tr.tperf(1:M+1)) % testing
> >
> > > c) How can I set the optimum number of hidden neurons in the hidden layer?
> >
> > If you have enough data, use many more training equations than unknown weights.
> > Otherwise, use a validation set or regularization.
> >
> > http://www.mathworks.com/matlabcentral/newsreader/view_thread/331662#911472
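From my reading of that thread, the rule of thumb can be written out roughly like this (my own sketch; I, O, and Ntrn come from the training data, and the Hub formula is my understanding of Greg's posts, not verified):

```matlab
% Rough upper bound on the number of hidden nodes H from Neq >> Nw
[I, Ntrn] = size(p);          % I inputs, Ntrn training cases
[O, ~]    = size(t);          % O outputs
Neq = Ntrn*O;                 % number of training equations
% Nw = (I+1)*H + (H+1)*O unknown weights for H hidden nodes;
% Neq >= Nw gives an upper bound on H:
Hub = -1 + ceil((Neq - O) / (I + O + 1));
```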
> >
> > > d) Also, how can I do 10-fold cross-validation on the above data to get the optimum error? Is it necessary to do 10-fold cross-validation in the MATLAB NN toolbox as with other classifiers? Does anybody have clues/videos/ideas?
> >
> > There is no XVAL function in the toolbox. You could try to code one using CROSSVAL or CVPARTITION. However, I just create multiple designs using random data divisions and random weight initializations until the estimates of the running mean and standard deviation of the MSE stabilize.
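If I understand the multiple-designs idea, a rough sketch would be (ntrials and the choice of test-set MSE are my assumptions):

```matlab
% Repeat random data division + random weights; watch running mean/std of test MSE
ntrials  = 30;                          % assumed number of repeat designs
mse_test = zeros(1, ntrials);
for k = 1:ntrials
    net = patternnet(hiddenLayerSize);  % fresh random initial weights
    [net, tr] = train(net, p, t);       % dividerand gives a new random split
    y = net(p(:, tr.testInd));          % outputs on this design's test set
    mse_test(k) = perform(net, t(:, tr.testInd), y);
    runmean = mean(mse_test(1:k));      % running estimates to watch
    runstd  = std(mse_test(1:k));       % for stabilization
end
```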
> >
> > Hope this helps.
> >
> > Greg
> > > Thanks a lot.
>
> Thank you very much for your reply. I have understood a lot, since I am new to the ANN MATLAB toolbox, but I still have the following comments:
> For a) I have written in my code:
> net = patternnet(hiddenLayerSize);
> net = configure(net,p,t);
> IW = 0.01*randn(H,I);
> b1 = 0.01*randn(H,1);
> LW = 0.01*randn(O,H);
> b2 = 0.01*randn(O,1);
> net.IW{1,1} = IW;
> net.b{1,1} = b1;
> net.LW{2,1} = LW;
> net.b{2,1} = b2;
> then I have written [net tr Y E] = train(net,p,t). Is that okay for initializing the weights and biases?
>
> For b) plot(1:N+1, tr.perf(1:M+1)) % training -- what is N? Is that the total number of samples? Also, I don't have tr.perf in MATLAB 2012b. I have
> trainPerformance = perform(net,trainTargets,outputs)
> valPerformance = perform(net,valTargets,outputs)
> testPerformance = perform(net,testTargets,outputs)
> Please can you write it out in detail with code? I did not get the said plot.
>
> For d) Is cross-validation necessary/urgent for the MATLAB 2012b toolbox? How can I overcome the problem that not all of my samples get used, since the data partition is random? Can you explain elaborately?
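In case it helps, here is my rough attempt at combining CVPARTITION with patternnet for 10-fold CV (labels, p, t, and hiddenLayerSize are mine; I am not sure the empty valInd is the right way to skip validation):

```matlab
% 10-fold CV around patternnet using cvpartition + divideind
% (labels is a vector of class labels, used to stratify the folds)
cvp = cvpartition(labels, 'KFold', 10);
foldErr = zeros(1, cvp.NumTestSets);
for k = 1:cvp.NumTestSets
    trnMask = training(cvp, k);                 % logical mask for this fold
    tstMask = test(cvp, k);
    net = patternnet(hiddenLayerSize);
    net.divideFcn = 'divideind';
    net.divideParam.trainInd = find(trnMask);
    net.divideParam.valInd   = [];              % no validation stop this way
    net.divideParam.testInd  = find(tstMask);
    [net, tr] = train(net, p, t);
    y = net(p(:, tstMask));
    foldErr(k) = perform(net, t(:, tstMask), y);
end
cvErr = mean(foldErr)                           % 10-fold CV error estimate
```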
>
> Thanks for your cooperation, Mr. Heath.
