
Thread Subject:
Problem about getting optimum output in Neural Network MATLAB 2012a

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Md

Date: 23 Sep, 2013 01:00:08

Message: 1 of 6


I am doing 2-class classification using an MLP NN in MATLAB.
I am concerned about the following items:
a) How can the initial weights and biases of the input and hidden layers be assigned manually in the program? I am getting some abnormal/inconsistent results.

b) How can I plot MSE vs. iteration number up to 1000? I want to see whether the MSE stays constant after converging at a certain iteration.

c) How can I set the optimum number of hidden neurons in the hidden layer?

d) Also, how can I do 10-fold cross validation on the above data to get the optimum error? Is it necessary to do 10-fold cross validation in the MATLAB NN toolbox, as with other classifiers? Does anybody have clues/videos/ideas?
Thanks a lot.

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Greg Heath

Date: 25 Sep, 2013 01:49:10

Message: 2 of 6

"Md" wrote in message <l1o3qo$6f9$1@newscl01ah.mathworks.com>...
>
> I am doing 2-class classification using an MLP NN in MATLAB.
> I am concerned about the following items:
> a) How can the initial weights and biases of the input and hidden layers be assigned manually in the program? I am getting some abnormal/inconsistent results.

net.IW = IW0
net.LW = LW0
net.b = b0

However, I do not recommend this. The automatic random assignments usually suffice.
I always design 10 nets with different random initial weights to mitigate the
occasional bad random start. As the number of hidden nodes increases, the percentage of bad starting points should diminish.
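
A minimal sketch of that multi-start approach (the variable names x, t, H, and keeping the net with the best validation performance, are my assumptions, not code from this thread):

```matlab
% Sketch: design Ntrials nets from different random initial weights
% and keep the one with the best validation performance.
Ntrials  = 10;
bestPerf = Inf;
for i = 1:Ntrials
    net = patternnet(H);            % H = number of hidden nodes
    [net, tr] = train(net, x, t);   % each fresh net gets new random weights
    if tr.best_vperf < bestPerf     % validation MSE at the stopping epoch
        bestPerf = tr.best_vperf;
        bestNet  = net;
    end
end
```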

> b) How can I plot MSE vs. iteration number up to 1000? I want to see whether the MSE stays constant after converging at a certain iteration.

[net, tr] = train(net,x,t);
tr = tr % No semicolon, just to see what training record goodies are available
figure(plotnumber)
% for epochs 0:M
subplot(311)
plot(1:N+1, tr.perf(1:M+1)); % training
subplot(312)
plot(1:N+1, tr.vperf(1:M+1)); % validation
subplot(313)
plot(1:N+1, tr.tperf(1:M+1)); % testing
 
> c) How can I set the optimum number of hidden neurons in the hidden layer?

If you have enough data, use many more training equations than unknown weights.
Otherwise, use a validation set or regularization.

http://www.mathworks.com/matlabcentral/newsreader/view_thread/331662#911472
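
A rough sketch of that rule of thumb (the notation N samples, I inputs, O outputs, and the safety factor of 10, are my assumptions, not from this thread):

```matlab
% Rough upper bound on the number of hidden nodes H so that the
% number of training equations well exceeds the number of weights.
[I, N] = size(x);                      % I input dimensions, N samples
[O, ~] = size(t);                      % O output dimensions
Neq = N*O;                             % number of training equations
% An I-H-O MLP has Nw = (I+1)*H + (H+1)*O unknown weights.
% Requiring Neq >= 10*Nw gives, approximately:
Hub = floor((Neq/10 - O)/(I + O + 1)); % hypothetical safety factor of 10
```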

> d) Also, how can I do 10-fold cross validation on the above data to get the optimum error? Is it necessary to do 10-fold cross validation in the MATLAB NN toolbox, as with other classifiers? Does anybody have clues/videos/ideas?

There is no XVAL function in the toolbox. You could try to code one using CROSSVAL or CVPARTITION. However, I just create multiple designs using random data divisions and random weight initializations until the estimates of the running mean and standard deviation of the MSE stabilize.
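
A sketch of that multiple-design procedure (the trial count, the stopping check, and the names x, t, H are my own assumptions):

```matlab
% Sketch: repeat designs with random data divisions and random weight
% initializations, tracking the running mean/std of the test-set MSE.
numDesigns = 20;                 % hypothetical number of designs
msevals = zeros(1, numDesigns);
for i = 1:numDesigns
    net = patternnet(H);         % default 'dividerand' data division
    [net, tr] = train(net, x, t);
    msevals(i) = tr.best_tperf;  % test-set MSE of this design
end
runmean = cumsum(msevals)./(1:numDesigns);  % running mean of the MSE
% Stop adding designs once runmean (and the running std) level off.
```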

Hope this helps.

Greg
> Thanks a lot.

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Md

Date: 25 Sep, 2013 16:39:09

Message: 3 of 6

"Greg Heath" <heath@alumni.brown.edu> wrote in message <l1tfem$pss$1@newscl01ah.mathworks.com>...
-----SNIP
 
Thank you very much for your reply. I have understood a lot, since I am new to the ANN MATLAB toolbox, but I still have the following comments:
For a), I have written in my code:
net = patternnet(hiddenLayerSize);
net = configure(net,p,t);
IW = 0.01*randn(H,I);
b1 = 0.01*randn(H,1);
LW = 0.01*randn(O,H);
b2 = 0.01*randn(O,1);
net.IW{1,1} = IW;
net.b{1,1} = b1;
net.LW{2,1} = LW;
net.b{2,1} = b2;
then I have written [net tr Y E] = train(net,p,t). Is that okay for initialization of the weights and biases?

For b), in the line "plot(1:N+1, tr.perf(1:M+1); % training", what is N? Is that the total sample number? Also, I don't have tr.perf in MATLAB 2012b. I have trainPerformance = perform(net,trainTargets,outputs)
 valPerformance = perform(net,valTargets,outputs)
 testPerformance = perform(net,testTargets,outputs).
Please can you write it out in detail, with code? I did not get the said plot.

For d), is cross validation necessary/urgent for the MATLAB 2012b toolbox? How can I avoid the problem of not using all of my samples in the random data partition? Can you explain in more detail?

Thanks for your cooperation, Mr. Heath.

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Md

Date: 27 Sep, 2013 00:33:07

Message: 4 of 6

Thanks a lot, Mr. Greg.
I have one more query:
I am confused about how to use the train, validation, and test matrices in the patternnet function. I wrote code for 10-fold cross validation of my input and target matrices, so it creates 10 sets of train, validation, and test matrices, which I have to use as the inputs (p) and targets (t) in the patternnet program code that I got from the advanced script of nprtool. In that code there is an option only to use p and t in the training and data division, but I want to pass the train, test, and validation matrices of inputs and targets separately to the train function. How can I modify the advanced script to do that?
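
One common way to hand explicit train/validation/test splits to train is the toolbox's 'divideind' division function (a sketch; the index variables trainInd, valInd, testInd are hypothetical placeholders for one fold's indices):

```matlab
% Sketch: supply explicit split indices instead of random division.
net = patternnet(hiddenLayerSize);
net.divideFcn = 'divideind';
net.divideParam.trainInd = trainInd;  % this fold's training columns
net.divideParam.valInd   = valInd;    % validation columns
net.divideParam.testInd  = testInd;   % test columns
[net, tr] = train(net, p, t);         % p and t still hold ALL samples
```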

Thanks a lot.

"Md" wrote in message <l1v3jd$dsb$1@newscl01ah.mathworks.com>...
-----SNIP

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Greg Heath

Date: 27 Sep, 2013 16:21:05

Message: 5 of 6

"Md" wrote in message <l1v3jd$dsb$1@newscl01ah.mathworks.com>...
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <l1tfem$pss$1@newscl01ah.mathworks.com>...
> > "Md" wrote in message <l1o3qo$6f9$1@newscl01ah.mathworks.com>...
-----SNIP
> Thank you very much for your reply. I have understood a lot, since I am new to the ANN MATLAB toolbox, but I still have the following comments:
> For a), I have written in my code:
> net = patternnet(hiddenLayerSize);
> net = configure(net,p,t);
> IW = 0.01*randn(H,I);
> b1 = 0.01*randn(H,1);
> LW = 0.01*randn(O,H);
> b2 = 0.01*randn(O,1);
> net.IW{1,1} = IW;
> net.b{1,1} = b1;
> net.LW{2,1} = LW;
> net.b{2,1} = b2;

> Is that okay for initialization of the weights and biases?

Delete the previous 9 statements. They are unnecessary.

> then I have written [net tr Y E] = train(net,p,t ) ,

ok
 
> For b), in the line "plot(1:N+1, tr.perf(1:M+1); % training", what is N? Is that the total sample number?

Obvious typo. You cannot have different lengths in a plot statement. Change N to M.

> Also, I don't have tr.perf in MATLAB 2012b.

Yes, you do.

For the 2nd time: after the train statement write, without the semicolon,

tr = tr

and investigate the result.

>I have trainPerformance = perform(net,trainTargets,outputs)
> valPerformance = perform(net,valTargets,outputs)
> testPerformance = perform(net,testTargets,outputs).

Delete the last three statements and investigate the results of tr = tr.

> Please can you write it out in detail, with code? I did not get the said plot.

I have posted many code examples in NEWSGROUP and ANSWERS. Search

neural greg patternnet

23 NEWSGROUP hits

91 ANSWERS hits

> For d), is cross validation necessary/urgent for the MATLAB 2012b toolbox? How can I avoid the problem of not using all of my samples in the random data partition? Can you explain in more detail?

1. Understand the existing defaults: Investigate the output from the command

net = patternnet % NO SEMICOLON

2. Review some of my double loop designs over H and random initializations

neural greg Hub Ntrials
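
A sketch of such a double-loop design (the candidate range for H and the bookkeeping are my assumptions, not Greg's exact code):

```matlab
% Sketch: outer loop over candidate hidden-layer sizes, inner loop
% over random weight initializations; record validation MSE for each.
Hvec = 1:2:15;  Ntrials = 10;          % hypothetical search grid
perfval = zeros(Ntrials, numel(Hvec));
for j = 1:numel(Hvec)
    for i = 1:Ntrials
        net = patternnet(Hvec(j));
        [net, tr] = train(net, x, t);
        perfval(i, j) = tr.best_vperf; % validation MSE for (trial, H)
    end
end
[~, jbest] = min(min(perfval));        % column with the smallest MSE
Hbest = Hvec(jbest);                   % chosen hidden-layer size
```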

> Thanks for your cooperation, Mr. Heath.

Either Prof., Dr., or just plain Greg.

Greg

Subject: Problem about getting optimum output in Neural Network MATLAB 2012a

From: Md

Date: 28 Sep, 2013 00:47:09

Message: 6 of 6

Thank you very much, Professor Greg. Really appreciated!! Your answers helped me a lot in finding the solution to my problems.
"Greg Heath" <heath@alumni.brown.edu> wrote in message <l24b9h$b41$1@newscl01ah.mathworks.com>...
-----SNIP
