Thread Subject: Threads in Neural Network in Train/Validation/Test

Subject: Threads in Neural Network in Train/Validation/Test

From: Subodh Paudel

Date: 21 Feb, 2013 20:07:08

Message: 1 of 2

Dear All,
I am using MATLAB R2009a. I get different train/validation/test results from two different methods:

1) I use:
[net tr] = train(net,trainV.P,trainV.T,[],[],valV,testV);

to train the network, and then simulate the train/validation/test outputs as:

normTrainOutput=sim(net,trainV.P,[],[],trainV.T);
normValidateOutput=sim(net,valV.P,[],[],valV.T);
normTestOutput=sim(net,testV.P,[],[],testV.T);

and then I obtain the MSE for training, validation and test as:

MSETrain=tr.perf(end);
MSEValidate=tr.vperf(end);
MSETest=tr.tperf(end);

And from these, finally, the R2 values as:

R2Train=1-NMSETrain
R2Validate=1-NMSEValidate
R2Test=1-NMSETest
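
Here NMSE denotes the MSE normalized by the MSE of a constant mean-target model (so that R2 = 1 - NMSE behaves like the usual coefficient of determination). A minimal sketch of that normalization, not the exact code from my script:

MSE00Train   = mse(trainV.T - mean(trainV.T(:)));   % MSE of the constant mean-target model
NMSETrain    = MSETrain    / MSE00Train;
NMSEValidate = MSEValidate / mse(valV.T  - mean(valV.T(:)));
NMSETest     = MSETest     / mse(testV.T - mean(testV.T(:)));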

I also computed MSEtrain1 = mse(normTrainOutput - tn(:XX)) directly, over the training interval I defined, and similarly for validation and test. Why do these two values, MSEtrain1 and MSETrain, differ?

2) I get R2 Train = 0.7738, R2 Validate = 0.7934 and R2 Test = 0.7926, while from the linear regression plot I obtain R train = 0.89584, R validate = 0.81805 and R test = 0.92432. Does this mean the R2 of the neural network is worse than that of a linear regression model? Or is the training value of 0.89584 from the regression plot quite good?

3) Every time I rerun my network, the R2 values are sometimes good and sometimes bad, even negative. How can I make them consistent, assuming the best R2 value is obtained at 27 epochs with 18 hidden neurons?

Thank You.

Subject: Threads in Neural Network in Train/Validation/Test

From: Greg Heath

Date: 22 Feb, 2013 01:14:14

Message: 2 of 2

"Subodh Paudel" <subodhpaudel@gmail.com> wrote in message <kg5upc$pgj$1@newscl01ah.mathworks.com>...
> Dear All,
> I am using MATLAB R2009a. I get different train/validation/test results from two different methods:
>
> 1) I use:
> [net tr] = train(net,trainV.P,trainV.T,[],[],valV,testV);
>
> to train the network, and then simulate the train/validation/test outputs as:
>
> normTrainOutput=sim(net,trainV.P,[],[],trainV.T);
> normValidateOutput=sim(net,valV.P,[],[],valV.T);
> normTestOutput=sim(net,testV.P,[],[],testV.T);

Using "norm" in the output names is confusing, because norm has a special meaning in MATLAB (help/doc norm).
>
> and then I obtain the MSE for training, validation and test as:
>
> MSETrain=tr.perf(end);
> MSEValidate=tr.vperf(end);
> MSETest=tr.tperf(end);

I think that if tr.stop indicates stopping at the validation minimum, you should
replace end with end - tr.max_fail, or use tr.best_epoch.

> And from these, finally, the R2 values as:
>
> R2Train=1-NMSETrain
> R2Validate=1-NMSEValidate
> R2Test=1-NMSETest
>
> I also computed MSEtrain1 = mse(normTrainOutput - tn(:XX)) directly, over the training interval I defined, and similarly for validation and test. Why do these two values, MSEtrain1 and MSETrain, differ?

If tr.stop indicates stopping at the validation minimum, then the last max_fail epochs should not be included. Look at tr.best_epoch, tr.best_perf, etc.
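
For example, something like this (the +1 assumes tr.epoch is logged starting at epoch 0, so the best epoch sits at index best_epoch+1; worth verifying in your release):

bestInd     = tr.best_epoch + 1;   % epoch 0 is stored at index 1
MSETrain    = tr.perf(bestInd);
MSEValidate = tr.vperf(bestInd);
MSETest     = tr.tperf(bestInd);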
 
> 2) I get R2 Train = 0.7738, R2 Validate = 0.7934 and R2 Test = 0.7926, while from the linear regression plot I obtain R train = 0.89584, R validate = 0.81805 and R test = 0.92432. Does this mean the R2 of the neural network is worse than that of a linear regression model? Or is the training value of 0.89584 from the regression plot quite good?

If you had chosen the val minimum epoch, I would have expected

R = sqrt( R^2 )
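
For example, 0.89584^2 ~ 0.8025, 0.81805^2 ~ 0.6692 and 0.92432^2 ~ 0.8544, none of which match your R2 values of 0.7738, 0.7934 and 0.7926, so the two sets of numbers were evidently not taken at the same epoch.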
>
> 3) Every time I rerun my network, the R2 values are sometimes good and sometimes bad, even negative. How can I make them consistent, assuming the best R2 value is obtained at 27 epochs with 18 hidden neurons?

You get different values because of the random data division and random weight initialization. If you initialize the random number generator to the same state (e.g., rng(4151941)) before data division and weight initialization, you will reproduce runs.
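
For example, a minimal sketch built around your own train call (note that rng was not introduced until R2011a, so on R2009a the commented RandStream line is one possible substitute; this assumes the init and division functions draw from the default generator):

rng(4151941)                       % R2011a and later
% On R2009a, one alternative:
% RandStream.setDefaultStream(RandStream('mt19937ar','Seed',4151941));
net      = init(net);              % weight re-initialization is now reproducible
[net tr] = train(net,trainV.P,trainV.T,[],[],valV,testV);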

I usually use a double loop over numH candidate values for H and Ntrials weight initialization runs to get five Ntrials X numH sized matrices for numepochs, R2trn,
R2trna, R2val and R2tst.

Search the NEWSGROUP and ANSWERS for greg Ntrials (or for others of my characteristic variable names: MSE00, Neq, Ntrneq, Nw, Hub, R2, R2a, ...).

Hope this helps.

Greg
