
Thread Subject:
Neural Network Square Root Estimation

Subject: Neural Network Square Root Estimation

From: Greg Heath

Date: 16 Mar, 2014 18:28:13

Message: 1 of 2

The following code was developed in response to

http://www.mathworks.com/matlabcentral/answers/121075-neural-network-applied-to-compute-square-root

close all, clear all, clc, plt= 0, tic

% TRAINING AND OPERATIONAL TEST DATA

 ttrn = [0:0.5:10]; % Training Target
 xtrn = ttrn.^2; % Training Input
 [ I Ntrn ] = size(xtrn) % [ 1 21 ]
 [ O Ntrn ] = size(ttrn) % [ 1 21 ]
 Ntrneq = Ntrn*O % 21 Training Equations

 Ntst = 100 % Operational Test Data
 xtst = 100*rand(1,Ntst);
 ttst = sqrt(xtst);

 plt = plt+1; figure(plt), hold on
 plot(xtrn,ttrn,'bo','LineWidth',2)
 plot(xtst,ttst,'r.','LineWidth',2)
 legend( 'TRAIN', 'OPERATIONAL TEST' ,2)
 xlabel( ' INPUT ' )
 ylabel( ' TARGET ' )
 title( ' SQUARE ROOT DATA FOR NN MODEL ' )

% MSE references are obtained from the Naive Constant Output
% Model: output = mean(target). However, a Degree-of-Freedom
% Bias Adjustment (DOFBA) is needed when estimating with
% training data.

% Biased reference MSEtrn (Ntrn divisor)

 MSEtrn00 = var(ttrn',1) % 9.17 ; mean(ttrn) = 5

% DOFBA reference MSEtrn (Ntrn-1 divisor)

 MSEtrn00a = var(ttrn',0) % 9.63

% Unbiased reference MSEtst (Ntst divisor)

 MSEtst00 = var(ttst',1) % 3.95 ; mean(ttst) = 5

% NN with I-H-O topology:
% No. of unknown weights: Nw = (I+1)*H+(H+1)*O
% Therefore Ntrneq >= Nw when H <= Hub, where

 Hub = -1+ceil( (Ntrneq-O)/(I+O+1)) % 6
 Ntrials = 10 % Mitigate chances of poor initial random weights

 rng('default')
 for j = 1:Hub
    h = j;
    net = fitnet(h);
    net.divideFcn = 'dividetrain';% No val or tst
    Nw = (I+1)*h+(h+1)*O;
    Ndof = Ntrneq - Nw; % No. of estimation DOF
    % TRAINING GOAL: MSEtrna <= 0.01*MSEtrn00a
    net.trainParam.goal = 0.01*Ndof*MSEtrn00a/Ntrneq;
    for i = 1:Ntrials
        net = configure(net,xtrn,ttrn);
        [ net tr ytrn etrn ] = train(net,xtrn,ttrn);
        MSEtrn = mse(etrn);
        MSEtrna = Ntrneq*MSEtrn/Ndof; % DOFBA
        R2trn(i,j) = 1-MSEtrn/MSEtrn00;
        R2trna(i,j) = 1-MSEtrna/MSEtrn00a;
        
        ytst = net(xtst);
        etst = ttst-ytst;
        R2tst(i,j) = 1-mse(etst)/MSEtst00;
    end
end
result = [ R2trn R2trna R2tst ]
R2max = max(result);
R2med = median(result);
R2mean = mean(result);
R2min = min(result);

 plt=plt+1,figure(plt), hold on
plot(1:6, R2max(1:6), 'r', 'LineWidth',2)
plot(1:6, R2med(1:6), 'k', 'LineWidth',2 )
plot(1:6, R2mean(1:6),'b', 'LineWidth',2)
plot(1:6, R2min(1:6), 'g', 'LineWidth',2)
legend( ' R2max ', ' R2med ', ' R2mean ', ' R2min ' )
axis([ 0 7 0 1.4])
xlabel( ' NO. OF HIDDEN NODES ' )
ylabel( ' TRAINING SET R^2 ' )
title( ' TRAINING SET SUMMARY STATISTICS ' )

 plt=plt+1,figure(plt), hold on
plot(1:6, R2max(7:12), 'r', 'LineWidth',2)
plot(1:6, R2med(7:12), 'k', 'LineWidth',2 )
plot(1:6, R2mean(7:12),'b', 'LineWidth',2)
plot(1:6, R2min(7:12), 'g', 'LineWidth',2)
legend( ' R2max ', ' R2med ', ' R2mean ', ' R2min ' )
axis([ 0 7 0 1.4 ])
xlabel( ' NO. OF HIDDEN NODES ' )
ylabel( ' DOF ADJUSTED TRAINING SET R^2 ' )
title( ' DOF ADJUSTED TRAINING SET SUMMARY STATISTICS ' )

 plt=plt+1,figure(plt), hold on
plot(1:6, R2max(13:18), 'r', 'LineWidth',2)
plot(1:6, R2med(13:18), 'k', 'LineWidth',2 )
plot(1:6, R2mean(13:18),'b', 'LineWidth',2)
plot(1:6, R2min(13:18), 'g', 'LineWidth',2)
legend( ' R2max ', ' R2med ', ' R2mean ', ' R2min ' )
axis([ 0 7 0 1.4 ])
xlabel( ' NO. OF HIDDEN NODES ' )
ylabel( ' TEST SET R^2 ' )
title( ' TEST SET SUMMARY STATISTICS ' )

time = toc % ~12 sec
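For readers following the thread outside MATLAB, the size/DOF bookkeeping that drives the loop above can be sketched in plain Python; this is a minimal translation (network training omitted, variable names mirroring the MATLAB code), not part of the original post:

```python
import math
from statistics import pvariance, variance

ttrn = [0.5 * k for k in range(21)]   # training targets, 0:0.5:10
I, O = 1, 1
Ntrn = len(ttrn)
Ntrneq = Ntrn * O                     # 21 training equations

MSEtrn00 = pvariance(ttrn)            # biased reference, N divisor   (~9.17)
MSEtrn00a = variance(ttrn)            # DOFBA reference, N-1 divisor  (~9.63)

# An I-H-O net has Nw = (I+1)*H + (H+1)*O unknown weights, so
# Ntrneq >= Nw requires H <= Hub:
Hub = -1 + math.ceil((Ntrneq - O) / (I + O + 1))   # 6

H = Hub
Nw = (I + 1) * H + (H + 1) * O        # 19 weights at H = 6
Ndof = Ntrneq - Nw                    # 2 estimation degrees of freedom
# MSEtrna = Ntrneq*MSEtrn/Ndof, so the DOFBA goal MSEtrna <= 0.01*MSEtrn00a
# translates into the raw training-MSE goal handed to train():
goal = 0.01 * Ndof * MSEtrn00a / Ntrneq
```

The point of the translation is the last line: the goal is set on the raw training MSE, but chosen so that the degree-of-freedom-adjusted MSE meets the 1% target.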

Subject: Neural Network Square Root Estimation

From: Greg Heath

Date: 18 Mar, 2014 02:48:08

Message: 2 of 2

"Greg Heath" <heath@alumni.brown.edu> wrote in message <lg4qft$pge$1@newscl01ah.mathworks.com>...

> % MSE References are obtained from the Naive Constant Output
> % Model: output = mean(target). However, need a
> % Degree-of-Freedom Bias Adjustment (DOFBA) when estimating
> % with training data
>
> % Biased reference MSEtrn (Ntrn divisor)
>
> MSEtrn00 = var(ttrn',1) % 9.17 ; mean(ttrn) = 5
>
> % DOFBA reference MSEtrn (Ntrn-1 divisor)
>
> MSEtn00a = var(ttrn',0) % 9.63

% MISSPELLED

MSEtrn00a = var(ttrn',0) % 9.63
   
> % Unbiased reference MSEtst (Nst divisor)
>
> MSEtst00 = var(ttst',1) % 3.95 ; mean(ttst) = 5

 MSEtst00 = 5.5528
 mean(ttst) = 6.8736
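As a sanity check (not part of the original thread), the corrected numbers agree with what the test-data distribution predicts: for x ~ Uniform(0,100), t = sqrt(x) has E[t] = 20/3 ≈ 6.667 and Var[t] = E[x] - E[t]^2 = 50 - (20/3)^2 ≈ 5.556, so 6.8736 and 5.5528 from a single 100-sample draw are in the expected range. A plain-Python sketch:

```python
import random

# x ~ Uniform(0,100)  =>  t = sqrt(x) has
#   E[t]   = (2/3)*sqrt(100)  = 20/3 ≈ 6.667
#   Var[t] = E[x] - E[t]^2    = 50 - (20/3)**2 ≈ 5.556
random.seed(0)                        # any seed; one 100-sample draw
ttst = [random.uniform(0, 100) ** 0.5 for _ in range(100)]

mean_t = sum(ttst) / len(ttst)
MSEtst00 = sum((t - mean_t) ** 2 for t in ttst) / len(ttst)   # Ntst divisor

# mean_t and MSEtst00 land near 6.67 and 5.56 (cf. 6.8736 and 5.5528 above)
```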
