Trying to simulate a neural network in MATLAB by myself

Hi everyone. I tried to create a neural network that estimates y = x^2, so I created a fitting neural network and gave it some sample inputs and outputs. Then I tried to port this network to C++, but the result was different, and I tried to find out why. I ran this command in MATLAB:
purelin(net.LW{2}*tansig(net.IW{1}*in+net.b{1})+net.b{2})
and the result was 16.0828 for in = 3.
My network has 2 hidden neurons, and the other parameters are:
net.IW{1} : 0.344272596370387 0.344111217766824
net.LW{2} : 31.7635369693519 -31.8082184881063
net.b{1} : -1.16610230053776 1.16667147712026
net.b{2} : 51.3266249426358
So, does anybody have any idea why this happened? Thank you.
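For reference, the hand computation above can be reproduced outside MATLAB. Below is a minimal Python sketch (tansig is just tanh; the weights are the ones posted above). It confirms that the formula itself is evaluated correctly, so the mismatch with y = 9 must come from a step the formula omits:

```python
import math

# Weights posted above (1 input -> 2 hidden tansig neurons -> 1 linear output)
IW = [0.344272596370387, 0.344111217766824]   # net.IW{1}
b1 = [-1.16610230053776, 1.16667147712026]    # net.b{1}
LW = [31.7635369693519, -31.8082184881063]    # net.LW{2}
b2 = 51.3266249426358                         # net.b{2}

def raw_forward(x):
    """purelin(LW*tansig(IW*x + b1) + b2), with no input/output normalization."""
    h = [math.tanh(IW[i] * x + b1[i]) for i in range(2)]
    return LW[0] * h[0] + LW[1] * h[1] + b2

print(raw_forward(3.0))  # ~16.0828, matching the MATLAB result above
```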

 Accepted Answer

I used fitnet(2) with all defaults except for
rng(0)
net.trainParam.goal = MSE00/100;
where MSE00 is the MSE of the constant mean-value (= 1704) output model.
It converged, resulting in
Nepochs = tr.epoch(end) % 5
NMSEtrn = tr.perf(end)/MSE00 % 0.0018431
NMSEval = tr.vperf(end)/MSE00 % 0.00068653
NMSEtst = tr.tperf(end)/MSE00 % 0.0016467
whereas
y3 = net(3) % 70.409 (instead of 3^2 = 9)
sqerr3 = (y3-3^2)^2 % 3771.1
Nsqerr3 = sqerr3/MSE00 % 0.0016237
So, to get a good feel for how well the net is performing,
look at the normalized test-set error, NMSEtst.
Hope this helps.
Greg

3 Comments

close all, clear all, clc, plt=0
format short g
x = -71:71;
t = x.^2;
[I N] = size(x) % [ 1 143 ]
[O N] = size(t) % [ 1 143 ]
Neq = N*O
plt=plt+1,figure(plt)
subplot(211)
plot(x,t,'k--','LineWidth',2)
ylabel(['Target (black dash)',' Output(red)'])
hold on
y00 = repmat(mean(t,2),1,N) % 1704*ones(1,143)
Nw00 = numel(O) % 1
MSE00 = sse(t-y00) /Neq % 2322552
MSE00a = sse(t-y00)/(Neq-Nw00) % 2338908
W = t/[ones(1,N);x] % [ 1704 7.66e-16 ]
Nw0 = numel(W) % 2
y0 = W*[ones(1,N);x]; % y0 = y00
MSE0 = sse(t-y0) /Neq % 2322552
MSE0a = sse(t-y0)/(Neq-Nw0) % 2355496
R20 = 1-MSE0/MSE00 % 0
R20a = 1-MSE0a/MSE00a % -0.0071
plot(x,y0,'b','LineWidth',2)
hold on
rng(0)
H = 2
Nw =(I+1)*H+(H+1)*O % 7
R2agoal = 0.99
MSEgoal = (Neq-Nw)*MSE00a*(1-R2agoal)/Neq % 22244
net = fitnet(H);
MSEgoal = 0.01*(Neq-Nw)*MSE00/Neq % 22089
net.trainParam.goal = MSEgoal % 23225.52
[net tr] = train(net,x,t); % Using defaults
Nepochs = tr.epoch(end) % 5
NMSEtrn = tr.perf(end)/MSE00 % 0.0018431
NMSEval = tr.vperf(end)/MSE00 % 0.00068653
NMSEtst = tr.tperf(end)/MSE00 % 0.0016467
y = net(x)
e = t-y
plot(x,y,'r','LineWidth',2)
subplot(212)
plot(x,e,'r','LineWidth',2)
ylabel('Error')
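As a cross-check, the constant-model baseline MSE00 used in the script above can be reproduced by hand; here is a small Python sketch using the same data (x = -71:71, t = x.^2):

```python
# Reproduce the naive constant-model reference MSE00 from the script above:
# the MSE of always predicting mean(t).
x = list(range(-71, 72))          # x = -71:71  (143 points)
t = [v * v for v in x]            # t = x.^2
mean_t = sum(t) / len(t)          # 1704, the constant model's prediction
mse00 = sum((ti - mean_t) ** 2 for ti in t) / len(t)
print(mean_t, mse00)              # 1704.0 2322552.0
```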
Thank you, but I think you didn't understand my question well. I know we must do some pre-processing and post-processing. My problem is with normalizing the data, especially scaling the output back. For the input we can use the mapminmax function, but I don't have any idea about the output processing.
All of the MLP functions (newff, newfit, newpr, fitnet, patternnet, feedforwardnet)
AUTOMATICALLY use mapminmax. Therefore, you do not have to worry about it
unless you want either no normalization or standardization (zero-mean/unit-std).
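Concretely, a manual (e.g. C++) port has to wrap the raw network formula in mapminmax's forward and reverse maps. Below is a Python sketch of the full pipeline, assuming mapminmax's default [-1, 1] range and the training ranges taken from the data posted in this thread (x in [-71, 71], t in [0, 5041]); in a real port these settings should be read from the trained net's process settings rather than hard-coded. The weights are the ones posted in the question:

```python
import math

IW = [0.344272596370387, 0.344111217766824]   # net.IW{1}
b1 = [-1.16610230053776, 1.16667147712026]    # net.b{1}
LW = [31.7635369693519, -31.8082184881063]    # net.LW{2}
b2 = 51.3266249426358                         # net.b{2}

# Training-data ranges (assumed from the inputs/targets posted in this thread)
XMIN, XMAX = -71.0, 71.0
TMIN, TMAX = 0.0, 5041.0

def mapminmax_apply(v, lo, hi):
    """Map [lo, hi] onto [-1, 1], as mapminmax does by default."""
    return 2.0 * (v - lo) / (hi - lo) - 1.0

def mapminmax_reverse(vn, lo, hi):
    """Undo the [-1, 1] mapping, for the network output."""
    return (vn + 1.0) * (hi - lo) / 2.0 + lo

def net_forward(x):
    xn = mapminmax_apply(x, XMIN, XMAX)                 # normalize input
    h = [math.tanh(IW[i] * xn + b1[i]) for i in range(2)]
    yn = LW[0] * h[0] + LW[1] * h[1] + b2               # normalized output
    return mapminmax_reverse(yn, TMIN, TMAX)            # scale output back

print(net_forward(3.0))  # ~9.1, close to 3^2 = 9
```

With the normalization in place, the same weights that gave 16.08 in the raw formula produce a value near 9, which is exactly the discrepancy asked about in the question.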


More Answers (1)

What was the range of x for training? How many input values? What random number seed? What training algorithm? What stopping rule? How many values of the number of hidden nodes did you try? How many weight-initialization trials for each value of H? Where are the tabulations of MSE for the training, validation and test sets?
For the design with the lowest MSEval, tabulate x, t, y, e = t-y.
For useful examples, search
heath newff close clear Ntrials
Use fitnet instead of the obsolete newff.

3 Comments

Thank you for your response! I'll try to explain what I have done. My inputs are:
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 -1 -2 -3 -4 -5 -6 -7 -8 -9 -10 -11 -12 -13 -14 -15 -16 -17 -18 -19 -20 -21 -22 -23 -24 -25 -26 -27 -28 -29 -30 -31 -32 -33 -34 -35 -36 -37 -38 -39 -40 -41 -42 -43 -44 -45 -46 -47 -48 -49 -50 -51 -52 -53 -54 -55 -56 -57 -58 -59 -60 -61 -62 -63 -64 -65 -66 -67 -68 -69 -70 -71
and my outputs are :
0 1 4 9 16 25 36 49 64 81 100 121 144 169 196 225 256 289 324 361 400 441 484 529 576 625 676 729 784 841 900 961 1024 1089 1156 1225 1296 1369 1444 1521 1600 1681 1764 1849 1936 2025 2116 2209 2304 2401 2500 2601 2704 2809 2916 3025 3136 3249 3364 3481 3600 3721 3844 3969 4096 4225 4356 4489 4624 4761 4900 5041 1 4 9 16 25 36 49 64 81 100 121 144 169 196 225 256 289 324 361 400 441 484 529 576 625 676 729 784 841 900 961 1024 1089 1156 1225 1296 1369 1444 1521 1600 1681 1764 1849 1936 2025 2116 2209 2304 2401 2500 2601 2704 2809 2916 3025 3136 3249 3364 3481 3600 3721 3844 3969 4096 4225 4356 4489 4624 4761 4900 5041
Then I used the fitting tool with row vectors: training is 70%, validation is 15%, and testing is 15%. The number of hidden neurons is 2. Then at the command line I ran:
purelin(net.LW{2}*tansig(net.IW{1}*inputTest+net.b{1})+net.b{2})
where inputTest is 3. The result of this command is 16, but as far as I know that is not correct; it should be about 9. Other information:
my net.b{1} is : -1.16610230053776 1.16667147712026
my net.b{2} is : 51.3266249426358
and net.IW{1} is : 0.344272596370387 0.344111217766824
net.LW{2} is : 31.7635369693519 -31.8082184881063
I think I have shared all the information needed to create a network like mine, so you can reproduce it yourself. But if something is missing, please let me know. Thank you.
Why didn't you just enter
input = [0:71, -1:-1:-71] ;
target = input.^2 ; % Reserve "output" for net(input)
N = 143
How, exactly, did you obtain the weights?
Post your code.
What is the NMSE = MSE/MSE00, or R^2 = 1 - NMSE, for the training, validation & test sets?
In fact, I'm a new user of MATLAB, so I don't have the information you want; for example, I don't know what N is. I went to the Neural Network Toolbox fitting tool, set the inputs and outputs, and then trained on the data. So I don't know how to find NMSE = MSE/MSE00 or R^2 = 1 - NMSE or the others. MATLAB sets the weights, and I found them with net.IW{1} and net.LW{2}. Thank you.
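The NMSE and R^2 that Greg asks for can be computed from any target/output pair. A small Python sketch of the formulas (NMSE = MSE/MSE00, R^2 = 1 - NMSE), with toy values used purely for illustration:

```python
def nmse_r2(t, y):
    """NMSE = MSE/MSE00 and R^2 = 1 - NMSE for targets t and outputs y.
    MSE00 is the MSE of the constant model that always predicts mean(t)."""
    n = len(t)
    mean_t = sum(t) / n
    mse = sum((ti - yi) ** 2 for ti, yi in zip(t, y)) / n
    mse00 = sum((ti - mean_t) ** 2 for ti in t) / n
    nmse = mse / mse00
    return nmse, 1.0 - nmse

# Toy illustration: a perfect fit gives NMSE = 0 and R^2 = 1
t = [0.0, 1.0, 4.0, 9.0]
print(nmse_r2(t, [0.0, 1.0, 4.0, 9.0]))   # (0.0, 1.0)
```

In the trained-network case, t is the target vector and y = net(x) is the network output, restricted to the training, validation, or test indices from the training record tr.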

