Help with backpropagation code and network configuration

Hello, I am relatively new to neural networks and am using the Neural Network Toolbox in MATLAB 2008a.
Below is the code I have so far, along with some doubts. At the moment I am planning to use backpropagation with Levenberg-Marquardt (trainlm).
1 - I am having trouble setting the training parameters (number of epochs, the display interval, and so on).
2 - To verify that the network performs well, is it enough to look at the training plot, or should I also check the MSE?
3 - Because of the data I have, I need to normalize both the input matrix and the output matrix. Which command can I use instead of normalizing manually? (I sketch an attempt right after the code below.)
4 - Any other comments that could help me get the best possible result?
a=[0 0 0
0.125 0 0
0.25 0 0
0.35 0 0
0.5 0 0
0.6 0 0
0.75 0 0
0.85 0 0
1 0 0
1.25 0 0
0 5 0
0.125 5 0
0.25 5 0
0.35 5 0
0.5 5 0
0.6 5 0
0.75 5 0
0.85 5 0
1 5 0
1.25 5 0
0 10 0
0.125 10 0
0.25 10 0
0.35 10 0
0.5 10 0
0.6 10 0
0.75 10 0
0.85 10 0
1 10 0
1.25 10 0
0 15 0
0.125 15 0
0.25 15 0
0.35 15 0
0.5 15 0
0.6 15 0
0.75 15 0
0.85 15 0
1 15 0
1.25 15 0
0 20 0
0.125 20 0
0.25 20 0
0.35 20 0
0.5 20 0
0.6 20 0
0.75 20 0
0.85 20 0
1 20 0
1.25 20 0
0 0 0.0689
0.125 0 0.0686
0.25 0 0.069
0.35 0 0.0688
0.5 0 0.0689
0.6 0 0.0689
0.75 0 0.0689
0.85 0 0.0689
1 0 0.0689
1.25 0 0.0689
0 5 0.0686
0.125 5 0.069
0.25 5 0.0689
0.35 5 0.0688
0.5 5 0.0688
0.6 5 0.0689
0.75 5 0.0686
0.85 5 0.0689
1 5 0.0687
1.25 5 0.0689
0 10 0.0688
0.125 10 0.0688
0.25 10 0.0688
0.35 10 0.0688
0.5 10 0.069
0.6 10 0.0689
0.75 10 0.0688
0.85 10 0.0687
1 10 0.0689
1.25 10 0.069
0 15 0.0687
0.125 15 0.0688
0.25 15 0.0688
0.35 15 0.0689
0.5 15 0.0688
0.6 15 0.0688
0.75 15 0.0689
0.85 15 0.0689
1 15 0.0688
1.25 15 0.0688
0 20 0.0688
0.125 20 0.0689
0.25 20 0.0686
0.35 20 0.069
0.5 20 0.069
0.6 20 0.069
0.75 20 0.0689
0.85 20 0.0688
1 20 0.0689
1.25 20 0.0688
0 5 0.1376
0.125 5 0.1375
0.25 5 0.1375
0.35 5 0.138
0.5 5 0.1373
0.6 5 0.138
0.75 5 0.138
0.85 5 0.1379
1 5 0.1378
1.25 5 0.1375
0 10 0.1379
0.125 10 0.1375
0.25 10 0.1375
0.35 10 0.1377
0.5 10 0.1375
0.6 10 0.1375
0.75 10 0.1378
0.85 10 0.1378
1 10 0.138
1.25 10 0.1376
0 15 0.1378
0.125 15 0.1379
0.25 15 0.1375
0.35 15 0.1377
0.5 15 0.1375];
p=a';
b=[1900 10 4000000
1900 20 4000000
1900 40 4000000
1900 60 4000000
1900 80 4000000
1900 90 4000000
1900 120 4100000
10000 120 4100000
17000 140 4100000
32000 220 4100000
1900 10 4000000
1900 20 4000000
1900 40 4000000
1900 60 4000000
1900 80 4000000
1900 90 4000000
3000 110 4100000
10000 120 4100000
17000 140 4100000
32000 220 4100000
1900 10 7000000
1900 20 4000000
1900 40 4000000
1900 40 4000000
1900 60 4000000
1900 70 4000000
1900 130 4000000
7000 130 4000000
16000 130 4000000
300000 180 4000000
1900 10 7000000
1900 10 7000000
2200 10 7000000
1900 60 7000000
1900 80 7000000
1900 80 7000000
1900 150 7000000
1900 150 7200000
16000 150 6500000
340000 240 4800000
1900 20 6500000
1000 20 5000000
1900 20 6200000
1900 20 6200000
1900 20 6000000
1900 40 5800000
9500 40 5800000
30000 40 5800000
80000 50 4500000
120000 40 4500000
3000 80 4000000
3000 80 4000000
3000 80 4000000
3800 80 4000000
4200 100 4000000
5000 100 4000000
6000 130 4200000
14000 140 4200000
22000 150 4200000
45000 280 4300000
3000 80 4000000
3000 80 4000000
3000 80 4000000
3000 80 4000000
3500 100 4000000
6000 100 4000000
10000 150 4000000
12000 150 4000000
22000 150 4000000
32000 200 4000000
2000 40 4000000
2000 40 4000000
3000 40 4000000
3000 60 4000000
5000 80 4000000
7000 80 4000000
15000 100 4000000
15000 120 4000000
20000 140 4000000
32000 180 4000000
16000 80 4000000
16000 80 4000000
16000 80 4000000
16000 80 4000000
16000 80 4000000
18000 80 4000000
18000 90 4000000
18000 90 4000000
24000 120 4000000
32000 160 4000000
22000 100 4000000
22000 100 4000000
22000 100 4000000
22000 100 4000000
24000 105 4000000
25000 105 4000000
38000 105 4000000
38000 105 4000000
38000 105 4000000
38000 120 4000000
3400 90 4000000
3800 90 4000000
4600 110 4000000
5000 120 4000000
5000 160 4000000
9000 260 4600000
10000 260 4600000
15000 260 4800000
21500 290 4900000
21500 290 4900000
3400 90 4000000
3400 90 4000000
4000 110 4000000
5000 120 4000000
9000 120 4000000
12000 140 4000000
165000 165 4150000
350000 170 4250000
400000 170 4250000
180000 170 4300000
51152.52101 146.17507 4000000
51567.76929 146.8859527 4000000
51983.01757 147.5968354 4000000
52398.26585 148.3077181 4000000
52813.51413 149.0186008 4000000];
t=b';
net = newff(p,t,12,{'logsig','logsig'},'trainlm');  % 12 hidden neurons, logsig/logsig, Levenberg-Marquardt
net.trainParam.show = 5;        % update the training display every 5 epochs
net.trainParam.epochs = 30000;  % maximum number of training epochs
net.trainParam.goal = 0.005;    % performance (MSE) goal
% lr and mc are not trainlm parameters (they belong to gradient-descent training
% functions such as traingdx), so these two lines are commented out:
% net.trainParam.lr = 0.0005;
% net.trainParam.mc = 0.98;
[net,tr] = train(net,p,t);
plotperform(tr)
y = sim(net,p);
e = t - y;
perf = mse(e)
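For item 3, instead of normalizing by hand I was thinking of trying something like the sketch below. I am assuming mapminmax is the right command for this; the names pn, tn, ps and ts are just placeholders I made up.
% Normalize inputs to [-1,1] and targets to [0,1] (to stay inside the logsig output range)
[pn, ps] = mapminmax(p);        % pn = normalized inputs,  ps = input scaling settings
[tn, ts] = mapminmax(t, 0, 1);  % tn = normalized targets, ts = target scaling settings
net = newff(pn, tn, 12, {'logsig','logsig'}, 'trainlm');
[net, tr] = train(net, pn, tn);
If I read the documentation correctly, newff in 2008a already applies mapminmax to the inputs and targets internally, so this may be redundant, but doing it explicitly at least makes the scaling visible.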
  2 Comments
Greg Heath on 4 Nov 2011
What problem are you trying to solve?
What are the inputs? What are the outputs?
What are the sizes of a and b?
You need to normalize input and output.
Why 12 hidden nodes?
Why don't you use the default values for training?
Hope this helps.
Greg
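P.S. For reference, the all-defaults version of the network creation would be something like this (just a sketch, with only the goal overridden):
net = newff(p,t,12);          % defaults: tansig hidden layer, purelin output, trainlm
net.trainParam.goal = 0.005;  % keep your goal, leave everything else at its default
[net,tr] = train(net,p,t);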
Rafa on 5 Nov 2011
The problem is to determine the gains of a PID controller.
The input (p = a') consists of the system's environmental conditions, and the output (t = b') contains the PID tuning values found to make the system stable.
Matrices a and b hold 125 examples of three variables each, so p and t are 3x125: three variables for both input and output, and 125 examples.
The problem does not appear to be linear, so I chose logsig as the activation function for both the hidden and output layers. I will see whether performance improves with other functions.
I put 12 neurons in the hidden layer, but I will keep testing to find the best number.
I am normalizing the input and output matrices and will simulate again; the check I have in mind is sketched below.
Could you help me?
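Again just a sketch (ps and ts are the mapminmax settings from the sketch in the question):
yn = sim(net, mapminmax('apply', p, ps));   % simulate on the normalized inputs
y  = mapminmax('reverse', yn, ts);          % map the output back to the original units
perf = mse(t - y)                           % MSE in the original target units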


Answers (0)
