Artificial neural network (ANN): trainlm, traingda, trainbfg and trainrp

I am a master's student studying the performance comparison between four training functions in an artificial neural network (ANN): trainlm, traingda, trainbfg and trainrp. I use two targets, complex modulus and phase angle. This is the code I use:

net = feedforwardnet(10,'trainlm');
net = train(net,x,t);
p = sim(net,i);

I get a good regression, but the problem I face is that the error measures (MAE, MSE, RMSE) are not good: there is a very big difference when I compare the actual and the predicted data. I don't know which type of ANN to use.

Here is a sample of my data. The data has a big spread: I have 1008 samples, and the target values range from 3 up to 4,000,000, sorted from smallest to largest. The first four columns are the inputs (the first is frequency, the second temperature, the third the angle of twist, the fourth the type of petrol); the last column is the target.
freq    temp   twist   petrol   target
0.015   10     5       1        1170000
15      35     5       1        1175000
0.1     15     3       1        1200000
2       25     5       2        1205000
0.2     15     5       3        1215000
0.05    10     5       3        1235000
0.02    10     7       2        1270000
0.15    15     3       2        1275000
10      35     7       1        1300000
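
A minimal sketch of how these error measures can be computed, assuming x is the 4-by-N input matrix and t is the 1-by-N target row vector from the data above (the names y and e are just illustrative):

net  = feedforwardnet(10,'trainlm');   % 10 hidden neurons, Levenberg-Marquardt training
net  = train(net,x,t);
y    = net(x);                         % predictions for the training inputs
e    = t - y;                          % errors
MAE  = mean(abs(e))
MSE  = mean(e.^2)                      % equivalent to mse(e)
RMSE = sqrt(MSE)

With targets spanning roughly 3 to 4,000,000, raw MSE and RMSE are on the scale of the targets themselves, so the normalized measure used in the accepted answer below (MSE divided by var(t)) is usually easier to interpret.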
  1 Comment
Greg Heath on 3 Feb 2016
Edited: Greg Heath on 3 Feb 2016
Currently not understandable.
Please rewrite and explain the posted numbers. How many matrices are you showing? What are their sizes?


Accepted Answer

Greg Heath on 3 Feb 2016
Here is a simple comparison of the 4 training functions on the dataset used in the fitnet and feedforwardnet documentation.
close all, clear all, clc
[ x, t ] = simplefit_dataset;
vart = var(t,0) % 8.4274 1-dim Reference MSE
figure, subplot(211), hold on
plot(x), plot(t,'r')
subplot(212), plot( x,t ,'r')
% t vs x shows 4 local extrema. Therefore choose
H=4
net1 = fitnet(H); net1.trainFcn = 'trainlm';
net2 = fitnet(H); net2.trainFcn = 'trainbfg';
net3 = fitnet(H); net3.trainFcn = 'trainrp';
net4 = fitnet(H); net4.trainFcn = 'traingda';
rng(0), [ net1 tr1 y1 e1 ] = train(net1,x,t);
rng(0), [ net2 tr2 y2 e2 ] = train(net2,x,t);
rng(0), [ net3 tr3 y3 e3 ] = train(net3,x,t);
rng(0), [ net4 tr4 y4 e4 ] = train(net4,x,t);
NMSE1 = mse(e1)/vart % 0.00033605
NMSE2 = mse(e2)/vart % 0.02683
NMSE3 = mse(e3)/vart % 0.096204
NMSE4 = mse(e4)/vart % 0.15838
Although it is a nice example, it doesn't prove much because, in addition to the default parameter values (e.g., mu and min_grad), the rankings depend on
1. The underlying function t = f(x)
2. Random data division
3. Random initial weights
For a serious comparison with nontrivial data, you would need at least tens of repetitions.
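A minimal sketch of such a repetition loop, assuming the same x, t and H as above (the trial count, the rng seeds and the variable names are only illustrative):

trainFcns = { 'trainlm', 'trainbfg', 'trainrp', 'traingda' };
Ntrials   = 30;                             % "tens of repetitions"
vart      = var(t,0);                       % reference MSE
NMSE      = zeros(Ntrials, numel(trainFcns));
for j = 1:numel(trainFcns)
   for i = 1:Ntrials
      net = fitnet(H, trainFcns{j});
      net.trainParam.showWindow = false;    % suppress the training GUI
      rng(i)                                % new data division and initial weights
      [ net, tr, y, e ] = train(net, x, t);
      NMSE(i,j) = mse(e)/vart;
   end
end
medNMSE = median(NMSE)                      % typical performance per training function
minNMSE = min(NMSE)                         % best case per training function

Setting the same seed before each call to train gives every training function the same data division and initial weights in a given trial, so the comparison is paired across trials.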
Hope this helps.
Greg
