poor performance by neural network

I have started working on Devanagari character recognition. I have 50 Devanagari characters, each handwritten by 5 people, and I have finished the feature extraction. The feature vector for each image has 55 rows and 1 column. The feature vectors are arranged so that column 1 contains the features of letter 1 by writer 1, column 2 contains letter 2 by writer 1, and so on, ending with letter 50 by writer 5.
For training I take the characters of the first 4 writers, i.e. the first 200 columns of the 55-by-250 feature matrix PP, so the training input has 55 rows and 200 columns. I declared my target as
Target=[eye(50) eye(50) eye(50) eye(50) ]
Following tutorials, I then trained the network as below:
P = PP(:,1:200);   % training inputs: first 4 writers
T = Target;        % 50 x 200 one-hot targets
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
net = configure(net,P,T);
net = init(net);
[net,tr] = train(net,P,T);
After training I get a very poor performance of 0.018, and when I test on the held-out set PP(:,201:250) I do not get any proper result. Am I doing the training wrong? I am a beginner who has just started with neural networks. Thank you.

Accepted Answer

Greg Heath on 27 Apr 2015
FITNET is for regression and curve fitting. Its performance function is mean squared error (MSE),
which depends on the scale of the target data. The overall evaluation function is the normalized MSE, NMSE = MSE/mean(var(target',1)). Typically, values below 0.01 are acceptable. However, separate calculations should be made for each of the training, validation and test subsets.
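For example, a minimal sketch of the NMSE calculation described above (the target and output values here are made up purely for illustration; `target` and `output` are O-by-N matrices):

```matlab
% hypothetical data: 2 target rows, 4 cases
target = [1 2 3 4; 2 4 6 8];
output = [1.1 1.9 3.2 3.9; 2.1 3.8 6.1 8.2];

MSE  = mean(mean((target - output).^2));   % mean squared error
NMSE = MSE / mean(var(target',1));         % normalize by the average target variance
% NMSE well below 0.01 indicates a good regression fit
```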
PATTERNNET is for classification and pattern recognition. Its performance function is cross-entropy, and the evaluation function is the overall classification error rate. Typically, values below 5% are acceptable. However, separate train/val/test calculations should be made for each class.
TRAIN automatically configures and initializes new networks. Therefore you should remove the CONFIGURE and INIT commands.
Data division and weight initialization depend on the state of the random number generator. Therefore, set the RNG state before training so that the designs can be duplicated.
I had difficulty understanding your description. However, it appears that you should be using patternnet with the following data:
[ I N ] = size(input) % [ 55 250 ]
[ O N ] = size(target) % [ 50 250 ]
See the documentation examples
help patternnet
doc patternnet
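Putting those pieces together, a minimal patternnet sketch for this task might look like the following (sizes are taken from the discussion above; `input` and `target` are assumed to already exist in the workspace, with `target = repmat(eye(50),1,5)` matching the column layout described in the question):

```matlab
rng(0)                           % fix the RNG state so the design can be duplicated

% input  : 55 x 250 feature matrix (50 letters x 5 writers)
% target : 50 x 250 one-hot class matrix

net = patternnet(10);            % 10 hidden nodes; TRAIN configures and initializes it
[net,tr] = train(net,input,target);

output  = net(input);
classes = vec2ind(output);       % winning class index for each column
trueCls = vec2ind(target);
errRate = mean(classes ~= trueCls)   % overall classification error rate
```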
More realistic classification examples can be found by searching the NEWSGROUP and ANSWERS using
greg patternnet
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 Comments
madhusudan kumar on 29 Apr 2015
I have shifted my focus to pattern recognition. I referred to the MATLAB File Exchange and tried to solve my problem; here is my code. I have 6 classes (alphabets) written by 3 different writers. My variables are as follows:
F = [F1 F2 F3 F4 F5 F6]. Each Fk holds one particular character written by the 3 people; the size of F1 is 1225 rows by 3 columns. Column 1 contains the feature vector of character 1 written by the 1st person, column 2 contains the same character written by the 2nd person, and so on. F2 is for the second character, F3 for the third, F4 for the fourth, F5 for the fifth and F6 for the sixth.
clc; clear all
plotFlag = 1;
depth = 6;
no_char = 1;
%% Character feature extraction
% check that the pathnames below are correct before running
pathname = 'D:\HAND\datbase main\10\';
F1 = extract_vect1(pathname,depth,plotFlag);
pathname = 'D:\HAND\datbase main\11\';
F2 = extract_vect1(pathname,depth,plotFlag);
pathname = 'D:\HAND\datbase main\12\';
F3 = extract_vect1(pathname,depth,plotFlag);
pathname = 'D:\HAND\datbase main\13\';
F4 = extract_vect1(pathname,depth,plotFlag);
pathname = 'D:\HAND\datbase main\14\';
F5 = extract_vect1(pathname,depth,plotFlag);
pathname = 'D:\HAND\datbase main\15\';
F6 = extract_vect1(pathname,depth,plotFlag);
P = [F1 F2 F3 F4 F5 F6];  % training data: 1225 x 18
N = F2(:,1);              % testing data (note: this column is also part of the training set)
T=[1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 ;
0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 ;
0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0;
0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0;
0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0;
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1]; % Targets
S1 = 5; % number of hidden neurons
S2 = 6; % number of output neurons (= number of classes)
[R,Q]=size(P);
epochs = 10000;   % maximum number of iterations
goal_err = 10e-3; % error goal (0.01)
a = 10.5;         % random weight range is [b,a] = [-10.5, 10.5]
b = -10.5;        % (a range this wide can saturate logsig)
W1 = a + (b-a)*rand(S1,R);  % weights between input and hidden neurons
W2 = a + (b-a)*rand(S2,S1); % weights between hidden and output neurons
b1 = a + (b-a)*rand(S1,1);  % hidden-layer biases
b2 = a + (b-a)*rand(S2,1);  % output-layer biases
n1 = W1*P + b1*ones(1,Q);   % biases must be included in the net input
A1 = logsig(n1);
n2 = W2*A1 + b2*ones(1,Q);
A2 = logsig(n2);
e = T - A2;                 % error convention must match the update sign below
error = 0.5*mean(mean(e.*e));
% nntwarn off  % obsolete; not needed in current toolbox versions
for itr =1:epochs
if error <= goal_err
break
else
for i=1:Q
df1 = dlogsig(n1(:,i),A1(:,i)); % logsig derivative at the hidden layer
df2 = dlogsig(n2(:,i),A2(:,i)); % logsig derivative at the output layer
s2 = -2*diag(df2)*e(:,i);       % output-layer sensitivity
s1 = diag(df1)*W2'*s2;          % back-propagated hidden-layer sensitivity
W2 = W2-0.1*s2*A1(:,i)';        % gradient-descent updates, learning rate 0.1
b2 = b2-0.1*s2;
W1 = W1-0.1*s1*P(:,i)';
b1 = b1-0.1*s1;
n1(:,i) = W1*P(:,i)+b1;         % recompute this column's activations with the biases
A1(:,i) = logsig(n1(:,i));
n2(:,i) = W2*A1(:,i)+b2;
A2(:,i) = logsig(n2(:,i));
end
e = T - A2;
error =0.5*mean(mean(e.*e));
fprintf('Iteration :%5d mse :%12.6f\n',itr,error);
mse_hist(itr) = error; % avoid naming this variable "mse": it shadows the toolbox function
end
end
threshold=0.9; % decision threshold on the logsig outputs
TrnOutput=real(A2>threshold)
% applying test images to NN
n1 = W1*N + b1;  % biases must be applied at test time too
A1 = logsig(n1);
n2 = W2*A1 + b2;
A2test = logsig(n2);
% testing images result
%TstOutput=real(A2test)
TstOutput=real(A2test>threshold)
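As an aside, the 6-by-18 target matrix written out above can be generated programmatically, which scales more easily as the number of classes grows (a sketch; 6 classes, 3 writers per class):

```matlab
numClasses = 6;
numWriters = 3;
T = kron(eye(numClasses), ones(1,numWriters));  % 6 x 18 block one-hot matrix
% each identity column is repeated once per writer, matching the layout above
```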
The problem now is that after running the code I am not getting a low MSE; after 10000 epochs it is still 0.0678, so I am not getting correct recognition.
However, the same code works well if I use only 3 characters to train; the problem arises when I increase the number of classes.
Thanking you
Greg Heath on 30 Apr 2015
Don't you have the Neural Network Toolbox??


More Answers (0)
