Neural Network applied to compute square root
Hello to the community,
I have recently entered the world of neural networks. To understand them better, I implemented a feed-forward pass and backpropagation with momentum on my own, and it seems to work. Now I would like to apply it to non-binary data; for example, I would like the NN to learn the square root. I tried normalizing my input and output data, but the results are quite bad.
So if someone could help me with that ..
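For context, the min-max normalization used in the script below maps a vector linearly into [0,1], and the inverse mapping must reuse the min/max of the quantity being recovered (for the NN's outputs, that is the target vector's range, not the input's). A minimal sketch in plain Python (illustrative only, not part of the MATLAB script; the function names are made up here):

```python
def minmax_normalize(x):
    """Map the values of x linearly into [0, 1]; also return the range used."""
    lo, hi = min(x), max(x)
    return [(v - lo) / (hi - lo) for v in x], lo, hi

def minmax_denormalize(y, lo, hi):
    """Invert the normalization using the ORIGINAL min/max of that quantity."""
    return [lo + v * (hi - lo) for v in y]

B = [1, 2, 3, 4, 5]
B_norm, lo, hi = minmax_normalize(B)
print(B_norm)                              # values in [0, 1]
print(minmax_denormalize(B_norm, lo, hi))  # recovers the original B
```

The round trip only recovers the original values because the same `lo`/`hi` pair is reused on the way back.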
A = [1;4;9;16;25;36;49;64;81;100]; % Inputs used to train the NN
B = [1;2;3;4;5;6;7;8;9;10]; % Target outputs used to train the NN
RealGuess = [5;1;200;4;8;54;23;15;99;65]; % New data on which the trained NN will be used
% Normalization !!
% --------------
A2 = (A(:) - min(A(:)))/(max(A(:)) - min(A(:)));
B2 = (B(:) - min(B(:)))/(max(B(:)) - min(B(:)));
numIn = length(A(:)); % Number of training samples in the input column
bias=-1; % bias, threshold
disp('Enter learning rate');
lr=input('learning rate lr='); % learning rate
weights = 2*rand(2,1) - 1; % initial weights drawn uniformly from [-1,1]
                           % (the original -1*2.*rand(3,1) gave weights in
                           % [-2,0] and an unused third weight)
disp('Enter iterations');
iterations=input('iterations ='); % Number of iterations before stopping
Error_Tol = 10; % running error measure, initialized above the tolerance
counter = 0; % Initialization of the counter of iterations
disp('Enter the tolerance (stopping condition)');
Tol=input('Tolerance Tol=');
%--------------------------------------------------------------------------
% Training of the NN
%--------------------------------------------------------------------------
% Set up of the plot ..................................................
figure , hold on;
title('Error between the output from the trained NN and the desired output'...
,'fontsize',14);
xlabel('Iterations','fontsize',13);
ylabel('Error','fontsize',13);
% .....................................................................
% The heart of the algo. is just underneath
while(counter<=iterations)&&(Error_Tol>Tol) % Stopping condition: stop when
                                            % we reach the maximum number of
                                            % iterations OR the error falls
                                            % below the planned tolerance.
out = zeros(numIn,1); % initialization of the outputs (one per sample)
for j = 1:numIn % Loop over all training samples
y = bias*weights(1,1) + A2(j,1)*weights(2,1);
out(j) = 1/(1+exp(-y));
delta = B2(j)-out(j);
weights(1,1) = weights(1,1)+lr*bias*delta;
weights(2,1) = weights(2,1)+lr*A2(j,1)*delta;
end
% Error computation
Error = out - B2; % error for all samples, in normalized space
                  % (comparing against the unnormalized B was a bug)
plot(counter, Error,'o'); % plot them
Error_Tol = 0.5 * (Error_Tol + sum(Error.^2)); % running average of the
                                               % summed squared error
% .....................................................
counter = counter + 1; % iteration counter, used to evaluate the
                       % while condition above.
end
disp('Training......................................DONE');
%--------------------------------------------------------------------------
% Use of the Neural Network
% -------------------------
% We use the weights computed previously (which means the trained NN) with
% a new set of input data, which will give to us an output data.
%--------------------------------------------------------------------------
% Normalize the new inputs with the SAME min/max used for training
RG2 = (RealGuess(:) - min(A(:)))/(max(A(:)) - min(A(:)));
trained = zeros(numIn,1);
for j = 1:numIn
train = bias*weights(1,1)+...
RG2(j,1)*weights(2,1);
trained(j) = 1/(1+exp(-train));
end
% -------------------------------------------------------------------------
% Various Display at the end of the RUN
% -------------------------------------------------------------------------
disp('Using of the trained Neural Network...........DONE');
disp('Results/Outputs for the real guess via the trained Neural Network:');
trained
% Denormalization (inverse mapping, using the min/max of the targets B,
% not of the inputs A)
% ---------------------------------------------------------------------
trainedNorm = (max(B(:)) - min(B(:)))*trained + min(B(:));
trainedNorm
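For reference, the whole pipeline (normalize inputs, train one sigmoid neuron with the delta rule, normalize new inputs with the training range, denormalize outputs with the target range) can be sketched in plain Python. This is only an illustration of the structure, not a drop-in replacement for the script above, and the learning rate and epoch count are arbitrary choices; note that a single neuron can only fit a sigmoid-shaped curve, so its approximation of the square root stays rough:

```python
import math
import random

random.seed(0)

A = [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]   # training inputs
B = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]          # targets (sqrt of A)

# Min-max normalization of inputs and targets into [0, 1]
a_lo, a_hi = min(A), max(A)
b_lo, b_hi = min(B), max(B)
A2 = [(x - a_lo) / (a_hi - a_lo) for x in A]
B2 = [(y - b_lo) / (b_hi - b_lo) for y in B]

w_bias, w_in = random.uniform(-1, 1), random.uniform(-1, 1)
bias, lr = -1.0, 0.5   # bias input and learning rate (arbitrary)

for _ in range(5000):
    for x, t in zip(A2, B2):
        out = 1.0 / (1.0 + math.exp(-(bias * w_bias + x * w_in)))
        delta = t - out                 # delta rule on one sigmoid neuron
        w_bias += lr * bias * delta
        w_in += lr * x * delta

def predict(x):
    """Normalize with the TRAINING input range, denormalize with the TARGET range."""
    xn = (x - a_lo) / (a_hi - a_lo)
    out = 1.0 / (1.0 + math.exp(-(bias * w_bias + xn * w_in)))
    return b_lo + out * (b_hi - b_lo)

print(predict(25))   # rough estimate of sqrt(25); one neuron fits this poorly
```

Because the sigmoid output lies in (0,1), the denormalized predictions always stay inside the target range [1,10], which is one reason this model cannot extrapolate.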
3 Comments
Greg Heath on 12 Mar 2014 (edited 12 Mar 2014)
You didn't tell us whether or not you have the Neural Net Toolbox.
NNs can be excellent interpolators but terrible extrapolators.
Since the training inputs are in [1,100], don't expect a decent answer for an input of 200.
Did this code work on binary data?
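Greg's extrapolation point can be seen directly from the normalization arithmetic: an input of 200 maps far outside the [0,1] range the network was trained on. A quick check (in Python, for illustration):

```python
A_min, A_max = 1, 100          # range of the training inputs

def normalize(x):
    return (x - A_min) / (A_max - A_min)

print(normalize(100))  # 1.0 -> edge of the training range
print(normalize(200))  # ~2.01 -> far outside [0, 1]; never seen in training
```

Any prediction at such a point is pure extrapolation, which NNs handle badly.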
Greg Heath on 16 Mar 2014
To use the NN TBX see