
Thread Subject:
neural network

Subject: neural network

From: srishti

Date: 12 Mar, 2013 14:58:07

Message: 1 of 4

hello,
I am using the following code for pattern recognition with a neural network, but each time I run this program the accuracy changes. Can anybody suggest how to get a constant accuracy?
x=in;
t=tar;
net = patternnet(10)
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,t,y)

Subject: neural network

From: Greg Heath

Date: 13 Mar, 2013 01:00:09

Message: 2 of 4

"srishti" wrote in message <khnfpv$bo4$1@newscl01ah.mathworks.com>...
> hello,
> I am using the following code for pattern recognition with a neural network, but each time I run this program the accuracy changes. Can anybody suggest how to get a constant accuracy?
> x=in;
> t=tar;
> net = patternnet(10)
> net = train(net,x,t);
> view(net)
> y = net(x);
> perf = perform(net,t,y)

By default,

1. the data are randomly divided into training, validation, and test sets in the ratio 0.70/0.15/0.15
2. initial weights are randomly obtained from a subprogram

The easiest solution is to initialize the RNG at the beginning of the program.
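
A minimal sketch of that fix (the seed value is arbitrary; any fixed value works):

```matlab
rng(0)                     % seed the RNG once, before net creation and training
net = patternnet(10);
net = train(net, x, t);    % data division and weight init are now repeatable
y = net(x);
perf = perform(net, t, y)  % same perf on every run of the script
```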

Hope this helps.

Greg

Subject: neural network

From: srishti

Date: 13 Mar, 2013 13:44:07

Message: 3 of 4

Thanks for your help. But can you please tell me how to do that? I tried the following code, but it is still not working.

S1=1; % number of hidden neurons
S2=2; % number of output neurons (= number of classes)
[R,Q]=size(x);
a=0.3; % upper bound of the random weight range
b=-0.3; % lower bound of the random weight range
W1=a + (b-a) *rand(S1,R); % Weights between Input and Hidden Neurons
W2=a + (b-a) *rand(S2,S1); % Weights between Hidden and Output Neurons
b1=a + (b-a) *rand(S1,1); % Biases of Hidden Neurons
b2=a + (b-a) *rand(S2,1); % Biases of Output Neurons
x=mammoin;t=mammotar;
net = patternnet(10)
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,t,y)

Subject: neural network

From: Greg Heath

Date: 14 Mar, 2013 16:46:13

Message: 4 of 4

"srishti" wrote in message <khpvr7$qiu$1@newscl01ah.mathworks.com>...
> Thanks for your help. But can you please tell me how to do that?

Initialize the rng only once in the program before

1. Data division
2. Weight initialization

You don't have to know exactly where these occur. Just initialize the
RNG at the beginning of the program.

See

help rng
doc rng
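
To confirm that seeding actually makes runs repeatable, train twice from the same seed and compare performance (a sketch using the variable names from this thread; the seed is arbitrary):

```matlab
rng(4151941)                      % arbitrary fixed seed
net1 = patternnet(10);
net1 = train(net1, x, t);
perf1 = perform(net1, t, net1(x));

rng(4151941)                      % reset to the same seed
net2 = patternnet(10);
net2 = train(net2, x, t);
perf2 = perform(net2, t, net2(x));

isequal(perf1, perf2)             % identical division and initial weights
```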

Hope this helps.

Greg

P.S. Data division and weight initialization are performed at different points
in different NNTBX versions.

1. In very old versions random weights are automatically assigned at net creation.
However, you have to do the data division yourself.

net = newff( minmax(x), [H O] );

2. In the latest obsolete version (obsoleted in R2010b, NNET 7.0), for NEWFF and its
special forms (NEWFIT for regression & curve fitting; NEWPR for classification
& pattern recognition), BOTH weight initialization and data division are automatically
performed at net creation:
 
net = newpr( x, t, H );

help newpr
doc newpr

3. In the current version, the special forms of FEEDFORWARDNET (FITNET & PATTERNNET)
can be used. However, NEITHER weight initialization NOR data division is automatically performed at net creation. By default, BOTH automatically occur at the FIRST call of TRAIN. However,

a. If you are looping over multiple candidate designs, weights will not reinitialize at subsequent calls of TRAIN. Therefore, it is best to explicitly initialize weights using CONFIGURE before calling TRAIN, unless, for some reason, you want to continue training the same design.
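
A sketch of such a design loop (the candidate hidden-layer sizes are illustrative; CONFIGURE re-randomizes the weights on each pass so that the candidates start fresh):

```matlab
rng(0)                              % one seed for the whole search
Hvec = [5 10 20];                   % candidate numbers of hidden neurons
perf = zeros(size(Hvec));
for i = 1:numel(Hvec)
    net = patternnet(Hvec(i));
    net = configure(net, x, t);     % set dimensions and initialize weights
    net = train(net, x, t);
    perf(i) = perform(net, t, net(x));
end
[bestperf, ibest] = min(perf)       % pick the best-performing candidate
```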

b. I don't think that subsequent calls of TRAIN in a loop will automatically redivide the data. Although this is not critical when searching for well-performing weight configurations, I will check and reply if I am wrong.
