http://www.mathworks.com/matlabcentral/newsreader/view_thread/327472
MATLAB Central Newsreader: neural network

Tue, 12 Mar 2013 14:58:07 +0000
neural network
http://www.mathworks.com/matlabcentral/newsreader/view_thread/327472#899817
srishti
hello,
I am using the following code for pattern recognition with a neural network, but each time I run the program the accuracy changes. Can anybody suggest how to get a constant accuracy?

x = in;
t = tar;
net = patternnet(10)
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,t,y)

Wed, 13 Mar 2013 01:00:09 +0000
Re: neural network
http://www.mathworks.com/matlabcentral/newsreader/view_thread/327472#899869
Greg Heath
"srishti" wrote in message <khnfpv$bo4$1@newscl01ah.mathworks.com>...
> hello,
> I am using the following code for pattern recognition with a neural network, but each time I run the program the accuracy changes. Can anybody suggest how to get a constant accuracy?
> x = in;
> t = tar;
> net = patternnet(10)
> net = train(net,x,t);
> view(net)
> y = net(x);
> perf = perform(net,t,y)

By default,

1. the data is randomly divided into training, validation and test sets according to the ratio 0.7/0.15/0.15
2. the initial weights are randomly obtained from a subprogram

The easiest solution is to initialize the RNG at the beginning of the program.
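For example, a minimal sketch of that fix (assuming the same `in`/`tar` variables as the original post; any fixed seed works):

```matlab
% Seed the RNG once, before anything else, so that both the random
% data division and the random weight initialization repeat exactly.
rng(0)                     % or rng('default')

x = in;
t = tar;
net = patternnet(10);
net = train(net,x,t);      % division & initialization now use the seeded RNG
y = net(x);
perf = perform(net,t,y)    % the same value on every run
```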
Hope this helps.

Greg

Wed, 13 Mar 2013 13:44:07 +0000
Re: neural network
http://www.mathworks.com/matlabcentral/newsreader/view_thread/327472#899930
srishti
Thanks for your help. But can you please tell me how to do that? I tried the following code, but it is still not working.

S1 = 1;                      % number of hidden layers
S2 = 2;                      % number of output neurons (= number of classes)
[R,Q] = size(x);
a = -0.3;                    % define the range [a,b] of the random weights
b =  0.3;
W1 = a + (b-a)*rand(S1,R);   % weights between input and hidden neurons
W2 = a + (b-a)*rand(S2,S1);  % weights between hidden and output neurons
b1 = a + (b-a)*rand(S1,1);   % biases of the hidden neurons
b2 = a + (b-a)*rand(S2,1);   % biases of the output neurons
x = mammoin; t = mammotar;
net = patternnet(10)
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,t,y)

Thu, 14 Mar 2013 16:46:13 +0000
Re: neural network
http://www.mathworks.com/matlabcentral/newsreader/view_thread/327472#900030
Greg Heath
"srishti" wrote in message <khpvr7$qiu$1@newscl01ah.mathworks.com>...
> Thanks for your help. But can you please tell me how to do that?

Initialize the RNG only once in the program, before

1. Data division
2. Weight initialization

You don't have to know exactly where these occur. Just initialize the
RNG at the beginning of the program.

See

help rng
doc rng

Hope this helps.

Greg
<br>
P.S. Data division and weight initialization are performed at different points
in different NNTBX versions.

1. In very old versions, random weights are automatically assigned at net creation;
however, you have to do the data division yourself:

net = newff( minmax(x), [H O] );

2. In the latest obsolete version (obsoleted in R2010b, NNET 7.0), NEWFF and its
special forms (NEWFIT for regression & curve fitting; NEWPR for classification
& pattern recognition) perform BOTH weight initialization and data division
automatically at net creation:

net = newpr( x, t, H );

help newpr
doc newpr

3. In the current version, the special forms of FEEDFORWARDNET (FITNET & PATTERNNET)
can be used. However, NEITHER weight initialization NOR data division is automatically performed at net creation. By default, BOTH occur automatically at the FIRST call of TRAIN. However,
<br>
a. If you are looping over multiple candidate designs, weights will not be reinitialized at subsequent calls of TRAIN. Therefore, unless for some reason you want to continue training the same design, it is best to explicitly initialize the weights with CONFIGURE before calling TRAIN.

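A sketch of that multi-design pattern (assuming inputs `x`, targets `t`; the value of `Ntrials` is illustrative):

```matlab
rng(0)                           % reproducible experiment overall
Ntrials = 10;
perf = zeros(Ntrials,1);
for i = 1:Ntrials
    net = patternnet(10);
    net = configure(net, x, t);  % explicitly (re)initialize the weights
    net = train(net, x, t);
    y = net(x);
    perf(i) = perform(net, t, y);
end
```
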
b. I don't think that subsequent calls of TRAIN in a loop will automatically redivide the data. Although this is not critical when searching for well-performing weight configurations, I will check and reply if I am wrong.
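
If you want the division itself held fixed regardless of what TRAIN does, one option is to specify the split explicitly with DIVIDEIND (a sketch; the 70/15/15 split below mirrors the defaults and is illustrative):

```matlab
[~, Q] = size(x);                        % Q = number of samples
rp   = randperm(Q);                      % one fixed permutation
nTrn = round(0.70*Q);
nVal = round(0.15*Q);
net = patternnet(10);
net.divideFcn = 'divideind';             % divide by explicit index lists
net.divideParam.trainInd = rp(1:nTrn);
net.divideParam.valInd   = rp(nTrn+1 : nTrn+nVal);
net.divideParam.testInd  = rp(nTrn+nVal+1 : Q);
net = train(net, x, t);                  % the same split on every call
```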