
Thread Subject:
Neural Networks weights and bias help

Subject: Neural Networks weights and bias help

From: hasan batuk

Date: 13 Aug, 2010 14:47:05

Message: 1 of 11

Hi, I am trying to learn the NN toolbox. I tried to create a network that multiplies its input by 3 and returns the result. For simplicity, I made the transfer function linear as well. It works, but then I tried to reproduce the same result from the bias and weight values:
x -> input
y -> output

a = [ (w1*x + b1) * w2 ] + b2

However, it ends up the same as the input value. I am really confused; it looks very simple, but I can't find what I am missing. Thanks for your time. The code is below.


P = 1:4:200;  % training set
T = 3*P;      % target set

net = newff(P,T,1);
net.layers{1}.transferFcn = 'purelin';  % make the hidden transfer function linear

net = train(net,P,T);

y = sim(net,101)  % feed a number, say 101, into the net; 303 comes out as expected

% Trying to reach the same result from the weight and bias values.
% It ends up with 101, not 303.

a1 = net.IW{1,1}*101 + net.b{1};  % output of the first (hidden) layer

a2 = net.LW{2,1}*a1 + net.b{2}    % output of the second (output) layer
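[A possible reading of what goes wrong, sketched in MATLAB. This assumes the R2010-era newff defaults, under which mapminmax normalizes both inputs and targets to [-1, 1]; the normalization settings are recomputed here from P and T, which is deterministic for mapminmax.]

```matlab
% newff wraps the layers in input/output processing functions, so the
% stored weights act on normalized data, not on the raw inputs.
[pn, ps] = mapminmax(P);            % recover the input normalization settings
[tn, ts] = mapminmax(T);            % recover the target normalization settings

xn = mapminmax('apply', 101, ps);   % normalize the input first
a1 = net.IW{1,1}*xn + net.b{1};     % hidden layer (purelin)
a2 = net.LW{2,1}*a1 + net.b{2};     % output layer, still in normalized units
y  = mapminmax('reverse', a2, ts)   % un-normalize: approximately 303

% Side note: because T = 3*P, the normalized targets equal the normalized
% inputs, so in normalized space the trained net is close to the identity
% map -- which is why the raw weight arithmetic returns ~101.
```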

Subject: Neural Networks weights and bias help

From: Greg Heath

Date: 14 Aug, 2010 00:20:04

Message: 2 of 11

On Aug 13, 10:47 am, "hasan batuk" <kapadokya...@yahoo.com> wrote:
> Hi, i am trying to learn NN toolbox. I tried to creat a networks that multiply the number by 3 and gives as output. For simplicity, i made the transfer function also linear. It works well but i tried to reach the same result by using the bias and weight values.
> x ->  input
> y-> output
>
> a= [ (w1*x + b1) * w2 ]+b2
>
> but it ends up with same as the input value. I am really confused about it. it looks very simple but i couldnt find what i miss. Thanks for your time. the code is below
>
> P=[1:4:200]; % training set
> T=P*3;       % target set
>
> net=newff(P,T,1);
> net.layers{1}.transferFcn = 'purelin';   % making transfer func. as linear

You can include this in the previous command (via the TF argument to newff).

> net=train(net,P,T);
>
> y=sim(net,101)   % giving a number say 101 as an input to system and taking 303 as output as expected.
>
> % trying to reach same result by using weight and bias values. it ends up with 101, not 303.
>
> a1=(net.iw{1,1}*101)+net.b{1};  % output of first layer
>
> a2=(net.lw{2,1}*a1)+net.b{2}    % output of second layer

help newff

Carefully read the explanations of IPF and OPF.

Hope this helps.

Greg

Subject: Neural Networks weights and bias help

From: hasan batuk

Date: 15 Aug, 2010 17:52:04

Message: 3 of 11

Greg Heath <heath@alumni.brown.edu> wrote in message <a3167516-88f8-48ef-9669-f5a51bf249d5@w30g2000vbs.googlegroups.com>...
> On Aug 13, 10:47 am, "hasan batuk" <kapadokya...@yahoo.com> wrote:

Thanks for the reply, Greg, but I still can't find why it gives the same number as the output when I calculate it from the weights and biases.
At first I thought it was because of the mapminmax function, but I used input and target sets between [-1, 1] and still got the same result. What else can you suggest?
Thanks

Subject: Neural Networks weights and bias help

From: Greg Heath

Date: 16 Aug, 2010 00:57:51

Message: 4 of 11

On Aug 15, 1:52 pm, "hasan batuk" <kapadokya...@yahoo.com> wrote:
> Greg Heath <he...@alumni.brown.edu> wrote in message <a3167516-88f8-48ef-9669-f5a51bf24...@w30g2000vbs.googlegroups.com>...
> > On Aug 13, 10:47 am, "hasan batuk" <kapadokya...@yahoo.com> wrote:
>
> Thanks for the reply Greg, but i still couldnt find why it gives the same number as output when i calculate it by using weights and bias.
> At first i thought it is cus of mapmin function, but i used an input and target set between [-1, 1], but still get the same result. What else can u suggest me?
> Thanks

Disable IPF and OPF.

Greg

Subject: Neural Networks weights and bias help

From: hasan batuk

Date: 16 Aug, 2010 18:51:04

Message: 5 of 11


> disable IPF and OPF.
>
> Greg

How can I disable them? I tried using [] for the functions, but that gave errors.
Thanks.
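[For reference, one approach that is often suggested for this: clear the processing-function lists on the network object rather than passing [] to newff. The property names here are taken from the R2010a network-object documentation and are an assumption, not verified against this poster's release.]

```matlab
net = newff(P, T, 1);
net.inputs{1}.processFcns  = {};   % remove input preprocessing (mapminmax, etc.)
net.outputs{2}.processFcns = {};   % remove output postprocessing
net.layers{1}.transferFcn  = 'purelin';
net = train(net, P, T);

% With no processing functions, the raw weight/bias arithmetic should
% now agree with sim directly:
a2 = net.LW{2,1}*(net.IW{1,1}*101 + net.b{1}) + net.b{2}
```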

Subject: Neural Networks weights and bias help

From: Greg Heath

Date: 17 Aug, 2010 05:30:15

Message: 6 of 11

On Aug 16, 2:51 pm, "hasan batuk" <kapadokya...@yahoo.com> wrote:
> > disable IPF and OPF.
>
> > Greg
>
> How can i disable them? i tried to use [] as functions but it gave errors.
> Thanks.

I can't help you with that. I am still using the obsolete version without those bells and whistles.

Sorry.

Ask one of the MATLAB crew who post on NNs.

Greg

Subject: Neural Networks weights and bias help

From: hasan batuk

Date: 17 Aug, 2010 14:33:06

Message: 7 of 11

Thanks, Greg.
Actually, what I am trying to do is get the outputs of the hidden layer before they go into the output layer.
So I thought that if I know the weights and biases, I can calculate them; no other method comes to mind. Can you suggest a better way to get the hidden-layer outputs before they enter the output layer? Does MATLAB have a specific function or something?
Thanks again

Subject: Neural Networks weights and bias help

From: Greg Heath

Date: 18 Aug, 2010 02:57:23

Message: 8 of 11

On Aug 17, 10:33 am, "hasan batuk" <kapadokya...@yahoo.com> wrote:
> Thanks, Greg.
> Actually what i am trying to do is to find the outputs after the hidden layer, before they go in to output layer.
> So i thought if i know weight and bias, i can calculate it. no other methods come to my mind, can u suggest me a better way to take the outputs of hidden layer before they get in to output layer? Does matlab have a specific function or sth?

No.

You will have to take the trained weights and construct a new net with only the first layer.
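[A hedged sketch of that idea: rather than literally building a second network object, the hidden-layer activations can be computed from the extracted weights, provided the input processing is applied first. The mapminmax settings are recomputed from P here, which is deterministic.]

```matlab
[~, ps] = mapminmax(P);                    % input-normalization settings
Xn = mapminmax('apply', X, ps);            % X = inputs to inspect, one per column
Z  = net.IW{1,1}*Xn + net.b{1};            % net input to the hidden layer
H  = feval(net.layers{1}.transferFcn, Z);  % hidden activations ('purelin' in this thread)
```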

You still have to solve your previous problem.
Either
find out how to disable mapminmax,
or
find out how to include the effect of mapminmax in your hand calculations.

If you fail, perhaps posting details of the alternatives you tried will result in a knowledgeable response.

Hope this helps.

Greg

Subject: Neural Networks weights and bias help

From: Renan

Date: 12 Dec, 2012 14:58:07

Message: 9 of 11

Please, I'm in the same situation. Did you figure it out?

Subject: Neural Networks weights and bias help

From: Greg Heath

Date: 13 Dec, 2012 13:23:08

Message: 10 of 11

"Renan" wrote in message <kaa61v$hau$1@newscl01ah.mathworks.com>...
> Please, I'm in the same situation. Did you figure it out?

What EXACTLY do you want to do?

What version of MATLAB do you have?

Please post the documentation you get from the command

help newff

Greg

Subject: Neural Networks weights and bias help

From: Murugan Solaiyappan

Date: 20 Dec, 2012 09:54:08

Message: 11 of 11

> What EXACTLY do you want to do?

I have one year of stock data.
My 12 inputs are fuzzified data.
My 12 outputs are also fuzzified data.
I want to compare the original data with the neural network results, and I also want to predict the stock data.
 
> What version of MATLAB do you have?
MATLAB version: MATLAB Version 7.10.0.499 (R2010a)
> Please post the documentation you get from the command
>
> help newff
NEWFF Create a feed-forward backpropagation network.
 
   Syntax
 
     net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)
 
   Description
 
     NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes,
       P - RxQ1 matrix of Q1 representative R-element input vectors.
       T - SNxQ2 matrix of Q2 representative SN-element target vectors.
       Si - Sizes of N-1 hidden layers, S1 to S(N-1), default = [].
             (Output layer size SN is determined from T.)
       TFi - Transfer function of ith layer. Default is 'tansig' for
             hidden layers, and 'purelin' for output layer.
       BTF - Backprop network training function, default = 'trainlm'.
       BLF - Backprop weight/bias learning function, default = 'learngdm'.
       PF - Performance function, default = 'mse'.
       IPF - Row cell array of input processing functions.
             Default is {'fixunknowns','removeconstantrows','mapminmax'}.
       OPF - Row cell array of output processing functions.
             Default is {'removeconstantrows','mapminmax'}.
       DDF - Data division function, default = 'dividerand';
     and returns an N layer feed-forward backprop network.
 
     The transfer functions TF{i} can be any differentiable transfer
     function such as TANSIG, LOGSIG, or PURELIN.
 
     The training function BTF can be any of the backprop training
     functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.
 
     *WARNING*: TRAINLM is the default training function because it
     is very fast, but it requires a lot of memory to run. If you get
     an "out-of-memory" error when training try doing one of these:
 
     (1) Slow TRAINLM training, but reduce memory requirements, by
         setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
     (2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
     (3) Use TRAINRP which is slower but more memory efficient than TRAINBFG.
 
     The learning function BLF can be either of the backpropagation
     learning functions such as LEARNGD, or LEARNGDM.
 
     The performance function can be any of the differentiable performance
     functions such as MSE or MSEREG.
 
   Examples
 
     load simplefit_dataset
     net = newff(simplefitInputs,simplefitTargets,20);
     net = train(net,simplefitInputs,simplefitTargets);
     simplefitOutputs = sim(net,simplefitInputs);
 
   Algorithm
 
     Feed-forward networks consist of Nl layers using the DOTPROD
     weight function, NETSUM net input function, and the specified
     transfer functions.
 
     The first layer has weights coming from the input. Each subsequent
     layer has a weight coming from the previous layer. All layers
     have biases. The last layer is the network output.
 
     Each layer's weights and biases are initialized with INITNW.
 
     Adaption is done with TRAINS which updates weights with the
     specified learning function. Training is done with the specified
     training function. Performance is measured according to the specified
     performance function.
 
   See also newcf, newelm, sim, init, adapt, train, trains

    Reference page in Help browser
       doc newff
>
> Greg
