http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462
MATLAB Central Newsreader  difference between numInputs and neurons in inputlayer?
© 1994-2015 by MathWorks, Inc.

Wed, 29 Feb 2012 18:36:12 +0000
difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#868419
preben
I am going to use<br>
net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect)<br>
to create a custom neural network.<br>
<br>
but I don't understand what numInputs means, or the difference between numInputs and the number of neurons in the input layer.<br>
<br>
Does numLayers include all layers (input layer + hidden layer + output layer)?<br>
Can anyone explain this?

Fri, 02 Mar 2012 09:04:31 +0000
Re: difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#868592
Greg Heath
"preben" wrote in message <jilr6s$8kc$1@newscl01ah.mathworks.com>...<br>
> I am going to use<br>
> net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect)<br>
> to create a custom neural network.<br>
> <br>
> but I don't understand what numInputs means, or the difference between numInputs and the number of neurons in the input layer.<br>
> <br>
> Does numLayers include all layers (input layer + hidden layer + output layer)?<br>
> Can anyone explain this?<br>
<br>
There is a difference between layers of nodes and layers of weights. The term "layer" <br>
in most neural network literature (including MATLAB's "numLayers") refers to weight layers. <br>
<br>
For a typical FFMLP there are 3 node layers (input, hidden, output) but only 2 weight layers (input-hidden and hidden-output).<br>
<br>
MATLAB's use of "numInputs" and "numOutputs" is interpreted in the vector sense: <br>
there is one vector input and one vector output.<br>
<br>
Hidden and output nodes are associated with activation functions (a.k.a. artificial neurons), <br>
whereas the input nodes are associated with applied signals and are characterized as <br>
"fan-in units". To be perfectly clear, there are no neurons in the input layer.<br>
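The node-layer/weight-layer distinction can be made concrete by counting parameters. A minimal sketch (in Python rather than MATLAB; the sizes are illustrative):<br>

```python
# Node layers vs weight layers in a feed-forward MLP.
# A net with node layer sizes I-H-O (e.g. 3-4-3) has 3 node layers
# but only 2 weight layers: input-hidden and hidden-output.
sizes = [3, 4, 3]                       # node layers: input, hidden, output

num_node_layers = len(sizes)            # 3
num_weight_layers = len(sizes) - 1      # 2

# Weight layer w holds sizes[w+1] x (sizes[w] + 1) parameters (incl. bias).
Nw = sum(sizes[w + 1] * (sizes[w] + 1) for w in range(num_weight_layers))
print(num_node_layers, num_weight_layers, Nw)   # 3 2 31
```

The input node layer contributes no parameters of its own, which is why it holds no neurons.<br>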
<br>
Example:<br>
<br>
clear all, close all, clc<br>
p = randn(3,100); <br>
t = exp(p).*cos(p); <br>
[ I N ] = size(p) % [ 3 100]<br>
[ O N ] = size(t) % [3 100]<br>
Neq = N*O % 300 No. of training equations<br>
Hub = floor((Neq-O)/(I+O+1)) % 42 Neq >= Nw gives this upper bound on H<br>
H = round(Hub/10) % 4 Neq ~ 10*Nw (want Neq >> Nw)<br>
Nw = (I+1)*H+(H+1)*O % 31<br>
% I-H-O = 3-4-3<br>
net = newff(p,t,H) % No semicolon to display characteristics.<br>
<br>
% Now investigate the contents of the net's dimensions, connections,<br>
% subobjects, weight and bias values.<br>
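As a cross-check, the sizing arithmetic above is language-agnostic; redoing it in Python with the same I, O, and N gives the same numbers:<br>

```python
import math

I, O, N = 3, 3, 100                        # input dim, output dim, sample count
Neq = N * O                                # 300 training equations
Hub = math.floor((Neq - O) / (I + O + 1))  # 42: Neq >= Nw gives this bound on H
H = round(Hub / 10)                        # 4: aim for Neq ~ 10*Nw
Nw = (I + 1) * H + (H + 1) * O             # 31 weights and biases
print(Neq, Hub, H, Nw)                     # 300 42 4 31
```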
<br>
Hope this helps.<br>
<br>
Greg

Fri, 02 Mar 2012 16:06:17 +0000
Re: difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#868648
preben
"Greg Heath" <heath@alumni.brown.edu> wrote in message <jiq2ev$99v$1@newscl01ah.mathworks.com>...<br>
> [earlier messages quoted in full; snipped]<br>
<br>
Thanks for your reply.<br>
I understand now.<br>
I have a question similar to one someone asked several years ago:<br>
<br>
"I am trying to design a 6-4-1 network. The first three input nodes<br>
(i.e. 1-3) are connected to the first two nodes (i.e. 1-2) in the<br>
hidden layer, while the last 3 input nodes (i.e. 4-6) are fully<br>
connected to the last two nodes in the hidden layer (3-4). All<br>
four hidden nodes are connected to the output node. There is no<br>
connection between input nodes (1-3) and hidden nodes (3-4), and<br>
likewise none between input nodes (4-6) and hidden nodes<br>
(1-2)."<br>
<br>
How should I set the parameters of the network function?<br>
net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect)<br>
<br>
I have another question:<br>
can I train the net using [net,TR] = trainlm(net,TR,trainV,valV,testV)?<br>
If so, how should I initialize the TR parameter?<br>
<br>
thanks in advance

Fri, 02 Mar 2012 23:48:38 +0000
Re: difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#868704
Greg Heath
"preben" wrote in message <jiqr5p$qnv$1@newscl01ah.mathworks.com>...<br>
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <jiq2ev$99v$1@newscl01ah.mathworks.com>...<br>
> > [earlier messages quoted in full; snipped]<br>
> <br>
> thanks for your reply.<br>
> I understand now.<br>
> I have a question similar to one someone asked several years ago:<br>
> <br>
> "I am trying to design a 6-4-1 network. The first three input nodes<br>
> (i.e. 1-3) are connected to the first two nodes (i.e. 1-2) in the<br>
> hidden layer, while the last 3 input nodes (i.e. 4-6) are fully<br>
> connected to the last two nodes in the hidden layer (3-4). All<br>
> four hidden nodes are connected to the output node. There is no<br>
> connection between input nodes (1-3) and hidden nodes (3-4), and<br>
> likewise none between input nodes (4-6) and hidden nodes<br>
> (1-2)."<br>
> <br>
> How should I set the parameters of the network function?<br>
> net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect)<br>
<br>
Not exactly sure. I always start with full connections. Then, if needed, I <br>
SEQUENTIALLY delete ineffective input nodes: the ones ranked last by the <br>
decrease in performance when the inputs to that node are scrambled.<br>
<br>
You can probably figure this out by looking at the properties of my 3-4-3 <br>
example.<br>
<br>
Will respond later.<br>
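The scrambling test Greg describes is essentially permutation importance. A minimal sketch of the idea (in Python; the predict function is a hypothetical stand-in for a trained net, not Greg's actual procedure):<br>

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(p):
    # hypothetical stand-in for a trained net: only inputs 0 and 1 matter
    return 2.0 * p[0] - 3.0 * p[1]

p = rng.standard_normal((3, 100))    # I x N input matrix
t = predict(p)                       # targets

def mse(a, b):
    return float(np.mean((a - b) ** 2))

base = mse(predict(p), t)            # 0.0 for this toy model

# Scramble one input row at a time; a large error increase marks an
# important input, a negligible one marks a candidate for deletion.
importance = []
for i in range(p.shape[0]):
    ps = p.copy()
    ps[i] = rng.permutation(ps[i])   # destroy input i's relationship to t
    importance.append(mse(predict(ps), t) - base)

ranking = np.argsort(importance)     # least important first
print(ranking[0])                    # 2: the unused input ranks last
```

Deleting inputs one at a time (sequentially, re-ranking after each deletion) matters because inputs can be correlated.<br>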
<br>
> I have another question.<br>
> Can I train the net using [net,TR] = trainlm(net,TR,trainV,valV,testV)?<br>
> If so, how should I initialize the TR parameter?<br>
> <br>
> thanks in advance<br>
<br>
No. If you read the documentation<br>
<br>
help trainlm<br>
doc trainlm<br>
<br>
you will see that trainlm is called by train, which automatically initializes <br>
all of those inputs.<br>
<br>
Hope this helps.<br>
<br>
Greg

Mon, 05 Mar 2012 11:23:13 +0000
Re: difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#868907
preben
Thanks Greg.<br>
If I cannot use trainlm directly, is it possible to use different data to train the net? I mean, use different data for training, validation and testing to get the performance (plotperform).<br>
<br>
liu<br>
<br>
"Greg Heath" <heath@alumni.brown.edu> wrote in message <jirm8m$ss0$1@newscl01ah.mathworks.com>...<br>
> [earlier messages quoted in full; snipped]<br>

Wed, 07 Mar 2012 00:44:51 +0000
Re: difference between numInputs and neurons in inputlayer?
http://www.mathworks.com/matlabcentral/newsreader/view_thread/317462#869162
Greg Heath
<br>
CORRECTED FOR THE HEINOUS SIN OF TOP-POSTING!<br>
<br>
On Mar 5, 6:23 am, "preben " <lzs19971...@163.com> wrote:<br>
> "Greg Heath" <he...@alumni.brown.edu> wrote in message <jirm8m$ss...@newscl01ah.mathworks.com>...<br>
> > "preben" wrote in message <jiqr5p$qn...@newscl01ah.mathworks.com>...<br>
> > > "Greg Heath" <he...@alumni.brown.edu> wrote in message <jiq2ev$99...@newscl01ah.mathworks.com>...<br>
> > > > [earlier messages quoted in full; snipped]<br>
><br>
><br>
> > > thanks for your reply.<br>
> > > I understand now.<br>
> > > I have a question similar to one someone asked several years ago:<br>
><br>
> > > "I am trying to design a 6-4-1 network. The first three input nodes<br>
> > > (i.e. 1-3) are connected to the first two nodes (i.e. 1-2) in the<br>
> > > hidden layer, while the last 3 input nodes (i.e. 4-6) are fully<br>
> > > connected to the last two nodes in the hidden layer (3-4). All<br>
> > > four hidden nodes are connected to the output node. There is no<br>
> > > connection between input nodes (1-3) and hidden nodes (3-4), and<br>
> > > likewise none between input nodes (4-6) and hidden nodes<br>
> > > (1-2)."<br>
><br>
> > > How should I set the parameters of the network function?<br>
> > > net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect)<br>
><br>
> > Not exactly sure. I always start with full connections. Then if needed, I<br>
> > SEQUENTIALLY delete ineffective input nodes that are ranked last by the<br>
> > decrease in performance when the inputs to that node are scrambled.<br>
><br>
> > Can probably figure this out by looking at the properties of my 3-4-3<br>
> > example.<br>
><br>
> > Will respond later.<br>
<br>
I guess the only way to do this is to define 2 inputs and 3 weight layers.<br>
The first two weight layers are in parallel, and each is connected to one<br>
of the inputs.<br>
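This layout can be sketched as the connection matrices that network() expects. The matrices below are my reading of the 2-input/3-layer suggestion, shown as NumPy arrays for illustration; in MATLAB they would be the arguments to network(2,3,...), with layer sizes set afterwards:<br>

```python
import numpy as np

# One reading of the 2-input / 3-weight-layer layout for the 6-4-1 net,
# written as the boolean matrices network(numInputs, numLayers, ...) expects.
numInputs = 2   # split the 6 inputs into two 3-element input vectors
numLayers = 3   # two parallel 2-neuron hidden layers + one output layer

# biasConnect(i) = 1 if layer i has a bias            (numLayers x 1)
biasConnect = np.array([[1], [1], [1]])
# inputConnect(i, j) = 1 if input j feeds layer i     (numLayers x numInputs)
inputConnect = np.array([[1, 0],
                         [0, 1],
                         [0, 0]])
# layerConnect(i, j) = 1 if layer j feeds layer i     (numLayers x numLayers)
layerConnect = np.array([[0, 0, 0],
                         [0, 0, 0],
                         [1, 1, 0]])
# outputConnect(i) = 1 if layer i is a network output (1 x numLayers)
outputConnect = np.array([[0, 0, 1]])

# the two hidden layers are parallel: neither feeds the other
assert layerConnect[0, 1] == 0 and layerConnect[1, 0] == 0
# each input feeds exactly one hidden layer
print(inputConnect.sum(axis=0))   # [1 1]
```

In MATLAB this would look roughly like net = network(2,3,biasConnect,inputConnect,layerConnect,outputConnect) followed by net.inputs{1}.size = 3, net.inputs{2}.size = 3, net.layers{1}.size = 2, net.layers{2}.size = 2, net.layers{3}.size = 1; treat this as a sketch to verify against the documentation, not tested code.<br>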
<br>
> > > I have another question.<br>
> > > Can I train the net using [net,TR] = trainlm(net,TR,trainV,valV,testV)?<br>
> > > If so, how should I initialize the TR parameter?<br>
><br>
> > No. If you would read the documentation<br>
><br>
> > help trainlm<br>
> > doc trainlm<br>
><br>
> > you will clearly see that trainlm is called by train which automatically initializes<br>
> > all of the inputs.<br>
><br>
> Thanks Greg.<br>
> if I cannot use trainlm directly,<br>
<br>
Then, like everyone else, use it indirectly via train.<br>
<br>
> Is it possible to use different data to train the net? I mean, use different data for training, validation and testing to get the performance (plotperform).<br>
<br>
Possible? That is the default: the samples are randomly selected with a 70/15/15<br>
division ratio. See the documentation for a different selection method and/or ratio.<br>
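The default random division amounts to a shuffled 70/15/15 index split; a language-agnostic sketch of the bookkeeping (in MATLAB, the ratios live in net.divideParam):<br>

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                  # number of samples (columns of p and t)

idx = rng.permutation(N)                 # random selection of indices
n_train = int(0.70 * N)
n_val = int(0.15 * N)

train_idx = idx[:n_train]                # 70 samples for weight updates
val_idx = idx[n_train:n_train + n_val]   # 15 for validation (early stopping)
test_idx = idx[n_train + n_val:]         # 15 held out for the final estimate

print(len(train_idx), len(val_idx), len(test_idx))   # 70 15 15
```

The three subsets are disjoint, so the test-set performance is an unbiased estimate of generalization.<br>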
<br>
Hope this helps.<br>
<br>
Greg