Path: news.mathworks.com!not-for-mail
From: "Greg Heath" <heath@alumni.brown.edu>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Using weights of a trained neural network
Date: Thu, 21 Mar 2013 20:09:06 +0000 (UTC)
Organization: The MathWorks, Inc.
Lines: 48
Message-ID: <kifpd2$gee$1@newscl01ah.mathworks.com>
References: <kicqhr$spo$1@newscl01ah.mathworks.com> <kie99m$ega$1@newscl01ah.mathworks.com> <kieq7u$svk$1@newscl01ah.mathworks.com>
Reply-To: "Greg Heath" <heath@alumni.brown.edu>
NNTP-Posting-Host: www-00-blr.mathworks.com
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: newscl01ah.mathworks.com 1363896546 16846 172.30.248.45 (21 Mar 2013 20:09:06 GMT)
X-Complaints-To: news@mathworks.com
NNTP-Posting-Date: Thu, 21 Mar 2013 20:09:06 +0000 (UTC)
X-Newsreader: MATLAB Central Newsreader 2929937
Xref: news.mathworks.com comp.soft-sys.matlab:791726

"Vito" wrote in message <kieq7u$svk$1@newscl01ah.mathworks.com>...
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <kie99m$ega$1@newscl01ah.mathworks.com>...
> > "Vito" wrote in message <kicqhr$spo$1@newscl01ah.mathworks.com>...
> > > I trained a neural network using the MATLAB Neural Network Toolbox, in particular using the nprtool command. After that, I exported a structure (called 'net') containing the information about the generated NN.
> > > 
> > > In this way, I created a working neural network that I can use as a classifier.
> > > This network has 200 inputs, 20 neurons in the first hidden layer, and 2 neurons in the last layer that provide a two-dimensional output.
> > > 
> > > What I want to do is use the weights in order to obtain the same results that I get using the sim() function.
> > > 
> > > In order to do this, I tried the following code:
> > > 
> > > y1 = tansig(net.IW{1} * input + net.b{1});
> > > Results = tansig(net.LW{2} * y1 + net.b{2});
> > > 
> > > Assuming that input is a one-dimensional array of 200 elements, the previous code would work if net.IW{1} were a 20x200 matrix (20 neurons, 200 weights).
> > > 
> > > The problem is that I noticed that size(net.IW{1}) returns unexpected values:
> > > 
> > > >> size(net.IW{1})
> > > 
> > >     ans =
> > > 
> > >     20   199
> > > 
> > > I got the same problem with a network with 10000 inputs. In this case, the result wasn't 20x10000, but something like 20x9384 (I don't remember the exact value).
> > > 
> > > So, the question is: how can I make the above code work?
> > 
> > You need to find the dimension bugs. Post your code and we may be able to help.
> > 
> > As far as replicating the net function goes, don't forget that IW acts on normalized inputs.
> > 
> > Hope this helps.
> > 
> > Greg
> 
> 
> I've solved it by removing the preprocessing and postprocessing functions.
> 
> You can see the solution here:
> http://stackoverflow.com/questions/15526112/porting-a-neural-network-trained-with-matlab-in-other-programming-languages/15537848#15537848

It is always satisfying to find a solution to a nagging problem.

However, doesn't it nag you that you don't know exactly what caused the problem in the first place?
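For anyone following along, here is a minimal sketch of how the net output can be reproduced by hand while leaving the default processing in place, rather than removing it. The field names assume a standard two-layer net created by nprtool with mapminmax as the active processing function; verify against your own net's processFcns, which may list additional processing steps.

% Sketch only: reproduce y = sim(net, x) by hand for a default
% two-layer nprtool net, assuming mapminmax is the sole input and
% output processing function (check net.inputs{1}.processFcns).
xn = mapminmax('apply', x, net.inputs{1}.processSettings{1});     % normalize input
h  = tansig(net.IW{1,1}*xn + net.b{1});                           % hidden layer
yn = tansig(net.LW{2,1}*h  + net.b{2});                           % output layer
y  = mapminmax('reverse', yn, net.outputs{2}.processSettings{1}); % un-normalize

If the column count of net.IW{1,1} still differs from the raw input dimension, compare size(xn, 1) with size(x, 1) after the 'apply' step; the processed input is what the weight matrix must match.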

Greg