How to train a neural network using a 3D matrix or several 2D matrices
I am trying to write a simple sound source localization code. Each wave file is passed through a 64-channel gammatone filterbank and then enframed into 20 ms frames with 10 ms overlap. ITD and ILD are calculated for each frame, so the result is one matrix of size 64 (number of channels) x F (number of frames, which differs depending on the wave length) for ITD and another one for ILD. The azimuth grid consists of (+10, -10, +20, -20, ..., +80, -80). My problem is how to organize the data as input to the neural network: 10 ITD matrices and 10 ILD matrices for each azimuth.
Accepted Answer
Greg Heath
on 29 Nov 2014 (edited 29 Nov 2014)
I cannot understand your post. However, if you have N examples of I-dimensional input column vectors and the corresponding N examples of O-dimensional target column vectors, then
[ I N ] = size(input)
[ O N ] = size(target)
and the simplest code is (tr is the training record):
net = fitnet;                                 % defaults to one hidden layer of 10 neurons
[net, tr, output, err] = train(net, input, target);  % err used instead of error, which shadows a built-in
NMSE = mse(err)/mean(var(target',1))          % normalized mean-square error in [0,1]
For details:
tr = tr % no semicolon, so the training record is printed
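For example, a few of the training-record fields and a standard plot (a sketch; these assume the usual Neural Network Toolbox training record):

```matlab
tr.trainInd       % indices of the examples used for training
tr.best_epoch     % epoch with the best validation performance
plotperform(tr)   % training/validation/test performance vs. epoch
```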
Hope this helps.
Thank you for formally accepting my answer
Greg
P.S. If single inputs and/or outputs are not in column-vector form, columnize the corresponding r-by-c matrix using the colon operator (:). Then combine to obtain the format above (e.g., I = r*c).
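Applied to the original question, a minimal sketch (assuming hypothetical variables ITD and ILD, each 64-by-F for one recording, and a scalar azimuth label az for that recording) might look like:

```matlab
% Hypothetical sketch: build one training column per frame.
% ITD, ILD are 64 x F matrices for one recording; az is its azimuth label.
F = size(ITD, 2);                  % number of frames in this recording
input  = [ITD; ILD];               % 128 x F: stack ITD and ILD features per frame
target = repmat(az, 1, F);         % 1 x F: same azimuth label for every frame
% Concatenate the input/target pairs of all recordings along dimension 2;
% since each column is one example, files of different lengths mix freely.
```

Treating each frame as one example sidesteps the variable number of frames per file, which is what makes a fixed-size 3D array awkward here.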