Is it necessary to initialize the weights for retraining in MATLAB with nntool?

I am working on a speech recognition problem using nntool. I have found that, most of the time, the output after the first training attempt is not good, so I have to retrain.
My question is: do I need to reinitialize the weights before retraining?
What if I don't reinitialize and just train again? I actually found better results without reinitializing.
I want to know which approach is right, and why.
  1 Comment
Greg Heath on 9 Sep 2012
The short answer is that I typically make several runs (~10) with random weight initialization for each setting of the other parameters (e.g., number of hidden nodes). Poor results caused by converging to a high local minimum are best cured by reinitialization.
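The multiple-runs approach described above can be sketched at the command line (not in nntool). This is a hedged illustration, not Greg's exact procedure: the variables `x` and `t` (inputs and targets) and the choice of `patternnet` with 10 hidden nodes are assumptions for the example.

```matlab
% Sketch: several runs with fresh random weights, keeping the best net.
% Assumes x (inputs) and t (targets) are already in the workspace.
numRuns  = 10;
bestPerf = Inf;
bestNet  = [];
for i = 1:numRuns
    net = patternnet(10);          % 10 hidden nodes; vary in an outer loop
    net = configure(net, x, t);
    net = init(net);               % fresh random weights for every run
    [net, tr] = train(net, x, t);
    if tr.best_vperf < bestPerf    % keep the net with lowest validation error
        bestPerf = tr.best_vperf;
        bestNet  = net;
    end
end
```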


Accepted Answer

Greg Heath on 7 Sep 2012
Some creation functions like newff, newfit, newpr, fitnet, patternnet and feedforwardnet are self-initializing with random weights.
It is not clear whether you are succeeding by starting again with a new set of random weights, or just by increasing the number of epochs.
More information is needed.
  3 Comments
Greg Heath on 9 Sep 2012
Several caveats
1. I am not familiar with nntool so any advice I give re that should be accepted with a grain of salt. If you switch to the command line mode you can have more confidence in my advice.
2. Many of my answers are made w.r.t. my years of practical experience in NN design and, in retirement, my 8 years experience with obsoleted MATLAB functions like newff, newfit and newpr.
3. I have recently learned that several subtle changes have been made w.r.t. the current replacements of the new** functions: feedforwardnet, fitnet and patternnet.
4. In particular, the newer functions do not automatically initialize weights upon creation. Weights are initialized either via configure or via train. I have not been successful in deciphering the new source code.
5. Currently I don't have the time to make the few revealing numerical experiments.
6. So, right now, I am not sure what exactly happens when train is called after configure or when train is run in multiple steps.
7. I suggest you print out the weights before and after calls to configure and after 1 epoch of train; it shouldn't take much time.
8. The next step is find out exactly what the function revert does when thrown into the mix.
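The experiment suggested in item 7 might look like the following. This is a sketch under assumptions: `x` and `t` are placeholder data, and `fitnet(10)` is an arbitrary example network.

```matlab
% Check when weights actually get set (item 7 above); x, t assumed.
net = fitnet(10);
disp(net.IW)                  % before configure: typically empty
net = configure(net, x, t);
wBefore = getwb(net);         % full weight/bias vector after configure
net.trainParam.epochs = 1;
net = train(net, x, t);       % a single epoch of training
wAfter = getwb(net);
fprintf('max |weight change| after 1 epoch: %g\n', ...
        max(abs(wAfter - wBefore)));
```

Comparing `wBefore` against `getwb` after a call to `init` (or `revert`, per item 8) would show whether those calls replace or restore the weights.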
Hope this helps.
Greg

