Why am I getting different performance results from a neural network trained with 100% train, 0% validation, 0% testing?

To train a neural network, I have split the data into 100% training, 0% validation, and 0% testing. I expect the performance results of the network to be the same across multiple runs, because I am using exactly the same data for training every time.

Accepted Answer

Kerri Keng on 17 Jan 2011
This behavior is expected in Neural Network Toolbox (R2009b): even when the data set is split as 100% training, 0% validation, and 0% testing, the performance result of the network can vary across multiple runs.

There are two ways that randomness can creep into the training of a neural network. The first source is how the training, testing, and validation sets are allotted. The second source is how the initial weights and biases for the network are set up. Some initial values for these parameters must be present before iterative training takes place, and since it is rarely clear which initial values are "best", some degree of randomness is needed to explore the space of "good guesses". In the case of feedforward networks, the Nguyen-Widrow method is used; it takes some a priori information into account to narrow down the range of "good" initial values and then randomly picks the actual initial values from that narrowed subset. Typing "edit initnw" at the MATLAB command prompt shows that INITNW uses RAND.

As a workaround, to get the same performance result every time, reset the default RandStream before training as follows:
stream = RandStream.getDefaultStream;   % get the default random number stream
reset(stream);                          % reset it to its initial state
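For example, here is a minimal sketch of the full workaround. The data (x, t) and the hidden-layer size are hypothetical, feedforwardnet assumes R2010b or a later release, and in current releases rng replaces RandStream.getDefaultStream:
% Reproducible training run (hypothetical example data)
x = rand(1, 100);                       % example inputs
t = sin(2*pi*x);                        % example targets
stream = RandStream.getDefaultStream;   % R2009b-era API; use rng in newer releases
reset(stream);                          % subsequent rand calls now repeat exactly
net = feedforwardnet(10);               % 10 hidden neurons (hypothetical size)
net.divideParam.trainRatio = 1;         % 100% training
net.divideParam.valRatio   = 0;         % 0% validation
net.divideParam.testRatio  = 0;         % 0% testing
[net, tr] = train(net, x, t);
perf = perform(net, t, net(x))
Because the stream is reset immediately before training, both the data division and the Nguyen-Widrow weight initialization draw the same random numbers, so perf comes out identical across runs.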
  3 Comments
Greg Heath on 30 Jun 2012
I wouldn't call this a workaround. It, or an older alternative like rng(seed), is the only thing that makes sense.
Greg
Walter Roberson on 30 Jun 2012
Minor minor point: rng(seed) is the newer alternative, not the older alternative. Which doesn't change Greg's point at all.
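For reference, a minimal sketch of the newer alternative (assumes R2011a or later, where rng is available; the seed value 0 is arbitrary):
rng(0);   % seed the global random number generator before calling train
% ...create and train the network as before; repeated runs now match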

More Answers (0)
