Asked by Sree Srinivasan
on 27 Feb 2013

While training a simple network using MATLAB's trainbr (maximum parameters 22, effective parameters 6), I noticed that the weights and biases, 22 in all, have finite values after initialization AND after convergence. I'd have expected only the 6 "effective" weights and biases to converge, with the rest being zero or NaN.

The trainbr source code shows how the effective number of parameters (gamk) is calculated, but offers no clues as to why the full suite of parameters is still populated (22 in my case) even after the code declares convergence. If only some of the parameters are ultimately effective (6 in my case), why aren't the rest of the parameters zero or undefined?
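For context, the effective number of parameters in Bayesian regularization (MacKay's evidence framework, which trainbr follows) is conventionally gamma = N - 2*alpha*trace(inv(H)), with H the regularized Gauss-Newton Hessian. The sketch below (Python/NumPy, with a toy Jacobian and made-up hyperparameters, not trainbr's actual code) illustrates why every parameter keeps a finite value even when gamma is small: gamma is a soft count of how many parameter directions are well-determined by the data, not a list of which weights survive.

```python
import numpy as np

# Hedged sketch of the "effective number of parameters" (gamk) in
# Bayesian regularization: gamma = N - 2*alpha*trace(inv(H)), with
# H = 2*beta*J'J + 2*alpha*I the Gauss-Newton Hessian approximation.
# J and the hyperparameters are made up purely for illustration.

rng = np.random.default_rng(0)
N = 22                          # total weights and biases
J = rng.normal(size=(50, N))    # toy Jacobian of errors w.r.t. parameters

alpha, beta = 0.5, 1.0          # regularization / data-fit hyperparameters
H = 2 * beta * (J.T @ J) + 2 * alpha * np.eye(N)

gamk = N - 2 * alpha * np.trace(np.linalg.inv(H))
print(gamk)                     # a real number strictly between 0 and N

# Equivalent eigenvalue form: gamma = sum(lam / (lam + alpha)), where
# lam are the eigenvalues of beta*J'J.  Each term lies in [0, 1), so
# gamma is a *fractional* count of well-determined directions; no
# individual weight is ever set to zero or NaN by this bookkeeping.
lam = np.linalg.eigvalsh(beta * (J.T @ J))
gamk_eig = np.sum(lam / (lam + alpha))
```

In this view, the remaining 22 - gamk parameters are not discarded; they are merely dominated by the prior (the alpha term) rather than by the data, so they keep whatever small finite values the regularizer pulls them toward.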

Thanks in advance for any insights


Answer by Greg Heath
on 23 Feb 2014

Accepted answer

If you use 22 parameters and trainbr says the number of effective parameters is six

1. I don't think that means you can discard the other 16 without further training.

2. It may not even mean that if you started with six, you could obtain an acceptable design.

This is just conjecture. However, if you can find an example, please post it.

Hope this helps.

**Thank you for formally accepting my answer**

Greg

## 2 Comments

## Sree Srinivasan


Nobody has answered this so far, but here is a tentative explanation: at convergence, the BRANN weights and biases are not unique, i.e., some weights are equal to others. So the gamk effective parameters consist of the unique set of weights and biases. In the first hidden layer, this would mean BRANN classifies the input descriptors into gamk clusters and applies the same weight to all the descriptors in each cluster.

## Alexander Nazarov


Hello! At some point during training, gamk begins to jitter strongly, and sometimes takes negative values. Is that normal?