What is the Normalized MSE algorithm for the NN performance?

Asked by Hugo Mendonça on 14 Jun 2015
Latest activity: answered by Greg Heath on 16 Jun 2015
Hi, everyone!
To evaluate the performance of a neural network, the NN toolbox calculates the MSE (mean squared error). In addition, it is possible to compute a normalized MSE by setting the normalization to 'standard' or 'percent'.
I have looked for the algorithm used to calculate both of them with no success. So, does anyone know how MATLAB normalizes the MSE?
Many thanks in advance!
Hugo

Answer by Greg Heath on 16 Jun 2015

The purpose of a regression or curve-fitting net is, given the input signal variations, to model the corresponding target variations.
The average biased (i.e., divide by N) target variance is
MSE00 = mean(var(t',1))
When adjusted for the bias of using the estimate of the mean from the same data (i.e., divide by N-1), the average unbiased target variance is
MSE00a = mean(var(t',0))
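For readers without the toolbox, here is an equivalent sketch in Python/NumPy (the target matrix and its values are made up for illustration; NumPy's ddof argument plays the role of MATLAB's variance weight, with ddof=0 for divide-by-N and ddof=1 for divide-by-(N-1)):

```python
import numpy as np

# Hypothetical targets: 2 output variables, N = 5 samples.
# Following the MATLAB convention, t is O x N (one column per sample),
# so variances are taken along the sample axis.
t = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
              [2.0, 4.0, 6.0, 8.0, 10.0]])

N = t.shape[1]

# Biased variance (divide by N), averaged over the target variables:
# the analogue of mean(var(t',1))
MSE00 = np.mean(np.var(t, axis=1, ddof=0))

# Unbiased variance (divide by N-1): the analogue of mean(var(t',0))
MSE00a = np.mean(np.var(t, axis=1, ddof=1))

print(MSE00, MSE00a)  # MSE00a = MSE00 * N/(N-1)
```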
It is not difficult to show that MSE00 is the minimum mean-square-error resulting from a naïve constant output model. Of course, the minimum occurs when the constant is just the mean of the target. Consequently, the result is the variance.
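A quick numerical check of that claim, using a made-up target vector: scanning over candidate constants, the MSE of a constant-output model is minimized at the target mean, and the minimum value equals the biased target variance.

```python
import numpy as np

t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # one target variable, N = 5

def mse_const(c, t):
    """MSE of a model that always outputs the constant c."""
    return np.mean((t - c) ** 2)

# Scan candidate constants; the minimizer should be c = mean(t) = 3
cs = np.linspace(0.0, 6.0, 601)
best_c = cs[np.argmin([mse_const(c, t) for c in cs])]

print(best_c)                    # the target mean
print(mse_const(np.mean(t), t))  # the biased target variance
```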
When trying to model target variations, the constant output model is probably the most useful reference. This results in the scale-free entities
NMSE = mse(t-y)/MSE00 % Normalized MSE
and
R2 = 1 - NMSE % Rsquare (AKA R^2 and the coefficient of determination)
Rsquare is interpreted as the fraction of target variance that is modelled by the net.
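Putting the pieces together, a NumPy sketch of the whole computation (the targets t and net outputs y below are invented for illustration):

```python
import numpy as np

# Hypothetical targets and net outputs (1 x N)
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

MSE00 = np.var(t, ddof=0)      # biased target variance: the reference MSE
MSE   = np.mean((t - y) ** 2)  # analogue of mse(t - y)

NMSE = MSE / MSE00             # normalized MSE (scale-free)
R2   = 1.0 - NMSE              # fraction of target variance modelled

print(NMSE, R2)
```

A good fit has NMSE close to 0 and R2 close to 1; a net no better than the constant-mean model has NMSE near 1 and R2 near 0.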