Greyboxeval - Model quality evaluation

For models residues=model(data,parameters) with data sets at different experimental conditions

If the model residuals cannot be predicted (i.e. they are random) there is little prospect of improving the model. A method of evaluation is therefore to test whether the residuals can be predicted from the experimental conditions. This may indirectly indicate what terms are needed to improve the model.


A constructive method that does not require the residuals r_i to be in the same form over the different experimental conditions c_i is to determine whether the model can be improved by using the operating conditions to adjust parameter values within the model. For the ith data set
r_i = model(data_i, p_i)
and we search for a matrix relation
p_i = A c_i + b_i
where A is to be determined and b_i is typically zero. Nonlinear relations are easily handled by adding transforms, such as polynomial or spline bases, to the c_i vector. Also, c_i typically contains a constant term. It is also possible for c_i to be a matrix.
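For example, with two measured conditions, a condition vector containing a constant term and simple polynomial transforms could be built as follows (the names temp and rate are illustrative only); A then has one row per model parameter and one column per element of c_i:
c = [1; temp; rate; temp^2; temp*rate; rate^2];   % constant term plus polynomial transforms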

To allow efficient linear regression methods to be used, the model is numerically inverted (see references) so that the model parameters are in a linear position about a zero origin (or about b_i); note the linearisation applies only close to the optimum fit to the data.


The function greyboxeval calculates the probability that the improvement found in the root mean squared error could have been obtained using random operating conditions. A model that cannot be improved will thus give a probability of around 0.5 (say 0.1 or larger). A lower probability indicates that the improvement seen is unlikely to have been generated by random data and that the possibility of improving the model exists. The linear regression used to calculate the A values is regularised by removing small eigenvalues.

To use greyboxeval the user must supply two functions that give information about each data set. The first:
[c,ps,b]=condfn(data,i)
takes the user data in any convenient form, typically an array or a struct, together with the number i of the data set, and returns the ith experimental conditions c and, optionally, a scale/initial value ps for the parameters and, more optionally, the corresponding value of b. Note that if the second output ps is not given, an initial estimate of the matrix A must be supplied (see the help for greyboxeval).
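As a rough sketch only (the struct fields temp, rate and p0 below are assumptions about how the data might be stored, not requirements of greyboxeval), a condfn could look like:
function [c,ps,b] = condfn(data,i)
    % Experimental conditions for data set i, with a constant term first
    c = [1; data(i).temp; data(i).rate];
    % Optional scale/initial value for the model parameters
    ps = data(i).p0;
    % Optional offset b_i, typically zero
    b = zeros(size(ps));
end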

The second function that the user must supply is:
[r,dr]=resfn(data,i,p)
takes the data, the number i of the data set, and a parameter value p, and calculates the residuals for the ith data set and, optionally, their derivatives with respect to p. This function will contain the user's model and compares the model outputs with the data to determine the residuals. The residuals can be scaled (e.g. by dividing by a standard deviation estimate) before being returned. If the derivative matrix is not given, numerical derivatives are calculated, except when an element of p is nonzero and lies between -1e-150 and 1e-150.
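As an illustration only, a resfn for an assumed exponential model y = p(1)*exp(-p(2)*t), fitted to measurements data(i).y taken at times data(i).t and scaled by an error estimate data(i).sd (all illustrative field names), might be:
function [r,dr] = resfn(data,i,p)
    t = data(i).t(:);                 % independent variable for data set i
    y = data(i).y(:);                 % measured values for data set i
    ymod = p(1)*exp(-p(2)*t);         % model prediction
    r = (ymod - y)/data(i).sd;        % scaled residuals
    if nargout > 1
        % Analytic derivatives of the residuals with respect to p
        dr = [exp(-p(2)*t), -p(1)*t.*exp(-p(2)*t)]/data(i).sd;
    end
end
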
The function r=greyboxcheck(resfn,condfn,data,ndata,optn) should be used to check the operation of the user functions before proceeding to the greyboxeval function, as once minimisation is applied errors are reduced and mistakes can be hidden. The function greyboxcheck returns the model residuals and operating conditions, which can be used in a regression to see whether the operating conditions can predict the residuals.
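For example, a first check over five data sets might be (function handles are assumed here; see the help for the exact calling form):
r = greyboxcheck(@resfn, @condfn, data, 1:5);
% Inspect r, or regress it on the conditions returned by condfn, to see
% whether the operating conditions predict the residuals.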

The function greyboxeval:
[prob,rms0,rms1,info]=greyboxeval(resfn,condfn,data,vndata,optn)
takes the two user functions, the data, the numbers of the data sets to use (e.g. 1:n), and some options, including optn.Amat for an initial value of the A matrix and optn.Acode for specifying constant values in A, and returns a probability estimate. The output rms0 is the root mean squared error for the model as provided, and rms1 is the root mean squared error for the improved model.
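A typical call, using function handles and the illustrative sizes from the sketches above (two model parameters and three condition terms), might look like the following; optn.Amat should only be needed when condfn does not return the parameter scale ps:
optn = struct();
optn.Amat = zeros(2,3);        % initial A: rows = parameters, columns = condition terms
[prob,rms0,rms1,info] = greyboxeval(@resfn, @condfn, data, 1:5, optn);
if prob < 0.1
    fprintf('Improvement unlikely to be random: rms %g -> %g\n', rms0, rms1)
else
    fprintf('No useful improvement found (prob = %g)\n', prob)
end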

Greyboxeval calculates a probability estimate of the proportion of fits using random predictors that give the same reduction in sum of squares as seen in the fit (via an F-ratio test). Hence a low probability value indicates that a useful improvement to the model is possible, while a value above 0.1 indicates a significant chance that the improvement seen could be produced by random predictors and is thus not a useful reduction.
The probability is an estimate, and for a large number of error terms it becomes very sensitive to small, and often in practice irrelevant, changes in the sum of squares and root mean squared errors. For comparison, the original and fitted root mean squared error values are returned.
In common with most least squares methods the result is sensitive to outliers, and the data should be checked for outliers before this program is used.
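As an illustration of the kind of F-ratio test referred to above (not necessarily the exact computation inside greyboxeval), the chance that a given reduction in the sum of squares could arise from random predictors can be estimated as follows, where fcdf is from the Statistics Toolbox:
% ss0, ss1: sums of squared residuals before and after the improvement
% nextra:   number of additional coefficients fitted (elements of A)
% nres:     remaining degrees of freedom after the improved fit
F    = ((ss0 - ss1)/nextra) / (ss1/nres);   % F-ratio for the extra coefficients
prob = 1 - fcdf(F, nextra, nres);           % probability of this reduction from random predictors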

Whiten, W.J., Determination of parameter relations within non-linear models, SIGNUM Newsletter, 29(3-4), 2-5, 1994.

Xiao, J., Extensions of model building techniques and their applications in mineral processing, PhD thesis, The University of Queensland, 1998.

Cite As

Bill Whiten (2024). Greyboxeval - Model quality evaluation (https://www.mathworks.com/matlabcentral/fileexchange/40250-greyboxeval-model-quality-evaluation), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2012b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Verification, Validation, and Test

Version Published Release Notes

1.2.0.0 (2013-04-06)
Corrected text: percent -> probability
Corrected variance calc in linfitreg - not used by greyboxeval

1.0.0.0