Akaike's Criteria for Model Validation

Definition of FPE

Akaike's Final Prediction Error (FPE) criterion provides a measure of model quality by simulating the situation where the model is tested on a different data set. After computing several different models, you can compare them using this criterion. According to Akaike's theory, the most accurate model has the smallest FPE.

    Note:   If you use the same data set for both model estimation and validation, the fit always improves as you increase the model order and, therefore, the flexibility of the model structure.

Akaike's Final Prediction Error (FPE) is defined by the following equation:

FPE = V \left( \frac{1 + d/N}{1 - d/N} \right)

where V is the loss function, d is the number of estimated parameters, and N is the number of values in the estimation data set.

The toolbox assumes that the final prediction error is asymptotic for d<<N and uses the following approximation to compute FPE:

FPE = V \left( 1 + \frac{2d}{N} \right)

The loss function V is defined by the following equation:

V = \det \left( \frac{1}{N} \sum_{t=1}^{N} \varepsilon(t, \theta_N)\, \varepsilon(t, \theta_N)^{T} \right)

where \theta_N represents the estimated parameters.
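
As a quick numeric illustration of the definition and the approximation above, the following sketch evaluates both expressions for made-up values of V, d, and N (the numbers are placeholders, not results from any real estimation):

V = 0.025;                           % hypothetical loss function value
d = 4;                               % hypothetical number of estimated parameters
N = 500;                             % hypothetical number of estimation samples
FPE_exact  = V*(1 + d/N)/(1 - d/N)   % exact definition
FPE_approx = V*(1 + 2*d/N)           % approximation used when d is much smaller than N

Because d/N is small here, the two values are nearly identical.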

Computing FPE

You can compute Akaike's Final Prediction Error (FPE) criterion for linear and nonlinear models.

    Note:   FPE for nonlinear ARX models that include a tree partition nonlinearity is not supported.

To compute FPE, use the fpe command, as follows:

FPE = fpe(m1,m2,m3,...,mN)

According to Akaike's theory, the most accurate model has the smallest FPE.
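
For example, a minimal sketch of this workflow, assuming a hypothetical iddata object z and arbitrary ARX model orders (both are placeholders):

m1 = arx(z,[2 2 1]);    % second-order ARX model
m2 = arx(z,[4 4 1]);    % fourth-order ARX model
m3 = arx(z,[6 6 1]);    % sixth-order ARX model
FPE = fpe(m1,m2,m3)     % one FPE value per model; the smallest indicates the best model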

You can also access the FPE value of an estimated model, m, as follows (see the sketch after this list):

  • If m is a linear model, type m.Report.Fit.FPE

  • If m is a nonlinear model, type m.EstimationInfo.FPE
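
As a minimal sketch, assuming m1 is the linear ARX model estimated in the example above, the property access should return the same value that fpe reports:

fpeValue = m1.Report.Fit.FPE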

Definition of AIC

Akaike's Information Criterion (AIC) provides a measure of model quality by simulating the situation where the model is tested on a different data set. After computing several different models, you can compare them using this criterion. According to Akaike's theory, the most accurate model has the smallest AIC.

    Note:   If you use the same data set for both model estimation and validation, the fit always improves as you increase the model order and, therefore, the flexibility of the model structure.

Akaike's Information Criterion (AIC) is defined by the following equation:

AIC = \log V + \frac{2d}{N}

where V is the loss function, d is the number of estimated parameters, and N is the number of values in the estimation data set.

The loss function V is defined by the following equation:

V = \det \left( \frac{1}{N} \sum_{t=1}^{N} \varepsilon(t, \theta_N)\, \varepsilon(t, \theta_N)^{T} \right)

where \theta_N represents the estimated parameters.

For d<<N:

AIC \approx \log \left( V \left( 1 + \frac{2d}{N} \right) \right)

    Note:   AIC is approximately equal to log(FPE).
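
To see why, take the logarithm of the d<<N approximation of FPE from the FPE section and use the first-order expansion log(1 + x) ≈ x for small x = 2d/N:

\log(\mathrm{FPE}) = \log V + \log\left( 1 + \frac{2d}{N} \right) \approx \log V + \frac{2d}{N} = \mathrm{AIC}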

Computing AIC

Use the aic command to compute Akaike's Information Criterion (AIC) for one or more linear or nonlinear models, as follows:

AIC = aic(m1,m2,m3,...,mN)

According to Akaike's theory, the most accurate model has the smallest AIC.
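
A minimal sketch, reusing the hypothetical iddata object z and ARX orders from the FPE example; the last line illustrates the log(FPE) relationship noted above:

m1 = arx(z,[2 2 1]);    % second-order ARX model
m2 = arx(z,[4 4 1]);    % fourth-order ARX model
AIC = aic(m1,m2)        % one AIC value per model; the smallest indicates the best model
log(fpe(m1,m2))         % approximately equal to AIC when d<<N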
