Regularization techniques are used to prevent statistical overfitting in a predictive model. By introducing additional information into the model, regularization algorithms make the model more parsimonious and improve its ability to generalize to new data. These algorithms typically work by applying a penalty for complexity, such as adding the model's coefficients to the quantity being minimized or including a roughness penalty.
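As a sketch of what such a complexity penalty looks like, the standard penalized least-squares objectives for ridge regression and the lasso (general formulations, not tied to any particular toolbox) are:

\[
\hat{\beta}_{\text{ridge}} = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2
\]

\[
\hat{\beta}_{\text{lasso}} = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_1
\]

Here \(\lambda \ge 0\) controls the strength of the penalty: larger values shrink the coefficients \(\beta\) toward zero, and the \(\ell_1\) penalty of the lasso can drive some coefficients exactly to zero. The elastic net combines both penalty terms.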
You can use MATLAB and Statistics Toolbox to apply regularization techniques. Statistics Toolbox includes functions for ridge regression (also known as Tikhonov regularization), lasso, and elastic net algorithms, as well as plotting options to display trace plots and cross-validated mean squared error. You can also apply the Akaike information criterion (AIC) as a goodness-of-fit metric.
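A minimal sketch of how these functions might be used, assuming synthetic example data (the data and penalty values below are illustrative, not from the original):

```matlab
% Hypothetical example data: 100 observations, 10 predictors,
% with only the first two predictors actually informative
rng(0);
X = randn(100, 10);
y = X(:,1) + 0.5*X(:,2) + 0.1*randn(100, 1);

% Ridge regression (Tikhonov regularization) over a range of penalty values
k = 0:0.1:1;                       % ridge penalty parameters
b_ridge = ridge(y, X, k);          % one column of coefficients per value of k

% Lasso with 10-fold cross-validation to select the penalty
[B, FitInfo] = lasso(X, y, 'CV', 10);

% Elastic net: 'Alpha' between 0 (ridge-like) and 1 (pure lasso)
[B_en, FitInfo_en] = lasso(X, y, 'Alpha', 0.5, 'CV', 10);

% Trace plot of coefficients and plot of cross-validated MSE
lassoPlot(B, FitInfo, 'PlotType', 'Lambda');
lassoPlot(B, FitInfo, 'PlotType', 'CV');
```

`FitInfo.IndexMinMSE` identifies the column of `B` whose penalty achieved the lowest cross-validated mean squared error, which is one common way to pick the final model.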
See also: Machine Learning