Nonlinear regression is a statistical technique that helps describe nonlinear relationships in experimental data. Nonlinear regression models are generally parametric, meaning the model is described by a nonlinear equation. Machine learning methods are typically used for non-parametric nonlinear regression.
Parametric nonlinear regression models the dependent variable (also called the response) as a function of a combination of nonlinear parameters and one or more independent variables (called predictors). The model can be univariate (single response variable) or multivariate (multiple response variables).
The parameters can take the form of an exponential, trigonometric, power, or any other nonlinear function. To determine the nonlinear parameter estimates, an iterative algorithm is typically used.
\(Y = f(X, \beta) + \epsilon\)

where \(\beta\) represents the nonlinear parameters to be estimated and \(\epsilon\) represents the error terms.
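As a minimal sketch of this model form in Python (assuming NumPy; the mean function, parameter values, and noise level below are illustrative, not from the source), an exponential mean function \(f(x, \beta) = \beta_0 e^{\beta_1 x}\) with additive error terms can be simulated as:

```python
import numpy as np

# Hypothetical exponential mean function f(X, beta) = beta0 * exp(beta1 * x)
def f(x, beta0, beta1):
    return beta0 * np.exp(beta1 * x)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)                # predictor (independent variable)
beta_true = (2.0, -1.3)                      # "unknown" nonlinear parameters
epsilon = rng.normal(0.0, 0.05, x.size)      # error terms
y = f(x, *beta_true) + epsilon               # observed response Y = f(X, beta) + epsilon
```

The same pattern applies to trigonometric, power, or other nonlinear mean functions; only the definition of `f` changes.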
Popular algorithms for fitting a nonlinear regression model include:
- Gauss-Newton algorithm
- Gradient descent algorithm
- Levenberg-Marquardt algorithm
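A sketch of the last of these, assuming Python with SciPy (the data and starting values are illustrative): `scipy.optimize.curve_fit` with `method='lm'` uses the Levenberg-Marquardt algorithm to iteratively refine the parameter estimates from an initial guess.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical exponential model; any nonlinear mean function works the same way.
def f(x, beta0, beta1):
    return beta0 * np.exp(beta1 * x)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0, 50)
y = f(x, 2.0, -1.3) + rng.normal(0.0, 0.05, x.size)

# method='lm' selects Levenberg-Marquardt; p0 is the initial parameter guess
# that the iterative algorithm refines.
beta_hat, cov = curve_fit(f, x, y, p0=(1.0, -1.0), method='lm')
```

Because these solvers iterate from `p0`, a poor starting guess can lead to slow convergence or a local rather than global minimum of the residual sum of squares.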
Statistics and Machine Learning Toolbox™ provides these and other algorithms for parametric nonlinear regression, as well as functions for stepwise, robust, univariate, and multivariate regression. It can be used to:
- Fit a nonlinear model to data and compare different models
- Generate predictions
- Evaluate parameter confidence intervals
- Evaluate goodness-of-fit
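The last two tasks can be sketched outside the Toolbox as well. Assuming Python with SciPy and NumPy (the data are illustrative, and the intervals use a normal approximation rather than the t-based intervals a statistics package would report), approximate confidence intervals come from the estimated parameter covariance, and \(R^2\) gives a simple goodness-of-fit measure:

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, beta0, beta1):
    return beta0 * np.exp(beta1 * x)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 2.0, 50)
y = f(x, 2.0, -1.3) + rng.normal(0.0, 0.05, x.size)

beta_hat, cov = curve_fit(f, x, y, p0=(1.0, -1.0))

# Approximate standard errors and ~95% confidence intervals
# (normal approximation; illustrative only).
se = np.sqrt(np.diag(cov))
ci = np.column_stack([beta_hat - 1.96 * se, beta_hat + 1.96 * se])

# Goodness-of-fit: coefficient of determination R^2 from the residuals.
residuals = y - f(x, *beta_hat)
ss_res = np.sum(residuals**2)
ss_tot = np.sum((y - y.mean())**2)
r_squared = 1.0 - ss_res / ss_tot
```

Generating predictions at new predictor values is then just `f(x_new, *beta_hat)`, and competing models can be compared on held-out data by the same residual-based measures.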