|Regression Learner|Train regression models to predict data using supervised machine learning|
|fitrgp|Fit a Gaussian process regression (GPR) model|
|predict|Predict response of Gaussian process regression model|
|loss|Regression error for Gaussian process regression model|
|compact|Create compact Gaussian process regression model|
|crossval|Cross-validate Gaussian process regression model|
|lime|Local interpretable model-agnostic explanations (LIME)|
|partialDependence|Compute partial dependence|
|plotPartialDependence|Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots|
|postFitStatistics|Compute post-fit statistics for the exact Gaussian process regression model|
|resubLoss|Resubstitution loss for a trained Gaussian process regression model|
|resubPredict|Resubstitution prediction from a trained Gaussian process regression model|
Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.
In Gaussian processes, the covariance function expresses the expectation that points with similar predictor values will have similar response values.
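This behavior can be seen directly by evaluating a covariance function. The following NumPy sketch uses the squared exponential kernel, one common GPR covariance function; it is an illustration of the idea, not toolbox code, and the length scale and variable names are assumed for the example:

```python
import numpy as np

def rbf_kernel(x1, x2, ell=1.0, sf2=1.0):
    """Squared exponential kernel; ell is the length scale, sf2 the signal variance."""
    return sf2 * np.exp(-0.5 * (x1 - x2) ** 2 / ell ** 2)

# Nearby predictor values are strongly correlated under the prior...
near = rbf_kernel(0.0, 0.1)   # close to sf2 = 1
# ...while distant predictor values are nearly uncorrelated.
far = rbf_kernel(0.0, 5.0)    # close to 0
```

Because responses at strongly correlated inputs are constrained to be similar, the kernel is what encodes the smoothness assumption of the model.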
Learn about parameter estimation and prediction in the exact GPR method.
With large data sets, the subset of data approximation method can greatly reduce the time required to train a Gaussian process regression model.
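The idea can be sketched in a few lines: choose a random subset of m observations and apply the exact GPR equations to that subset only, reducing the dominant cost from O(n^3) to O(m^3). This NumPy illustration is not the toolbox's implementation; the data, sizes, and kernel are assumed:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(1)
n, m = 2000, 100                      # full data size vs. subset size
X = rng.uniform(0, 5, n)
y = np.sin(X) + 0.05 * rng.standard_normal(n)

idx = rng.choice(n, size=m, replace=False)    # random subset of the data
Xa, ya = X[idx], y[idx]

# Exact GPR equations applied to the subset only: O(m^3) instead of O(n^3).
K = rbf(Xa, Xa) + 0.05 ** 2 * np.eye(m)
alpha = np.linalg.solve(K, ya)
mean = rbf(np.linspace(0.5, 4.5, 5), Xa) @ alpha
```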
The subset of regressors (SR) approximation method replaces the exact kernel function with a low-rank approximation.
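In the SR method, an active set A of m points induces the approximate kernel khat(x, x') = K(x, A) K(A, A)^-1 K(A, x'). A hedged NumPy sketch of that kernel (the active-set choice, jitter term, and names are assumptions for illustration, not toolbox code):

```python
import numpy as np

def rbf(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(2)
X = rng.uniform(0, 5, 200)
A = X[rng.choice(200, size=20, replace=False)]   # active set of 20 points

Kaa = rbf(A, A) + 1e-8 * np.eye(20)   # small jitter for numerical stability
Kxa = rbf(X, A)
# SR kernel matrix: its rank is at most the active-set size.
K_sr = Kxa @ np.linalg.solve(Kaa, Kxa.T)
```

Because khat(x, x) <= k(x, x), the SR kernel underestimates the prior variance away from the active set, which is the predictive variance problem that the FIC approximation addresses.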
The fully independent conditional (FIC) approximation systematically approximates the true GPR kernel function in a way that avoids the predictive variance problem of the SR approximation while still maintaining a valid Gaussian process.
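One way to view FIC is as the SR kernel with its diagonal corrected back to the exact one, K_FIC = K_SR + diag(K - K_SR), so every point keeps its full prior variance. A NumPy sketch under illustrative assumptions (random data, RBF kernel, an assumed active set; not the toolbox implementation):

```python
import numpy as np

def rbf(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, 100)
A = X[:10]                            # assumed active set

Kxa = rbf(X, A)
Q = Kxa @ np.linalg.solve(rbf(A, A) + 1e-8 * np.eye(10), Kxa.T)  # SR kernel
# FIC keeps the low-rank off-diagonal structure but restores the exact
# diagonal, removing the SR underestimate of prior variance.
K_fic = Q + np.diag(np.diag(rbf(X, X) - Q))
```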
The block coordinate descent (BCD) approximation is another method for reducing computation time when training on large data sets.
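The heart of such a method is solving the linear system (K + sigma^2 I) alpha = y by updating one block of coordinates at a time, so no full n-by-n factorization is needed. Below is a minimal block Gauss-Seidel sketch in NumPy; the sizes, kernel, and noise variance are assumed, and the toolbox's actual BCD method involves more machinery than this:

```python
import numpy as np

rng = np.random.default_rng(3)
n, b = 120, 30                        # system size and block size (assumed)
X = np.sort(rng.uniform(0, 5, n))
A = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2) + 1.0 * np.eye(n)  # K + sigma^2 I
y = np.sin(X)

alpha = np.zeros(n)
for _ in range(300):                  # sweeps over all blocks
    for start in range(0, n, b):
        i = slice(start, start + b)
        # right-hand side for this block with the other coordinates held fixed
        r = y[i] - A[i, :] @ alpha + A[i, i] @ alpha[i]
        alpha[i] = np.linalg.solve(A[i, i], r)  # exact update of one block
```

Each inner solve touches only a b-by-b block, so memory and per-step cost stay small even when n is large.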