Cross-validation loss of partitioned regression model
L = kfoldLoss(cvmodel)
L = kfoldLoss(cvmodel,Name,Value)
L = kfoldLoss(cvmodel) returns the cross-validation loss of cvmodel.

L = kfoldLoss(cvmodel,Name,Value) returns the cross-validation loss with additional options specified by one or more Name,Value pair arguments. You can specify several name-value pair arguments in any order as Name1,Value1,…,NameN,ValueN.
cvmodel — Object of class RegressionPartitionedModel. Create cvmodel with fitrtree along with one of the cross-validation options: 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'. Alternatively, create cvmodel from a regression tree with crossval.
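The two creation routes can be sketched as follows (the carsmall data set and the variable names X and Y are illustrative):

```matlab
% Route 1: cross-validate at fitting time
load carsmall
X = [Cylinders Displacement Horsepower Weight];
Y = MPG;
cvmodel1 = fitrtree(X,Y,'CrossVal','on');   % 10-fold by default

% Route 2: fit a tree first, then cross-validate it
tree = fitrtree(X,Y);
cvmodel2 = crossval(tree);                  % also 10-fold by default

% Either partitioned model can be passed to kfoldLoss
L = kfoldLoss(cvmodel1)
```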
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
'folds' — Indices of folds ranging from 1 to cvmodel.KFold. Use only these folds for predictions.
'lossfun' — Function handle for a loss function, or the string 'mse', meaning mean squared error. If you pass a function handle fun, kfoldLoss calls it as

fun(Y,Yfit,W)

where Y, Yfit, and W are numeric vectors of the same length. Y contains the observed responses, Yfit the predicted responses, and W the observation weights. The returned value fun(Y,Yfit,W) should be a scalar.
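As a sketch of a custom loss, a weighted mean absolute error can be written as an anonymous function matching the fun(Y,Yfit,W) signature above:

```matlab
load carsmall
X = [Cylinders Displacement Horsepower Weight];
Y = MPG;
cvmodel = fitrtree(X,Y,'CrossVal','on');

% Custom loss: weighted mean absolute error, returning a scalar
mae = @(Y,Yfit,W) sum(W.*abs(Y-Yfit))/sum(W);
Lmae = kfoldLoss(cvmodel,'lossfun',mae)
```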
'mode' — One of the following strings:
'average' — L is the scalar average loss over all folds (default).
'individual' — L is a vector of the individual losses, one per fold.
L — The loss (mean squared error) between the observations in a fold and predictions made with a tree trained on the out-of-fold data. If mode is 'individual', L is a vector of per-fold losses. If mode is 'average', L is the average loss over all folds.
Construct a partitioned regression model, and examine the cross-validation losses for the folds:
load carsmall
XX = [Cylinders Displacement Horsepower Weight];
YY = MPG;
cvmodel = fitrtree(XX,YY,'crossval','on');
L = kfoldLoss(cvmodel,'mode','individual')

L =

   44.9635
   11.8525
   18.2046
    9.2965
   29.4329
   54.8659
   24.6446
    8.2085
   19.7593
   16.7394
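Continuing the example, 'mode','average' collapses the per-fold losses into a single number; averaging the per-fold vector by hand gives a comparable value (the two agree up to how observations are weighted across folds):

```matlab
load carsmall
XX = [Cylinders Displacement Horsepower Weight];
YY = MPG;
cvmodel = fitrtree(XX,YY,'crossval','on');

Lavg = kfoldLoss(cvmodel,'mode','average')    % scalar average loss
Lind = kfoldLoss(cvmodel,'mode','individual');
mean(Lind)                                    % comparable to Lavg
```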