Class: ClassificationKNN

Cross-validated k-nearest neighbor classifier


Syntax

cvmodel = crossval(mdl)
cvmodel = crossval(mdl,Name,Value)


Description

cvmodel = crossval(mdl) creates a partitioned model from mdl, a fitted KNN classification model. By default, crossval uses 10-fold cross-validation on the training data to create cvmodel.

cvmodel = crossval(mdl,Name,Value) creates a partitioned model with additional options specified by one or more Name,Value pair arguments.
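For instance, a minimal sketch (using the fisheriris data from the example below) of requesting five folds instead of the default ten:

```matlab
% Load a sample data set and fit a KNN classifier.
load fisheriris
mdl = fitcknn(meas,species);

% Create a 5-fold cross-validated model instead of the default 10 folds.
cvmodel = crossval(mdl,'KFold',5);
```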

Input Arguments


mdl

k-nearest neighbor classifier model, specified as a ClassificationKNN object, typically created with fitcknn.

Note that using the 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition' options in fitcknn results in a model of class ClassificationPartitionedModel. You cannot use a partitioned model for prediction, so this kind of model does not have a predict method.

Otherwise, mdl is of class ClassificationKNN, and you can use the predict method to make predictions.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.


'CVPartition'

Object of class cvpartition, created by the cvpartition function. crossval splits the data into subsets with cvpartition.

Use only one of these four options at a time: 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.


'Holdout'

Holdout validation tests the specified fraction of the data and uses the remaining data for training. Specify a numeric scalar from 0 to 1. Use only one of these four options at a time: 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.


'KFold'

Number of folds to use in a cross-validated model, a positive integer greater than 1.

Use only one of these four options at a time: 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.

Default: 10


'Leaveout'

Set to 'on' for leave-one-out cross-validation.

Use only one of these four options at a time: 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.
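As a sketch of the four mutually exclusive partitioning options, again assuming the fisheriris data used in the example below:

```matlab
load fisheriris
mdl = fitcknn(meas,species);

% Hold out 30% of the data for validation; train on the remaining 70%.
cv1 = crossval(mdl,'Holdout',0.3);

% Leave-one-out cross-validation: one fold per observation.
cv2 = crossval(mdl,'Leaveout','on');

% Supply a custom partition built with the cvpartition function.
c = cvpartition(species,'KFold',7);
cv3 = crossval(mdl,'CVPartition',c);
```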

Output Arguments


cvmodel

Partitioned model of class ClassificationPartitionedModel.


Examples

Construct a cross-validated k-nearest neighbor model, and assess classification performance using the model.

Load the data.

load fisheriris
X = meas;
Y = species;

Construct a k-nearest neighbor classifier.

mdl = fitcknn(X,Y);

Construct a cross-validated classifier.

cvmdl = crossval(mdl)
cvmdl = 

    CrossValidatedModel: 'KNN'
         PredictorNames: {'x1'  'x2'  'x3'  'x4'}
           ResponseName: 'Y'
        NumObservations: 150
                  KFold: 10
              Partition: [1x1 cvpartition]
             ClassNames: {'setosa'  'versicolor'  'virginica'}
         ScoreTransform: 'none'

Find the cross-validated loss of the classifier.

cvmdlloss = kfoldLoss(cvmdl)
cvmdlloss =


The cross-validated loss is less than 5%. You can expect mdl to have a similar error rate.
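Beyond the aggregate loss, per-observation cross-validated predictions are available through kfoldPredict; a minimal sketch continuing the example above:

```matlab
% Labels predicted for each observation by the fold that did not train on it.
label = kfoldPredict(cvmdl);

% Count cross-validated misclassifications against the true classes.
misclassified = sum(~strcmp(label,Y))
```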


Tips

  • Assess the predictive performance of mdl on cross-validated data using the kfold methods and properties of cvmodel, such as kfoldLoss.


Alternatives

You can create a cross-validated model directly from the data, instead of creating a model followed by a cross-validated model. To do so, include one of these options in fitcknn: 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.
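A sketch of this one-step form, assuming the same fisheriris data:

```matlab
load fisheriris

% Fit and 10-fold cross-validate in a single call. The result is a
% ClassificationPartitionedModel, not a ClassificationKNN object.
cvmdl = fitcknn(meas,species,'CrossVal','on');
L = kfoldLoss(cvmdl);
```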
