k-nearest neighbor classification
A nearest-neighbor classification object, where both the distance metric
("nearest") and the number of neighbors can be altered. The object classifies
new observations using the predict method. The object contains the data
used for training, so it can compute resubstitution predictions.
mdl = fitcknn(Tbl,ResponseVarName) returns a classification model based on
the input variables (also known as predictors, features, or attributes) in the
table Tbl and output (response) Tbl.ResponseVarName.
mdl = fitcknn(Tbl,formula) returns a classification model based on the
predictor data and class labels in the table Tbl. formula is an explanatory
model of the response and a subset of predictor variables in Tbl used for
training.
mdl = fitcknn(Tbl,Y) returns a classification model based on the input
variables (also known as predictors, features, or attributes) in the table
Tbl and output (response) Y.
mdl = fitcknn(X,Y) returns a classification model based on the input
variables X and output (response) Y.
mdl = fitcknn(___,Name,Value) fits a model with additional options specified
by one or more name-value pair arguments, using any of the previous syntaxes.
For example, you can specify the tie-breaking algorithm, distance metric, or
observation weights.
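As a sketch of these syntaxes, the following trains a k-nearest neighbor classifier from numeric predictors and class labels with a few name-value options. It assumes the Statistics and Machine Learning Toolbox and the bundled fisheriris sample data set; the chosen option values are illustrative, not recommendations.

```matlab
% Minimal sketch: fit a k-nearest neighbor classifier with name-value
% options (assumes the fisheriris sample data set is available).
load fisheriris              % meas: 150x4 numeric, species: 150x1 cell
mdl = fitcknn(meas, species, ...
    'NumNeighbors', 5, ...   % use the 5 nearest neighbors
    'Distance', 'euclidean', ...
    'Standardize', true);    % standardize predictors before fitting

% Classify a new observation with the predict method.
label = predict(mdl, [5.9 3.0 5.1 1.8]);
```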

Character vector specifying the method predict uses to break ties when
multiple classes have the same smallest cost.

Categorical predictor
indices, specified as a vector of positive integers.  

List of the elements in the training data response, with duplicates removed.

Square matrix of misclassification costs, where element (i,j) is the cost of
classifying a point into class j when its true class is i.

Character vector or function handle specifying the distance metric. The
allowable character vectors depend on the method used to create the model.
For definitions, see Distance Metrics.

Character vector or function handle specifying the distance weighting
function.

Additional parameter associated with the distance metric. Its meaning
depends on the value of the distance metric.

Expanded predictor names, stored as a cell array of character vectors. If the
model uses encoding for categorical variables, then the expanded predictor
names describe the expanded variables; otherwise, they are the same as the
predictor names.
 

Description of the cross-validation optimization of hyperparameters, stored
as a BayesianOptimization object or a table of hyperparameters and associated
values.
 

Logical value indicating whether predict includes all the neighbors whose
distances are equal to the kth smallest distance.

Parameters used in training  

Numeric vector of predictor means with length equal to the number of
predictors. If you do not standardize the predictors during training, then
this vector is empty.

Positive integer specifying the number of nearest neighbors used to classify
each point during prediction.
 

Number of observations used in training  

Cell array of names for the predictor variables, in the order in which
they appear in the training data  

Numeric vector of prior probabilities for each class. The order of the
elements corresponds to the order of the class names.

Character vector describing the response variable  

Numeric vector of predictor standard deviations with length equal to the
number of predictors. If you do not standardize the predictors during
training, then this vector is empty.

Numeric vector of nonnegative observation weights with the same number of
rows as the training data.
 

Numeric matrix of unstandardized predictor values. Each column represents one
variable (predictor), and each row represents one observation.
 

A numeric vector, vector of categorical variables, logical vector,
character array, or cell array of character vectors specifying the class
labels, with the same number of rows as the predictor data. Each row
represents the classification of the corresponding observation.

compareHoldout  Compare accuracies of two models using new data 
crossval  Cross-validated k-nearest neighbor classifier 
edge  Edge of k-nearest neighbor classifier 
loss  Loss of k-nearest neighbor classifier 
margin  Margin of k-nearest neighbor classifier 
predict  Predict labels using k-nearest neighbor classification model 
resubEdge  Edge of k-nearest neighbor classifier by resubstitution 
resubLoss  Loss of k-nearest neighbor classifier by resubstitution 
resubMargin  Margin of k-nearest neighbor classifier by resubstitution 
resubPredict  Predict resubstitution response of k-nearest neighbor classifier 
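A brief sketch of two of these methods, contrasting resubstitution error (computed on the training data itself) with cross-validated error. It assumes the fisheriris sample data set; the choice of 5 neighbors is illustrative.

```matlab
% Sketch of the resubstitution and cross-validation methods
% (assumes the fisheriris sample data set).
load fisheriris
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

resubErr = resubLoss(mdl);   % misclassification rate on the training data

cvmdl = crossval(mdl);       % 10-fold cross-validated model
cvErr  = kfoldLoss(cvmdl);   % cross-validated misclassification rate
```

Resubstitution error is typically optimistic because the model has seen every observation it is scored on; the cross-validated estimate is the more honest measure of generalization.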
Value. To learn how value classes affect copy operations, see Copying Objects (MATLAB).
The compact function reduces the size of most classification models by
removing the training data properties and any other properties that are not
required to predict the labels of new observations. Because k-nearest
neighbor classification models require all of the training data to predict
labels, you cannot reduce the size of a ClassificationKNN model.
knnsearch finds the k-nearest neighbors of points. rangesearch finds all the
points within a fixed distance. You can use these functions for
classification, as shown in Classify Query Data. If you want to perform
classification, ClassificationKNN can be more convenient, in that you can
construct a classifier in one step and classify in later steps. Also,
ClassificationKNN has cross-validation options.
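The difference can be sketched as follows: knnsearch only returns neighbor indices and distances, so any classification rule (here a simple majority vote) is left to you, whereas fitcknn/predict does both steps for you. The example assumes the fisheriris sample data set.

```matlab
% Sketch: manual classification with knnsearch versus ClassificationKNN.
load fisheriris
query = [5.9 3.0 5.1 1.8];

% knnsearch: indices of the 5 nearest training points and their distances.
[idx, d] = knnsearch(meas, query, 'K', 5);

% Majority vote over the neighbors' labels -- the step that
% ClassificationKNN performs for you inside predict.
manualLabel = mode(categorical(species(idx)));
```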