How does k-fold cross validation work in KNN?
My understanding is that KNN uses the classifications of the k data points nearest to a query point to inform the classification of that query point. I was wondering how k-fold cross-validation affects the KNN classifier in the classificationLearner toolbox in MATLAB. My understanding of k-fold CV is that it is used to check that out-of-sample data is predicted well, but cross-validation can also be used to optimize hyperparameters. Is something like this happening with KNN too? If so, which hyperparameters are being optimized when KNN is used in classificationLearner?
Don Mathis on 13 Dec 2018
There is currently no automatic hyperparameter optimization in classificationLearner. It simply trains with the hyperparameters you have chosen and runs cross-validation to estimate the out-of-sample loss.
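To make the distinction concrete, here is a minimal sketch in plain Python (not the MATLAB toolbox code) of what the answer describes: k-fold cross-validation used only to *estimate* the out-of-sample error of a KNN classifier with a fixed, user-chosen number of neighbors. No hyperparameter search takes place; the function names, the 1-D toy data, and the interleaved fold split are all illustrative assumptions.

```python
from collections import Counter

def knn_predict(train, query, n_neighbors):
    """Classify `query` by majority vote among its n_neighbors
    nearest training points (1-D features, absolute distance)."""
    nearest = sorted(train, key=lambda p: abs(p[0] - query))[:n_neighbors]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

def kfold_error(data, k, n_neighbors):
    """Estimate out-of-sample misclassification rate with k folds.

    n_neighbors is fixed throughout: cross-validation here only
    evaluates that one setting, it does not tune it."""
    folds = [data[i::k] for i in range(k)]  # simple interleaved split
    errors = 0
    for i in range(k):
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        errors += sum(knn_predict(train, x, n_neighbors) != y for x, y in test)
    return errors / len(data)

# Toy data: (feature value, class label). The two classes are
# well separated, so the estimated error should be 0.
data = [(x, 'a') for x in range(10)] + [(x, 'b') for x in range(20, 30)]
err = kfold_error(data, k=5, n_neighbors=3)
print(err)  # 0.0
```

If you wanted to use cross-validation for tuning instead, you would wrap a loop over candidate `n_neighbors` values around `kfold_error` and keep the value with the lowest estimated loss; that outer loop is exactly the part classificationLearner does not do for you.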