To train a k-nearest neighbors model interactively, use the Classification Learner app. For greater flexibility, train a k-nearest neighbors model at the command line using fitcknn. After training, predict labels or estimate posterior probabilities by passing the model and predictor data to predict.
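For example, a minimal sketch of this workflow using fitcknn and predict with the fisheriris sample data set that ships with the toolbox; the NumNeighbors value of 5 is an arbitrary choice for illustration:

```matlab
% Load Fisher's iris data: meas holds predictors, species holds class labels.
load fisheriris

% Train a k-nearest neighbors classifier (NumNeighbors = 5 for illustration).
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

% Predict labels and estimate posterior probabilities for some observations.
[labels, posterior] = predict(mdl, meas(1:3,:));
```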
|Classification Learner|Train models to classify data using supervised machine learning|
|ClassificationPartitionedModel|Cross-validated k-nearest neighbor classifier|
|kfoldEdge|Classification edge for observations not used for training|
|kfoldLoss|Classification loss for observations not used for training|
|crossval|Cross-validate machine learning model|
|kfoldMargin|Classification margins for observations not used for training|
|kfoldPredict|Predict responses for observations not used for training|
|loss|Loss of k-nearest neighbor classifier|
|resubLoss|Loss of k-nearest neighbor classifier by resubstitution|
|testcholdout|Compare accuracies of two classification models using new data|
|edge|Edge of k-nearest neighbor classifier|
|margin|Margin of k-nearest neighbor classifier|
|partialDependence|Compute partial dependence|
|plotPartialDependence|Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots|
|resubEdge|Edge of k-nearest neighbor classifier by resubstitution|
|resubMargin|Margin of k-nearest neighbor classifier by resubstitution|
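The cross-validation functions above combine as follows; a minimal sketch, again assuming the fisheriris sample data set, with an arbitrary 10-fold partition:

```matlab
load fisheriris
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

% Build a 10-fold cross-validated model from the trained classifier,
% then estimate the generalization error on the held-out folds.
cvmdl = crossval(mdl, 'KFold', 10);
err = kfoldLoss(cvmdl);
```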
Create and compare nearest neighbor classifiers, and export trained models to make predictions for new data.
This example shows how to visualize the decision surface for different classification algorithms.
Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
Categorize data points based on their distance to points in a training data set, using a variety of distance metrics.
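The distance metric is selected through the Distance name-value argument of fitcknn; a sketch swapping the default Euclidean metric for city block distance (metric and NumNeighbors chosen for illustration):

```matlab
load fisheriris
mdl = fitcknn(meas, species, 'NumNeighbors', 3, 'Distance', 'cityblock');

% Resubstitution loss gives a quick (optimistic) check of training accuracy.
err = resubLoss(mdl);
```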
Speaker Identification Using Pitch and MFCC (Audio Toolbox)