Naive Bayes models assume that observations have some multivariate distribution given class membership, but that the predictors or features composing the observation are conditionally independent. This framework can accommodate a complete feature set such that an observation is a set of multinomial counts.
To train a naive Bayes model, use fitcnb at the command line. After training, predict labels or estimate posterior probabilities by passing the model and predictor data to predict.
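A minimal sketch of this workflow, using the fisheriris sample data set that ships with Statistics and Machine Learning Toolbox:

```matlab
% Load Fisher's iris data: 150 observations, 4 predictors, 3 classes
load fisheriris

% Train a naive Bayes classifier; by default each predictor is
% modeled with a normal distribution within each class
Mdl = fitcnb(meas,species);

% Predict the label and posterior probabilities for one observation
[label,Posterior] = predict(Mdl,meas(1,:));
```

For multinomial-count data, you can instead specify `fitcnb(X,Y,'DistributionNames','mn')`, which treats each observation as a set of multinomial counts.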
| Function | Description |
|---|---|
| Classification Learner | Train models to classify data using supervised machine learning |
| `crossval` | Cross-validate naive Bayes classifier |
| `kfoldEdge` | Classification edge for observations not used for training |
| `kfoldLoss` | Classification loss for observations not used for training |
| `kfoldfun` | Cross-validate function |
| `kfoldMargin` | Classification margins for observations not used for training |
| `kfoldPredict` | Predict response for observations not used for training |
| `loss` | Classification loss for naive Bayes classifier |
| `resubLoss` | Resubstitution classification loss for naive Bayes classifier |
| `logp` | Log unconditional probability density for naive Bayes classifier |
| `compareHoldout` | Compare accuracies of two classification models using new data |
| `edge` | Classification edge for naive Bayes classifier |
| `margin` | Classification margins for naive Bayes classifier |
| `partialDependence` | Compute partial dependence |
| `plotPartialDependence` | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| `resubEdge` | Resubstitution classification edge for naive Bayes classifier |
| `resubMargin` | Resubstitution classification margins for naive Bayes classifier |
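Several of the kfold* functions above operate on a cross-validated model returned by crossval rather than on the classifier itself. A short sketch, again using the fisheriris sample data:

```matlab
load fisheriris
Mdl = fitcnb(meas,species);

% Cross-validate the classifier (10-fold by default)
CVMdl = crossval(Mdl);

% Evaluate performance on the held-out folds
L = kfoldLoss(CVMdl);        % misclassification rate
m = kfoldMargin(CVMdl);      % per-observation classification margins
label = kfoldPredict(CVMdl); % predicted labels for held-out observations
```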
Create and compare naive Bayes classifiers, and export trained models to make predictions for new data.
Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
Categorical response data
The naive Bayes classifier is designed for use when predictors are independent of one another within each class, but in practice it often performs well even when that independence assumption does not hold.
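One way to see how the assumption plays out in practice is to train on predictors that are known to be correlated and check the cross-validated loss. In the fisheriris data, petal length and petal width are strongly correlated, so the sketch below deliberately violates conditional independence:

```matlab
load fisheriris

% Columns 3 and 4 of meas are petal length and petal width,
% which are strongly correlated within each species
Mdl = fitcnb(meas(:,3:4),species);

resubL = resubLoss(Mdl);          % resubstitution misclassification rate
cvL = kfoldLoss(crossval(Mdl));   % cross-validated misclassification rate
```

If the cross-validated loss remains low despite the correlated predictors, the independence assumption is not hurting predictive performance much on this data.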
This example shows how to visualize classification probabilities for the naive Bayes classification algorithm.
This example shows how to perform classification using discriminant analysis, naive Bayes classifiers, and decision trees.
This example shows how to visualize the decision surface for different classification algorithms.