
Naive Bayes

Naive Bayes model with Gaussian, multinomial, or kernel predictors

Naive Bayes models assume that observations have some multivariate distribution given class membership, but that the predictors or features composing an observation are independent. This framework can accommodate a complete feature set such that an observation is a set of multinomial counts.

To train a naive Bayes model, use fitcnb at the command line. After training, predict labels or estimate posterior probabilities by passing the model and predictor data to predict.
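As a minimal sketch of this workflow, the following trains a naive Bayes classifier on the fisheriris sample data set that ships with the toolbox and then predicts labels and posterior probabilities (the variable names are illustrative):

```matlab
% Train a Gaussian naive Bayes classifier on Fisher's iris data.
load fisheriris                         % meas: 150-by-4 predictors, species: labels
Mdl = fitcnb(meas, species);            % Gaussian predictor distributions by default

% Predict labels and class posterior probabilities for a few observations.
[labels, posterior] = predict(Mdl, meas(1:5,:));
```

Here labels is a cell array of predicted class names and posterior contains one column of class posterior probabilities per class.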

Functions
fitcnb Train multiclass naive Bayes model
predict Predict labels using naive Bayes classification model
templateNaiveBayes Naive Bayes classifier template
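A templateNaiveBayes object is typically passed to a meta-learner rather than used directly; for example, as a sketch, it can supply the binary learners for fitcecoc (the kernel distribution choice here is illustrative):

```matlab
% Create a naive Bayes template with kernel-smoothed predictor densities
% and use it as the binary learner in an error-correcting output codes model.
load fisheriris
t = templateNaiveBayes('DistributionNames','kernel');
Mdl = fitcecoc(meas, species, 'Learners', t);
```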

Classes
ClassificationNaiveBayes Naive Bayes classification
CompactClassificationNaiveBayes Compact naive Bayes classifier
ClassificationPartitionedModel Cross-validated classification model
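For instance, passing a trained naive Bayes model to crossval returns a ClassificationPartitionedModel, whose cross-validated loss estimates the generalization error. A minimal sketch, again using the fisheriris sample data:

```matlab
% Cross-validate a naive Bayes classifier and estimate its
% out-of-fold misclassification rate.
load fisheriris
Mdl = fitcnb(meas, species);
CVMdl = crossval(Mdl);                  % 10-fold cross-validation by default
genError = kfoldLoss(CVMdl);            % average misclassification rate
```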

Examples and How To

Steps in Supervised Learning

While there are many Statistics and Machine Learning Toolbox™ algorithms for supervised learning, most use the same basic workflow for obtaining a predictive model.


Characteristics of Classification Algorithms

Classification algorithms vary in speed, memory usage, interpretability, and flexibility.

Naive Bayes Classification

The naive Bayes classifier is designed for use when predictors are independent of one another within each class, but it appears to work well in practice even when that independence assumption is not valid.

Supported Distributions

Learn how the naive Bayes classification model supports normal (Gaussian), kernel, multinomial, and multivariate multinomial predictor conditional distributions.
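These distributions can also be mixed per predictor. As a sketch, the following assigns a kernel density to the first predictor of the fisheriris data and normal distributions to the rest (the particular assignment is illustrative):

```matlab
% Specify a different conditional distribution for each predictor:
% kernel density for the first, Gaussian for the remaining three.
load fisheriris
dist = {'kernel','normal','normal','normal'};
Mdl = fitcnb(meas, species, 'DistributionNames', dist);
```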
