predict
Predict labels using discriminant analysis classification model
Description
[label,score,cost] = predict(Mdl,X) also returns:

A matrix of classification scores (score) indicating the likelihood that a label comes from a particular class. For discriminant analysis, scores are posterior probabilities.

A matrix of expected classification cost (cost). For each observation in X, the predicted class label corresponds to the minimum expected classification cost among all classes.
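For example, a minimal sketch of the three-output call (using the fisheriris data and fitcdiscr defaults purely for illustration):

load fisheriris
Mdl = fitcdiscr(meas,species);                    % linear discriminant by default
[label,score,cost] = predict(Mdl,meas(1:5,:));    % labels, posteriors, and expected costs for five observations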
Input Arguments
Mdl — Discriminant analysis classification model
ClassificationDiscriminant model object | CompactClassificationDiscriminant model object

Discriminant analysis classification model, specified as a ClassificationDiscriminant or CompactClassificationDiscriminant model object returned by fitcdiscr.
X — Predictor data to be classified
numeric matrix | table

Predictor data to be classified, specified as a numeric matrix or table.

Each row of X corresponds to one observation, and each column corresponds to one variable. All predictor variables in X must be numeric vectors.

For a numeric matrix, the variables that compose the columns of X must have the same order as the predictor variables that trained Mdl.

For a table:

predict does not support multicolumn variables and cell arrays other than cell arrays of character vectors.

If you trained Mdl using a table (for example, Tbl), then all predictor variables in X must have the same variable names and data types as those that trained Mdl (stored in Mdl.PredictorNames). However, the column order of X does not need to correspond to the column order of Tbl. Tbl and X can contain additional variables (response variables, observation weights, and so on), but predict ignores them.

If you trained Mdl using a numeric matrix, then the predictor names in Mdl.PredictorNames and the corresponding predictor variable names in X must be the same. To specify predictor names during training, see the PredictorNames name-value pair argument of fitcdiscr. X can contain additional variables (response variables, observation weights, and so on), but predict ignores them.

Data Types: table | double | single
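As a sketch of the table workflow (the variable names below are illustrative and assume a model trained on a table with four numeric predictors):

% Predictor names in the new table must match the names stored in Mdl.PredictorNames
Xnew  = array2table(rand(3,4),'VariableNames',Mdl.PredictorNames);
label = predict(Mdl,Xnew);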
Output Arguments
label — Predicted class labels
categorical array | character array | logical vector | vector of numeric values | cell array of character vectors

Predicted class labels, returned as a categorical or character array, a logical or numeric vector, or a cell array of character vectors.

For each observation in X, the predicted class label corresponds to the minimum expected classification cost among all classes. For an observation with NaN scores, the function classifies the observation into the majority class, which makes up the largest proportion of the training labels.

label:

Is the same data type as the observed class labels (Y) that trained Mdl. (The software treats string arrays as cell arrays of character vectors.)

Has length equal to the number of rows of X.
score — Predicted class posterior probabilities
numeric matrix

Predicted class posterior probabilities, returned as a numeric matrix of size N-by-K. N is the number of observations (rows) in X, and K is the number of classes (in Mdl.ClassNames). score(i,j) is the posterior probability that observation i in X is of class j in Mdl.ClassNames.
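Because discriminant analysis scores are posterior probabilities, each row of score sums to 1. For example, a quick sanity check:

[~,score] = predict(Mdl,X);   % N-by-K posterior probabilities
rowSums = sum(score,2);       % every entry equals 1, up to round-off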
cost — Expected classification costs
numeric matrix

Expected classification costs, returned as a matrix of size N-by-K. N is the number of observations (rows) in X, and K is the number of classes (in Mdl.ClassNames). cost(i,j) is the cost of classifying row i of X as class j in Mdl.ClassNames.
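The expected cost combines the posterior probabilities with the misclassification cost matrix. Assuming the usual convention that Mdl.Cost(k,j) is the cost of classifying an observation into class j when its true class is k, the relationship can be sketched as:

[~,score,cost] = predict(Mdl,X);
% cost(i,j) = sum over k of score(i,k)*Mdl.Cost(k,j)
expectedCost = score*Mdl.Cost;   % matches cost up to round-off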
Examples
Predict Class Labels Using Discriminant Analysis Model
Load Fisher's iris data set. Determine the sample size.
load fisheriris
N = size(meas,1);
Partition the data into training and test sets. Hold out 10% of the data for testing.
rng(1); % For reproducibility
cvp = cvpartition(N,'Holdout',0.1);
idxTrn = training(cvp);   % Training set indices
idxTest = test(cvp);      % Test set indices
Store the training data in a table.
tblTrn = array2table(meas(idxTrn,:));
tblTrn.Y = species(idxTrn);
Train a discriminant analysis model using the training set and default options.
Mdl = fitcdiscr(tblTrn,'Y');
Predict labels for the test set. You trained Mdl using a table of data, but you can predict labels using a matrix.
labels = predict(Mdl,meas(idxTest,:));
Construct a confusion matrix for the test set.
confusionchart(species(idxTest),labels)
Mdl misclassifies one versicolor iris as virginica in the test set.
Plot Class Posterior Probability Regions
Load Fisher's iris data set. Consider training using the petal lengths and widths only.
load fisheriris
X = meas(:,3:4);
Train a quadratic discriminant analysis model using the entire data set.
Mdl = fitcdiscr(X,species,'DiscrimType','quadratic');
Define a grid of values in the observed predictor space. Predict the posterior probabilities for each instance in the grid.
xMax = max(X);
xMin = min(X);
d = 0.01;
[x1Grid,x2Grid] = meshgrid(xMin(1):d:xMax(1),xMin(2):d:xMax(2));
[~,score] = predict(Mdl,[x1Grid(:),x2Grid(:)]);
Mdl.ClassNames
ans = 3x1 cell
{'setosa' }
{'versicolor'}
{'virginica' }
score is a matrix of class posterior probabilities. The columns correspond to the classes in Mdl.ClassNames. For example, score(j,1) is the posterior probability that observation j is a setosa iris.
Plot the posterior probability of versicolor classification for each observation in the grid and plot the training data.
figure;
contourf(x1Grid,x2Grid,reshape(score(:,2),size(x1Grid,1),size(x1Grid,2)));
h = colorbar;
caxis([0 1]);
colormap jet;
hold on
gscatter(X(:,1),X(:,2),species,'mcy','.x+');
axis tight
title('Posterior Probability of versicolor');
hold off
The posterior probability region exposes a portion of the decision boundary.
More About
Posterior Probability
The posterior probability that a point x belongs to class k is the product of the prior probability and the multivariate normal density. The density function of the multivariate normal with 1-by-d mean μ_k and d-by-d covariance Σ_k at a 1-by-d point x is

P(x|k) = \frac{1}{\left( (2\pi)^{d} |\Sigma_k| \right)^{1/2}} \exp\!\left( -\frac{1}{2} (x-\mu_k) \Sigma_k^{-1} (x-\mu_k)^{T} \right),

where |\Sigma_k| is the determinant of Σ_k, and \Sigma_k^{-1} is the inverse matrix.

Let P(k) represent the prior probability of class k. Then the posterior probability that an observation x is of class k is

\hat{P}(k|x) = \frac{P(x|k)\,P(k)}{P(x)},

where P(x) is a normalization constant, the sum over k of P(x|k)P(k).
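As an illustrative sketch only: for a quadratic discriminant model (where Mdl.Sigma is d-by-d-by-K; a linear model stores a single pooled d-by-d covariance instead), you can reproduce the posterior probabilities from these formulas with mvnpdf:

% x is a 1-by-d observation; compute unnormalized posteriors P(x|k)*P(k)
K = numel(Mdl.ClassNames);
p = zeros(1,K);
for k = 1:K
    p(k) = mvnpdf(x,Mdl.Mu(k,:),Mdl.Sigma(:,:,k))*Mdl.Prior(k);
end
posterior = p/sum(p);   % divide by P(x); matches the second output of predict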
Prior Probability
The prior probability is one of three choices:

'uniform' — The prior probability of class k is one over the total number of classes.

'empirical' — The prior probability of class k is the number of training samples of class k divided by the total number of training samples.

Custom — The prior probability of class k is the kth element of the prior vector. See fitcdiscr.
After creating a classification model (Mdl), you can set the prior using dot notation:
Mdl.Prior = v;
where v is a vector of positive elements representing the frequency with which each element occurs. You do not need to retrain the classifier when you set a new prior.
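For example, to weight the classes of a three-class model equally (the values shown are purely illustrative):

Mdl.Prior = [1 1 1];   % relative class frequencies; they do not need to sum to 1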
Cost
The matrix of expected costs per observation is defined in Cost.
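Concretely, assuming the convention used above (C(j|k) is the cost of classifying an observation into class j when its true class is k, stored in Mdl.Cost(k,j)), the expected cost of classifying observation x into class j is

\mathrm{cost}(x,j) = \sum_{k=1}^{K} \hat{P}(k|x)\, C(j|k).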
Predicted Class Label
predict classifies so as to minimize the expected classification cost:

\hat{y} = \underset{y=1,\dots,K}{\arg\min} \sum_{k=1}^{K} \hat{P}(k|x)\, C(y|k),

where:

\hat{y} is the predicted classification.

K is the number of classes.

\hat{P}(k|x) is the posterior probability of class k for observation x.

C(y|k) is the cost of classifying an observation as y when its true class is k.
Extended Capabilities
Tall Arrays
Calculate with arrays that have more rows than fit in memory.
This function fully supports tall arrays. You can use models trained on either in-memory or tall data with this function.
For more information, see Tall Arrays.
C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.
Usage notes and limitations:
Use saveLearnerForCoder, loadLearnerForCoder, and codegen (MATLAB Coder) to generate code for the predict function. Save a trained model by using saveLearnerForCoder. Define an entry-point function that loads the saved model by using loadLearnerForCoder and calls the predict function. Then use codegen to generate code for the entry-point function (see the sketch after this section).

To generate single-precision C/C++ code for predict, specify the name-value argument "DataType","single" when you call the loadLearnerForCoder function.

This table contains notes about the arguments of predict. Arguments not included in this table are fully supported.

Argument — Notes and Limitations

Mdl — For the usage notes and limitations of the model object, see Code Generation of the CompactClassificationDiscriminant object.

X — X must be a single-precision or double-precision matrix or a table containing numeric variables. The number of rows, or observations, in X can be a variable size, but the number of columns in X must be fixed. If you want to specify X as a table, then your model must be trained using a table, and your entry-point function for prediction must do the following:

Accept data as arrays.

Create a table from the data input arguments and specify the variable names in the table.

Pass the table to predict.

For an example of this table workflow, see Generate Code to Classify Data in Table. For more information on using tables in code generation, see Code Generation for Tables (MATLAB Coder) and Table Limitations for Code Generation (MATLAB Coder).
For more information, see Introduction to Code Generation.
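For instance, a minimal entry-point sketch under these rules (the file and model names are illustrative only):

function label = predictIris(X) %#codegen
% Entry-point function: load the saved model and classify X
Mdl = loadLearnerForCoder('irisDiscrModel');   % model saved earlier with saveLearnerForCoder
label = predict(Mdl,X);
end

Save the trained model and generate code, for example:

saveLearnerForCoder(Mdl,'irisDiscrModel');
codegen predictIris -args {coder.typeof(0,[Inf 4],[1 0])}   % variable rows, four fixed columns (assumed)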
Version History
Introduced in R2011b