Train a softmax layer for classification
Load the sample data.
[X,T] = iris_dataset;
X is a 4x150 matrix of four attributes of iris flowers: sepal length, sepal width, petal length, and petal width.
T is a 3x150 matrix of associated class vectors defining which of the three classes each input is assigned to. Each row corresponds to a dummy variable representing one of the iris species (classes). In each column, a 1 in one of the three rows indicates the class that particular sample (observation or example) belongs to, and the other two rows of that column contain zeros.
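This target layout can be reproduced in a few lines of code. The sketch below (NumPy, with made-up labels rather than the iris data) builds a k-by-n one-hot target matrix of the kind described above:

```python
import numpy as np

# Hypothetical class labels (0, 1, 2) for five samples; the iris data has 150.
labels = np.array([0, 2, 1, 1, 0])
k, n = 3, labels.size

# Build a k-by-n target matrix: a single 1 in the row of each sample's class,
# zeros everywhere else, one column per sample.
T = np.zeros((k, n))
T[labels, np.arange(n)] = 1

print(T)
```

Every column sums to 1, since each sample belongs to exactly one class.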
Train a softmax layer using the sample data.
net = trainSoftmaxLayer(X,T);
nntraintool('close')
Classify the observations into one of the three classes using the trained softmax layer.
Y = net(X);
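Each column of Y holds the softmax outputs for one sample, so the predicted class is the row with the largest value. A NumPy sketch with hypothetical outputs (not the actual iris predictions):

```python
import numpy as np

# Hypothetical softmax outputs for four samples (columns); each column sums to 1.
Y = np.array([[0.8, 0.1, 0.2, 0.05],
              [0.1, 0.7, 0.3, 0.15],
              [0.1, 0.2, 0.5, 0.80]])

# The predicted class of each sample is the row index of its largest output.
predicted = Y.argmax(axis=0)
print(predicted)  # -> [0 1 2 2]
```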
Plot the confusion matrix using the targets and the classifications obtained from the softmax layer.
plotconfusion(T,Y);
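What a confusion matrix tallies can also be computed by hand: entry (i, j) counts the samples of true class i that were predicted as class j, so the diagonal holds the correct classifications. A NumPy sketch with hypothetical class indices:

```python
import numpy as np

# Hypothetical true and predicted class indices for eight samples.
true_class = np.array([0, 0, 1, 1, 1, 2, 2, 2])
pred_class = np.array([0, 1, 1, 1, 2, 2, 2, 2])

# confusion[i, j] counts samples of true class i predicted as class j.
k = 3
confusion = np.zeros((k, k), dtype=int)
for t, p in zip(true_class, pred_class):
    confusion[t, p] += 1

print(confusion)

# Overall accuracy is the fraction of samples on the diagonal.
accuracy = np.trace(confusion) / confusion.sum()
print(accuracy)  # -> 0.75
```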
X — Training data
Training data, specified as an m-by-n matrix, where m is the number of variables in the training data and n is the number of observations (examples). Hence, each column of X represents a sample.
T — Target data
Target data, specified as a k-by-n matrix, where k is the number of classes and n is the number of observations. Each row is a dummy variable representing a particular class. In other words, each column represents a sample, and all entries of a column are zero except for a single 1 in the row corresponding to that sample's class.
Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.
Example: 'MaxEpochs',400,'ShowProgressWindow',false specifies the maximum number of training iterations as 400 and hides the training window.
'MaxEpochs' — Maximum number of training iterations
Maximum number of training iterations, specified as the comma-separated pair consisting of 'MaxEpochs' and a positive integer value.
'LossFunction' — Loss function for the softmax layer
Loss function for the softmax layer, specified as the comma-separated pair consisting of 'LossFunction' and either 'crossentropy' or 'mse'.
mse stands for the mean squared error function, which is given by:

E = \frac{1}{n} \sum_{j=1}^{n} \sum_{i=1}^{k} \left( t_{ij} - y_{ij} \right)^{2},

where n is the number of training examples and k is the number of classes. t_{ij} is the ijth entry of the target matrix T, and y_{ij} is the ith output of the softmax layer when the input vector is x_j.
The cross entropy function is given by:

E = -\frac{1}{n} \sum_{j=1}^{n} \sum_{i=1}^{k} \left[ t_{ij} \ln y_{ij} + \left( 1 - t_{ij} \right) \ln \left( 1 - y_{ij} \right) \right].
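Both loss functions can be evaluated directly from their definitions. The NumPy sketch below uses a hypothetical 3-by-4 target matrix and output matrix (columns are samples, as in T and the layer's output), and assumes a cross entropy form that includes a (1 - t_ij) ln(1 - y_ij) term alongside the usual t_ij ln y_ij term:

```python
import numpy as np

# Hypothetical k-by-n target matrix T (one-hot columns) and softmax outputs Y.
T = np.array([[1, 0, 0, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=float)
Y = np.array([[0.7, 0.2, 0.1, 0.6],
              [0.2, 0.6, 0.2, 0.3],
              [0.1, 0.2, 0.7, 0.1]])
k, n = T.shape

# Mean squared error: E = (1/n) * sum_j sum_i (t_ij - y_ij)^2
mse = ((T - Y) ** 2).sum() / n

# Cross entropy (assumed form, with the (1 - t) ln(1 - y) term included):
# E = -(1/n) * sum_j sum_i [ t_ij ln y_ij + (1 - t_ij) ln(1 - y_ij) ]
crossentropy = -(T * np.log(Y) + (1 - T) * np.log(1 - Y)).sum() / n

print(mse, crossentropy)
```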
'ShowProgressWindow' — Indicator to display the training window
Indicator to display the training window during training, specified as the comma-separated pair consisting of 'ShowProgressWindow' and either true or false.
'TrainingAlgorithm' — Training algorithm
Training algorithm used to train the softmax layer, specified as the comma-separated pair consisting of 'TrainingAlgorithm' and 'trainscg', which stands for scaled conjugate gradient.
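The sketch below is not trainscg: scaled conjugate gradient is a more elaborate optimizer, and reimplementing it is out of scope here. Instead, to illustrate what training the layer means, it fits a softmax layer (a linear map followed by the softmax function) with plain batch gradient descent on the mean cross entropy, using synthetic columnwise data laid out like X and T above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: m=2 input variables, n=90 samples, k=3 classes,
# laid out columnwise (one sample per column), as in the X and T matrices.
m, n, k = 2, 90, 3
labels = np.repeat(np.arange(k), n // k)
X = rng.normal(size=(m, n)) + 3.0 * labels          # crude class separation
T = np.zeros((k, n)); T[labels, np.arange(n)] = 1   # one-hot targets

def softmax(Z):
    Z = Z - Z.max(axis=0)        # subtract column max for numerical stability
    e = np.exp(Z)
    return e / e.sum(axis=0)

# The softmax layer: outputs softmax(W @ x + b) for each input column x.
W = np.zeros((k, m)); b = np.zeros((k, 1))
for epoch in range(400):         # a fixed iteration budget, cf. 'MaxEpochs'
    Y = softmax(W @ X + b)
    G = (Y - T) / n              # gradient of mean cross entropy w.r.t. logits
    W -= 0.5 * (G @ X.T)
    b -= 0.5 * G.sum(axis=1, keepdims=True)

# Training accuracy on the synthetic data.
accuracy = (softmax(W @ X + b).argmax(axis=0) == labels).mean()
print(accuracy)
```

On this well-separated synthetic data the fitted layer classifies nearly all training samples correctly; trainscg would reach a similar fit in far fewer iterations.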