
# classificationLayer

Create classification output layer

## Syntax

```
coutputlayer = classificationLayer()
coutputlayer = classificationLayer('Name',Name)
```

## Description

`coutputlayer = classificationLayer()` returns a classification output layer for a neural network. The classification output layer holds the name of the loss function that the software uses for training the network for multiclass classification, the size of the output, and the class labels.


`coutputlayer = classificationLayer('Name',Name)` returns a classification output layer with the name specified by `Name`.

## Examples


Create a classification output layer with the name `'coutput'`.

`layer = classificationLayer('Name','coutput')`
```
layer = 

  ClassificationOutputLayer with properties:

            Name: 'coutput'
      ClassNames: {1x0 cell}
      OutputSize: 'auto'

   Hyperparameters
    LossFunction: 'crossentropyex'
```

The default loss function for classification is cross entropy for mutually exclusive classes.

Include a classification output layer in a `Layer` array.

```
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
```
```
layers = 

  7x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   ReLU                    ReLU
     4   ''   Max Pooling             2x2 max pooling with stride [2 2] and padding [0 0 0 0]
     5   ''   Fully Connected         10 fully connected layer
     6   ''   Softmax                 softmax
     7   ''   Classification Output   crossentropyex
```

## Input Arguments


Layer name, specified as a character vector. If `Name` is set to `''`, then the software automatically assigns a name at training time.

Data Types: `char`

## Output Arguments


Classification output layer, returned as a `ClassificationOutputLayer` object.

For information on concatenating layers to construct a convolutional neural network architecture, see `Layer`.

## More About

### Cross Entropy Function for k Mutually Exclusive Classes

For multiclass classification problems, the software assigns each input to one of the k mutually exclusive classes. The loss (error) function for this case is the cross entropy function for a 1-of-k coding scheme:

$$E(\theta) = -\sum_{i=1}^{n}\sum_{j=1}^{k} t_{ij} \ln y_j(x_i,\theta),$$

where $\theta$ is the parameter vector, $t_{ij}$ is the indicator that the $i$th sample belongs to the $j$th class, and $y_j(x_i,\theta)$ is the output for sample $i$. The output $y_j(x_i,\theta)$ can be interpreted as the probability that the network associates the $i$th input with class $j$, that is, $P(t_j = 1 \mid x_i)$.
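The formula above can be checked with a short numeric sketch. The following standalone Python snippet (not part of the MATLAB toolbox; the function name and data are illustrative) computes the cross-entropy loss for a 1-of-k coding scheme: because each row of the indicator matrix has a single 1, only the predicted probability of the true class contributes to the sum.

```python
import math

def cross_entropy(T, Y):
    """Cross-entropy loss for a 1-of-k coding scheme.

    T: n-by-k list of 0/1 indicators (t_ij = 1 if sample i is in class j).
    Y: n-by-k list of predicted class probabilities y_j(x_i, theta).
    """
    return -sum(t * math.log(y)
                for t_row, y_row in zip(T, Y)
                for t, y in zip(t_row, y_row) if t > 0)

# Two samples, three classes: sample 1 is class 1, sample 2 is class 3.
T = [[1, 0, 0],
     [0, 0, 1]]
Y = [[0.7, 0.2, 0.1],
     [0.1, 0.3, 0.6]]
E = cross_entropy(T, Y)  # -(ln 0.7 + ln 0.6)
```

Only the $t_{ij} = 1$ terms survive, so the loss reduces to the negative log-probability assigned to each sample's true class.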

The output unit activation function is the softmax function:

$$y_r(x) = \frac{\exp(a_r(x))}{\sum_{j=1}^{k} \exp(a_j(x))},$$

where $a_r(x)$ is the input to output unit $r$, $0 \le y_r \le 1$, and $\sum_{j=1}^{k} y_j = 1$.
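A minimal Python sketch of the softmax function (illustrative only, not toolbox code) makes these properties concrete; subtracting the maximum activation before exponentiating is a common numerical-stability trick that leaves the result unchanged.

```python
import math

def softmax(a):
    """Softmax: y_r = exp(a_r) / sum_j exp(a_j), shifted for stability."""
    m = max(a)                              # subtracting max(a) avoids overflow
    exps = [math.exp(v - m) for v in a]     # exp(a_r - m)
    s = sum(exps)
    return [e / s for e in exps]            # each y_r in [0, 1], sums to 1

y = softmax([2.0, 1.0, 0.1])
```

The outputs preserve the ordering of the activations, so the largest activation always maps to the largest class probability.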

## References

[1] Bishop, C. M. Pattern Recognition and Machine Learning. Springer, New York, NY, 2006.