A swish activation layer applies the swish function on the layer inputs.
The swish operation is given by f(x) = x / (1 + e^(-x)).
NumInputs — Number of inputs
Number of inputs of the layer. This layer accepts a single input only.
InputNames — Input names
Input names of the layer. This layer accepts a single input only.
NumOutputs — Number of outputs
Number of outputs of the layer. This layer has a single output only.
OutputNames — Output names
Output names of the layer. This layer has a single output only.
Create a swish layer with the name 'swish1'.
layer = swishLayer('Name','swish1')
layer = 
  SwishLayer with properties:

    Name: 'swish1'
Include a swish layer in a Layer array.
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    swishLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  8x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   Swish                   Swish
     5   ''   Max Pooling             2x2 max pooling with stride [2 2] and padding [0 0 0 0]
     6   ''   Fully Connected         10 fully connected layer
     7   ''   Softmax                 softmax
     8   ''   Classification Output   crossentropyex
A swish activation layer applies the swish function on the layer inputs. The swish operation is given by f(x) = x / (1 + e^(-x)). The swish layer does not change the size of its input.
Activation layers such as swish layers improve the training accuracy for some applications and usually follow convolution and normalization layers. There are other nonlinear activation layers that perform different operations. For a list of activation layers, see Activation Layers.
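As a minimal sketch of the underlying math only (written in Python rather than MATLAB, for illustration; the function name `swish` is ours, not part of any toolbox), the swish operation f(x) = x / (1 + e^(-x)) applied to a scalar input looks like:

```python
import math

def swish(x):
    # Swish: f(x) = x / (1 + exp(-x)), equivalently x * sigmoid(x).
    # Applied elementwise by the layer; output size equals input size.
    return x / (1.0 + math.exp(-x))

# Swish is near-linear for large positive inputs and
# approaches 0 for large negative inputs.
print(swish(0.0))   # -> 0.0
print(swish(1.0))
print(swish(-1.0))
```

Note that, unlike ReLU, swish is smooth and can produce small negative outputs, which is one reason it improves training accuracy for some applications.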