
preluLayer

Parametrized Rectified Linear Unit (PReLU) layer

Since R2024a

Description

A PReLU layer performs a threshold operation, where for each channel, any input value less than zero is multiplied by a scalar learned at training time.

This operation is equivalent to:

f(x_i) = \begin{cases} x_i & \text{if } x_i > 0 \\ \alpha_i x_i & \text{if } x_i \le 0 \end{cases}
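As a minimal illustration of this formula in plain MATLAB (outside the layer, using an illustrative scalar multiplier; in the layer itself, Alpha is learned during training):

x = [-2 -0.5 0 1.5];               % sample input values
alpha = 0.25;                      % illustrative scalar multiplier
y = max(x,0) + alpha*min(x,0)      % PReLU applied elementwise
% y = [-0.5 -0.125 0 1.5]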

Creation

Description

layer = preluLayer returns a parametrized ReLU layer.


layer = preluLayer(Name=Value) returns a parametrized ReLU layer and sets the optional Alpha and Name properties using one or more name-value arguments. For example, preluLayer(Alpha=2,Name="prelu1") creates a PReLU layer with Alpha set to 2 and the name "prelu1".
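For instance, creating a layer with default properties and inspecting Alpha shows the default initial multiplier of 0.25, the same value that appears in the example output below:

% Create a PReLU layer with default properties and inspect its initial Alpha.
layer = preluLayer;
layer.Alpha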

Properties


PReLU

Alpha

Learnable multiplier for negative input values, specified as a numeric scalar, vector, or matrix. The size of Alpha must be compatible with the input size of the PReLU layer: if the sizes of Alpha and the layer input are compatible, then the two arrays implicitly expand to match each other. For example, if Alpha is a scalar, then the scalar is combined with each element of the input. Similarly, vectors with different orientations (one row vector and one column vector) implicitly expand to form a matrix.

The network learns the parameter Alpha during training.

Example: 0.4
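The implicit expansion follows the usual MATLAB compatible array size rules. A minimal sketch in plain MATLAB (outside the layer) of how a multiplier vector with a different orientation expands against the input, using the PReLU formula above:

x = [-1 -2 3];                     % 1-by-3 sample input
alpha = [0.1; 0.3];                % 2-by-1 multipliers
y = max(x,0) + alpha.*min(x,0)     % 2-by-3 result via implicit expansion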

Layer

Name

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet and dlnetwork functions automatically assign names to layers with the name "".

The PReLULayer object stores this property as a character vector.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, returned as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, returned as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, returned as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, returned as {'out'}. This layer has a single output only.

Data Types: cell

Examples


Create a PReLU layer with the name "prelu1".

layer = preluLayer(Name="prelu1")
layer = 
  PReLULayer with properties:

     Name: 'prelu1'

   Learnable Parameters
    Alpha: 0.2500

   State Parameters
    No properties.

Use properties method to see a list of all properties.

Include a PReLU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    preluLayer
 
    maxPooling2dLayer(2,Stride=2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    preluLayer
    
    fullyConnectedLayer(10)
    softmaxLayer]
layers = 
  10x1 Layer array with layers:

     1   ''   Image Input           28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution       16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization   Batch normalization
     4   ''   PReLU                 PReLU
     5   ''   2-D Max Pooling       2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   2-D Convolution       32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization   Batch normalization
     8   ''   PReLU                 PReLU
     9   ''   Fully Connected       10 fully connected layer
    10   ''   Softmax               softmax
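To continue from here, a minimal sketch that assembles the layer array into a dlnetwork object; the Alpha parameters of both PReLU layers then appear among the network's learnable parameters:

% Assemble the layers into a dlnetwork object and list its learnable
% parameters, which include the Alpha of each PReLU layer.
net = dlnetwork(layers);
net.Learnables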



Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2024a