Apply leaky rectified linear unit activation
The leaky rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is multiplied by a fixed scale factor.
This operation is equivalent to

f(x) = x          if x >= 0
f(x) = scale*x    if x < 0

Use the leakyrelu function to scale negative values in the input data.
Create the input data as a single observation of random values with a height and width of 12 and 32 channels.
height = 12;
width = 12;
channels = 32;
observations = 1;

X = randn(height,width,channels,observations);
dlX = dlarray(X,'SSCB');
Compute the leaky ReLU activation using a scale factor of 0.05 for the negative values in the input.
dlY = leakyrelu(dlX,0.05);
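The same elementwise rule can be sketched outside MATLAB. The following NumPy snippet is an illustrative approximation only (it is not part of Deep Learning Toolbox and does not use dlarray); it applies the leaky ReLU rule to an array of the same shape as the example above, multiplying negative entries by 0.05:

```python
import numpy as np

def leaky_relu(x, scale=0.01):
    """Elementwise leaky ReLU: x for x >= 0, scale*x for x < 0."""
    return np.where(x >= 0, x, scale * x)

# Same shape as the MATLAB example: 12x12 spatial, 32 channels, 1 observation.
rng = np.random.default_rng(0)
X = rng.standard_normal((12, 12, 32, 1))

Y = leaky_relu(X, scale=0.05)

# Nonnegative entries pass through unchanged; negative entries are scaled.
assert np.allclose(Y[X >= 0], X[X >= 0])
assert np.allclose(Y[X < 0], 0.05 * X[X < 0])
```

The output has the same shape and data type as the input, mirroring how dlY matches dlX in the MATLAB example.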
dlX — Input data
Input data, specified as a dlarray with or without dimension labels.
scaleFactor — Scale factor for negative inputs
0.01 (default) | numeric scalar
Scale factor for negative inputs, specified as a numeric scalar. The default value is 0.01.
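To make the effect of the scale factor concrete, here is a minimal scalar sketch in plain Python (an illustration of the piecewise rule, not the toolbox function): with the default 0.01, a negative input of -2 maps to -0.02, while a scale factor of 0.05 maps it to -0.1.

```python
def leaky_relu_scalar(x, scale=0.01):
    """Scalar leaky ReLU: negative inputs are multiplied by scale."""
    return x if x >= 0 else scale * x

print(leaky_relu_scalar(-2.0))              # default scale 0.01 -> -0.02
print(leaky_relu_scalar(-2.0, scale=0.05))  # -> -0.1
print(leaky_relu_scalar(3.0))               # nonnegative values unchanged -> 3.0
```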
dlY — Leaky ReLU activations
Leaky ReLU activations, returned as a dlarray. The output dlY has the same underlying data type as the input dlX.
If the input data dlX is a formatted dlarray, dlY has the same dimension labels as dlX. If the input data is not a formatted dlarray, dlY is an unformatted dlarray with the same dimension order as the input data.
Usage notes and limitations:
When the input argument dlX is a gpuArray or a dlarray with underlying data of type gpuArray, this function runs on the GPU.
For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).