Apply rectified linear unit activation
The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.
This operation is equivalent to

f(x) = x if x > 0, and f(x) = 0 otherwise.

Use the relu function to set negative values in the input data to zero.
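The threshold operation above can be sketched outside MATLAB as well. The following is a minimal NumPy analogue shown purely for illustration; it is not part of the MATLAB API:

```python
import numpy as np

def relu(x):
    # Elementwise threshold: values below zero become zero,
    # matching f(x) = x for x > 0 and f(x) = 0 otherwise.
    return np.maximum(x, 0)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```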
Create the input data as a single observation of random values with a height and width of 12 and 32 channels.
height = 12;
width = 12;
channels = 32;
observations = 1;

X = randn(height,width,channels,observations);
dlX = dlarray(X,'SSCB');
Compute the ReLU activation.
dlY = relu(dlX);
All negative values in dlX are now set to zero.
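For comparison, the same example can be mirrored in NumPy. This is a sketch only; `dlarray` dimension labels such as 'SSCB' have no NumPy equivalent, so the array is plain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single observation: height 12, width 12, 32 channels.
X = rng.standard_normal((12, 12, 32, 1))

# ReLU: negative entries set to zero.
Y = np.maximum(X, 0)

# Negative inputs are zeroed; positive inputs pass through unchanged.
assert (Y >= 0).all()
assert (Y[X > 0] == X[X > 0]).all()
```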
dlX — Input data
Input data, specified as a dlarray with or without dimension labels.
dlY — ReLU activations
ReLU activations, returned as a dlarray. The output dlY has the same underlying data type as the input dlX.
If the input data dlX is a formatted dlarray, dlY has the same dimension labels as dlX. If the input data is not a formatted dlarray, dlY is an unformatted dlarray with the same dimension order as the input data.
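The type-preservation behavior described above can be checked in the NumPy sketch as well. This is illustrative only; MATLAB's underlying-type rules apply to dlarray and gpuArray data:

```python
import numpy as np

# single-precision input, as a stand-in for a dlarray with
# single underlying data
X32 = np.random.randn(4, 4).astype(np.float32)
Y32 = np.maximum(X32, 0)  # ReLU

# The output keeps the input's underlying data type and shape.
assert Y32.dtype == np.float32
assert Y32.shape == X32.shape
```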
Usage notes and limitations:
When the input argument dlX is a dlarray with underlying data of type gpuArray, this function runs on the GPU.
For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).