Is a Bi-GRU available - bidirectional Gated Recurrent Unit (GRU) - or a way to implement a Bi-GRU?

Several recurrent neural network (RNN) layer types are available (for example, LSTM, bi-LSTM, and GRU layers). I would like to know whether a Bi-GRU exists or can be defined.
Thank you for your help.

Accepted Answer

Amanjit Dulai on 21 Oct 2021
Edited: Amanjit Dulai on 22 Nov 2022
A bi-LSTM layer works by applying two LSTM layers to the data: one in the forward direction and one in the reverse direction. You can apply an LSTM in the reverse direction by flipping the data along the time dimension. The outputs of these two LSTM layers are then concatenated to form the output of the bi-LSTM layer. So if we want to implement a bi-GRU layer, we can do it with GRU layers together with a custom flip layer. A custom flip layer can be implemented as follows:
classdef FlipLayer < nnet.layer.Layer
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            Y = flip(X,3);
        end
    end
end
We can implement a bi-GRU layer with OutputMode="sequence" by pairing a forward GRU with a flip-GRU-flip branch and concatenating the results. For OutputMode="last" the arrangement is the same, except the reverse branch does not need a second flip. Both arrangements are sketched below:
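A schematic of the two arrangements (reconstructed from the example code below):
    OutputMode="sequence":
        input --> gruLayer (sequence) -----------------------------> concatenationLayer (dim 1)
        input --> FlipLayer --> gruLayer (sequence) --> FlipLayer --> concatenationLayer (dim 1)
    OutputMode="last":
        x --> gruLayer (last) --------------------> concatenationLayer (dim 1)
        x --> FlipLayer --> gruLayer (last) -------> concatenationLayer (dim 1)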
Below is a short example showing how to use the custom flip layer above to implement a network with two bi-GRU layers (one with OutputMode="sequence" and one with OutputMode="last"):
[XTrain, YTrain] = japaneseVowelsTrainData;

% Main branch: forward GRUs, concatenation with the reverse branches, and the classifier
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, "OutputMode", "last", "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9)
    softmaxLayer
    classificationLayer()] );

% Reverse branch of the first bi-GRU (OutputMode="sequence"): flip, GRU, flip back
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru2")
    FlipLayer("flip2")] );

% Reverse branch of the second bi-GRU (OutputMode="last"): flip, then GRU (no flip back needed)
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, "OutputMode", "last", "Name", "gru4")] );

% Connect the reverse branches into the main branch
lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");

options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);

[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)
Amanjit Dulai on 21 Jun 2022
The third dimension is the time dimension. For a bi-GRU, we want one GRU layer to operate on the sequence in the forward time direction, and then we want one GRU layer to operate on the sequence in the reverse time direction. We can get it to operate on the sequence in the reverse time direction by flipping in the third dimension.
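For example, with the feature-by-observation-by-time layout used here (the sizes below are just illustrative):
X = rand(12, 8, 20);                 % 12 features, 8 sequences, 20 time steps
Xrev = flip(X, 3);                   % reverse the time dimension
isequal(Xrev(:,:,1), X(:,:,end))     % returns true: the last time step is now first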


More Answers (5)

Ronny Guendel on 22 Oct 2021
The full working code for me (with the FlipLayer class defined as above):
[XTrain, YTrain] = japaneseVowelsTrainData;
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'classoutput')] );
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru2")
    FlipLayer("flip2")] );
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru4")] );
lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");
options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);
[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)

义 on 22 Nov 2022
Why can't I find FlipLayer in MATLAB R2022a?
Amanjit Dulai on 22 Nov 2022
FlipLayer is a custom layer you need to implement yourself; save the following code in a file named FlipLayer.m on the MATLAB path:
classdef FlipLayer < nnet.layer.Layer
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            Y = flip(X,3);
        end
    end
end



义 on 25 Nov 2022
If my data is 1-by-n power system load data, how should I set this parameter?

song on 28 Jun 2023
If my data is 1-by-n bearing system load data, how should I set this parameter?

Artem Lensky on 17 Aug 2023
Edited: Artem Lensky on 17 Aug 2023
Below is a multi-block BiGRU implementation with layer normalization and dropout layers. The model takes the following parameters: numFeatures, numBlocks, numHiddenUnits, numResponse, and dropoutFactor. The function that constructs the BiGRU is implemented as follows:
function lgraph = constructBiGRU(numFeatures, numBlocks, numHiddenUnits, numResponse, dropoutFactor)
    arguments
        numFeatures = 1
        numBlocks = 1
        numHiddenUnits = 32
        numResponse = 3
        dropoutFactor = 0
    end
    layer = sequenceInputLayer(numFeatures, 'Name', 'sequenceInputLayer');
    lgraph = layerGraph(layer);
    outputName = lgraph.Layers(end).Name;
    for i = 1:numBlocks
        % Forward branch of block i, followed by dropout and layer normalization
        layers = [gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_1_" + i)
            concatenationLayer(1, 2, Name="cat_" + i)
            dropoutLayer(dropoutFactor, Name="dropout" + i)
            layerNormalizationLayer(Name="layernorm_" + i)];
        lgraph = addLayers(lgraph, layers);
        % Reverse branch of block i: flip, GRU, flip back
        layers = [FlipLayer("flip_1_" + i)
            gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_2_" + i)
            FlipLayer("flip_2_" + i)];
        lgraph = addLayers(lgraph, layers);
        lgraph = connectLayers(lgraph, outputName, "gru_1_" + i);
        lgraph = connectLayers(lgraph, outputName, "flip_1_" + i);
        lgraph = connectLayers(lgraph, "flip_2_" + i, "cat_" + i + "/in2");
        outputName = "layernorm_" + i;
    end
    % Remove the last block's layer normalization layer; its dropout layer
    % then feeds the fully connected classification head directly.
    lgraph = removeLayers(lgraph, "layernorm_" + i);
    layers = [fullyConnectedLayer(numResponse, 'Name', 'fc')
        softmaxLayer
        classificationLayer];
    lgraph = addLayers(lgraph, layers);
    lgraph = connectLayers(lgraph, "dropout" + i, "fc");
end
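For example, the function above could be called like this (a small usage sketch; the sizes are just illustrative):
lgraph = constructBiGRU(12, 2, 64, 9, 0.2);   % 2 BiGRU blocks, 64 hidden units, 9 classes
analyzeNetwork(lgraph)                        % inspect the resulting layer graph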
