How to integrate a trained LSTM neural network into a Simulink model?

Hi, I have trained and tested an LSTM network in MATLAB R2018a, but I'm having trouble finding a way to integrate my trained 'net' with a Simulink model. I tried to create a Simulink block using 'gensim(net)', but it doesn't support LSTM layers. If anyone has found a way around that, I'd appreciate it if you could share it. Thank you.

Accepted Answer

David Willingham on 19 Oct 2021
You can use the Stateful Predict or Stateful Classify blocks to run a trained LSTM network in Simulink (available in recent releases of Deep Learning Toolbox).
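A minimal sketch (not from the thread) of the workflow this answer points to; the file name trainedLSTM.mat and the variables net and X_Test are assumptions chosen here for illustration:
% 1) Save the trained network so the Simulink block can load it.
save('trainedLSTM.mat','net');
% 2) Optional sanity check in MATLAB: classifyAndUpdateState carries the
%    hidden and cell states between calls, which is the behaviour the
%    Stateful Classify block reproduces one simulation step at a time.
netState = net;
numSteps = size(X_Test,2);             % X_Test assumed features-by-time steps
labels   = strings(1,numSteps);
for t = 1:numSteps
    [netState, y] = classifyAndUpdateState(netState, X_Test(:,t));
    labels(t) = string(y);             % predicted class label at step t
end
% 3) In the Simulink model, add a Stateful Predict or Stateful Classify
%    block from the Deep Learning Toolbox library and point its network
%    source at trainedLSTM.mat (exact parameter names may vary by release).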

More Answers (2)

CARLOS VIDAL on 10 Apr 2018
Edited: CARLOS VIDAL on 24 May 2018
The way I found was to write a script (see below) that implements the LSTM equations using the weights and biases from my previously trained network, and then create a function in Simulink that calls the script, with some small adaptations. It works really well!
X=X_Test;
HiddenLayersNum=10;
LSTM_R=net.Layers(2,1).RecurrentWeights;
LSTM_W=net.Layers(2,1).InputWeights;
LSTM_b=net.Layers(2,1).Bias;
FullyConnected_Weights=net.Layers(3,1).Weights;
FullyConnected_Bias=net.Layers(3,1).Bias;
W.Wi=LSTM_W(1:HiddenLayersNum,:);
W.Wf=LSTM_W(HiddenLayersNum+1:2*HiddenLayersNum,:);
W.Wg=LSTM_W(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
W.Wo=LSTM_W(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
R.Ri=LSTM_R(1:HiddenLayersNum,:);
R.Rf=LSTM_R(HiddenLayersNum+1:2*HiddenLayersNum,:);
R.Rg=LSTM_R(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
R.Ro=LSTM_R(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
b.bi=LSTM_b(1:HiddenLayersNum,:);
b.bf=LSTM_b(HiddenLayersNum+1:2*HiddenLayersNum,:);
b.bg=LSTM_b(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
b.bo=LSTM_b(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
%LSTM - Layer
h_prev=zeros(HiddenLayersNum,1);%Hidden (output) state initial values (t-1)
c_prev=zeros(HiddenLayersNum,1);%Cell state initial values (t-1)
for i=1:size(X,2)%Loop over time steps (X is features-by-time steps)
%Input Gate
z=W.Wi*X(:,i)+R.Ri*h_prev+b.bi;
I = 1.0 ./ (1.0 + exp(-z));%Input gate
%Forget Gate
f=W.Wf*X(:,i)+R.Rf*h_prev+b.bf;
F = 1.0 ./ (1.0 + exp(-f));%Forget gate
%Layer Input
g=W.Wg*X(:,i)+R.Rg*h_prev+b.bg;%Layer input
G=tanh(g);
%Output Gate
o=W.Wo*X(:,i)+R.Ro*h_prev+b.bo;
O = 1.0 ./ (1.0 + exp(-o));%Output gate (sigmoid)
%Cell State
c=F.*c_prev+I.*G;%Cell state update
c_prev=c;
% Output (Hidden) State
h=O.*tanh(c);%Output State
h_prev=h;
% Fully Connected Layers
fc=FullyConnected_Weights*h+FullyConnected_Bias;
FC(:,i)=exp(fc)/sum(exp(fc)); %Softmax
end
[M,II] = max(FC);
YYY= categorical(II,[1 2 3 4 5]);%5 output classes
acc = sum(YYY == YY)./numel(YYY) %YY is the reference (ground-truth) label set used to calculate the accuracy of the LSTM on the unseen input data (X_Test).
figure
plot(YYY,'.-')
hold on
plot(YY)
hold off
xlabel("Time Step")
ylabel("Activity")
title("Predicted Activities")
legend(["Predicted" "Test Data"])
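For the Simulink side, here is a minimal sketch (not Carlos's exact adaptation) of a MATLAB Function block that advances the LSTM by one time step per simulation step; the function name lstmStep and the choice to pass the extracted weight matrices in as inputs are illustrative assumptions:
function y = lstmStep(x, LSTM_W, LSTM_R, LSTM_b, FC_W, FC_b) %#codegen
% One LSTM time step for a Simulink MATLAB Function block (sketch).
% x is the feature vector at the current step; LSTM_W, LSTM_R, LSTM_b and
% FC_W, FC_b are the weights extracted from the trained net as above.
nH = size(LSTM_R, 2);              % number of hidden units
persistent h_prev c_prev
if isempty(h_prev)
    h_prev = zeros(nH, 1);
    c_prev = zeros(nH, 1);
end
% Slice the stacked gate weights (input, forget, cell, output).
Wi = LSTM_W(1:nH,:);        Ri = LSTM_R(1:nH,:);        bi = LSTM_b(1:nH);
Wf = LSTM_W(nH+1:2*nH,:);   Rf = LSTM_R(nH+1:2*nH,:);   bf = LSTM_b(nH+1:2*nH);
Wg = LSTM_W(2*nH+1:3*nH,:); Rg = LSTM_R(2*nH+1:3*nH,:); bg = LSTM_b(2*nH+1:3*nH);
Wo = LSTM_W(3*nH+1:4*nH,:); Ro = LSTM_R(3*nH+1:4*nH,:); bo = LSTM_b(3*nH+1:4*nH);
% Gate activations (sigmoid) and candidate cell input (tanh).
I = 1 ./ (1 + exp(-(Wi*x + Ri*h_prev + bi)));   % input gate
F = 1 ./ (1 + exp(-(Wf*x + Rf*h_prev + bf)));   % forget gate
G = tanh(Wg*x + Rg*h_prev + bg);                % candidate cell input
O = 1 ./ (1 + exp(-(Wo*x + Ro*h_prev + bo)));   % output gate
c_prev = F .* c_prev + I .* G;                  % cell state update
h_prev = O .* tanh(c_prev);                     % hidden state update
fc = FC_W * h_prev + FC_b;                      % fully connected layer
y  = exp(fc) / sum(exp(fc));                    % softmax scores
end
The weight matrices could be defined once in the base workspace (for example with the extraction code above) and supplied to the block as inputs or parameters; the persistent variables hold the hidden and cell states between simulation steps.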
  3 Comments
Jiahao CHANG on 21 May 2021
I'm getting the same error; as Carlos said, it's a matrix dimensions issue. As an LSTM beginner, I think X here is a matrix of time steps by features, rather than the testing dataset you used when validating the model.
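A side note on the shape (an assumption based on the indexing X(:,i) in the script above, not something stated in the thread): X appears to be a plain numeric matrix of size numFeatures-by-numTimeSteps, so a single sequence from the usual cell-array test set could be pulled out and checked like this, with X_Test and numFeatures as assumed names:
X = X_Test{1};   % one test sequence from a cell array of sequences (assumed)
assert(size(X,1)==numFeatures,'Expected X to be numFeatures-by-numTimeSteps')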



Mudasar Memon
Mudasar Memon on 22 May 2018
What is YY? It is undefined.
