Multidimensional input to SequenceInputLayer
Hello dear community,
I am trying to train a neural network for a regression task.
I have several different cases of data, and the model should be able to differentiate between the different data sets.
For the training I therefore assumed data with 2 channels (2 input neurons), 3 batches (3 different operating points to be trained) and 50 timesteps.
This gives me a multidimensional input matrix.
The target matrix would have one output, 3 batches and also 50 timesteps.
However, I keep getting the error:
Error using trainNetwork (line 191)
The training sequences are of feature dimension 2 50 but the input layer expects sequences of feature dimension 2.
Error in delme2 (line 46)
net = trainNetwork(permute(X, [1 3 2]), permute(Y, [1 3 2]), layers, options);
I have searched the internet, but I can't really find a solution to this.
Thank you in advance!
Here is the code:
% Manually defined dimensions
B = 3; % Number of operating points (samples)
T = 50; % Number of time steps
C = 2; % Number of input features
O = 1; % Number of output features
% Example data for different operating points (OPs)
X = zeros(C, B, T);
Y = zeros(O, B, T);
% Operating point 1
X(1, 1, :) = 2000;
X(2, 1, :) = 70;
Y(1, 1, :) = rand(1, 1, T) * 100 + 50;
% Operating point 2
X(1, 2, :) = 4000;
X(2, 2, :) = 50;
Y(1, 2, :) = rand(1, 1, T) * 100 + 40;
% Operating point 3
X(1, 3, :) = 3000;
X(2, 3, :) = 60;
Y(1, 3, :) = rand(1, 1, T) * 100 + 45;
% Network parameters
numHiddenUnits = 50; % Number of LSTM cells
% Define the network architecture
layers = [ ...
sequenceInputLayer(C) % C input features
lstmLayer(numHiddenUnits, 'OutputMode', 'sequence') % LSTM layer
fullyConnectedLayer(O) % O output features
regressionLayer]; % Regression layer for continuous outputs
% Set training options
options = trainingOptions('adam', ...
'MaxEpochs', 100, ...
'MiniBatchSize', 1, ...
'GradientThreshold', 1, ...
'InitialLearnRate', 0.01, ...
'Verbose', false, ...
'Plots', 'training-progress');
% Train the network
% Permute to bring X and Y into the expected shapes C x T x B and O x T x B
net = trainNetwork(permute(X, [1 3 2]), permute(Y, [1 3 2]), layers, options);
% Example inference for a new operating point
XTest = zeros(C, 1, T); % Test data in the form C x 1 x T
XTest(1, 1, :) = 3500;
XTest(2, 1, :) = 65;
% Predict the curve
YPred = predict(net, permute(XTest, [1 3 2])); % Permute to bring XTest into the expected shape C x T x 1
% Display the predicted temperature curve
plot(squeeze(YPred)); % Squeeze to obtain the 2D curve for plotting
xlabel('Time Steps');
ylabel('Predicted Temperature');
title('Predicted Temperature Curve for New Operating Point');
Accepted Answer
Sahas
on 6 Sep 2024
As per my understanding, you are training a neural network for a regression task and encountering an error when using the “trainNetwork” function.
I was able to reproduce the error at my end and identified the issue. There was a mismatch between the expected input dimensions of the network and the actual dimensions of the data being fed into it.
For multiple observations, the “trainNetwork” function expects sequence data as an N-by-1 cell array in which each cell is a numFeatures-by-numTimeSteps matrix, so the permuted 3-D numeric arrays were not interpreted as intended. I made the following changes in the code to ensure smooth data processing and resolve the dimension mismatch:
- The X, XTest and Y matrices are arranged in the format (features, timesteps, samples), which is the layout expected by the LSTM network.
- The “permute” calls are removed from the “trainNetwork” arguments, as they are no longer necessary.
- The data is reshaped into a cell array to fit the input format expected by the “trainNetwork” function.
% Manually defined dimensions
B = 3; % Number of operating points (samples)
T = 50; % Number of time steps
C = 2; % Number of input features
O = 1; % Number of output features
%%%%%Changes here to match the expected input format%%%%%
% Example data for different operating points (OPs)
X = zeros(C, T, B);
Y = zeros(O, T, B);
% Operating point 1
X(1, :, 1) = 2000;
X(2, :, 1) = 70;
Y(1, :, 1) = rand(1, T) * 100 + 50;
% Operating point 2
X(1, :, 2) = 4000;
X(2, :, 2) = 50;
Y(1, :, 2) = rand(1, T) * 100 + 40;
% Operating point 3
X(1, :, 3) = 3000;
X(2, :, 3) = 60;
Y(1, :, 3) = rand(1, T) * 100 + 45;
% Network parameters
numHiddenUnits = 50; % Number of LSTM cells
% Define network architecture
layers = [ ...
sequenceInputLayer(C) % C input features
lstmLayer(numHiddenUnits, 'OutputMode', 'sequence') % LSTM layer
fullyConnectedLayer(O) % O output features
regressionLayer]; % Regression layer for continuous outputs
% Set training options
options = trainingOptions('adam', ...
'MaxEpochs', 100, ...
'MiniBatchSize', 1, ...
'GradientThreshold', 1, ...
'InitialLearnRate', 0.01, ...
'Verbose', false, ...
'Plots', 'training-progress');
% Reshape data to fit the expected input format for trainNetwork
% This builds a B-by-1 cell array, where each cell holds one C x T sequence
XCell = squeeze(mat2cell(X, C, T, ones(1, B)));
YCell = squeeze(mat2cell(Y, O, T, ones(1, B)));
% Train network
net = trainNetwork(XCell, YCell, layers, options);
%%%%%Changes here to match the expected test input format%%%%%
% Example inference for a new operating point
XTest = zeros(C, T, 1); % Test data in the form C x T x 1
XTest(1, :, 1) = 3500;
XTest(2, :, 1) = 65;
% Predict curve
YPred = predict(net, squeeze(XTest));
% Display predicted temperature curve
plot(squeeze(YPred)); % Squeeze to obtain the 2D curve for plotting
xlabel('Time Steps');
ylabel('Predicted Temperature');
title('Predicted Temperature Curve for New Operating Point');
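As an optional sanity check (a minimal sketch that assumes the variables XCell and YCell from the code above are still in the workspace), you can confirm that the data now has the layout “trainNetwork” expects for sequence-to-sequence regression, i.e. a B-by-1 cell array with one C-by-T (or O-by-T) matrix per observation:
% Optional check of the cell-array layout
disp(size(XCell))     % expected: [3 1], i.e. B observations
disp(size(XCell{1}))  % expected: [2 50], i.e. C x T
disp(size(YCell{1}))  % expected: [1 50], i.e. O x T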
Refer to the following MathWorks documentation on “trainNetwork” for more information: https://www.mathworks.com/help/deeplearning/ref/trainnetwork.html
Hope this is beneficial!