LSTM time series hyperparameter optimization using Bayesian optimization
I am working on a time series regression problem and want to optimize the hyperparameters of an LSTM network using Bayesian optimization. I have 3 input variables and 1 output variable.
I want to optimize the number of hidden layers, the number of hidden units, the mini-batch size, the L2 regularization, and the initial learning rate. My code is given below:
numFeatures = 3;
numHiddenUnits = 120;
numResponses = 1;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];
options = trainingOptions('adam', ...
    'MaxEpochs',100, ...
    'MiniBatchSize',16, ...
    'InitialLearnRate',0.005,'L2Regularization',1e-4);
net = trainNetwork(XTrain,YTrain,layers,options);
YPredicted = predict(net,Xval,'MiniBatchSize',1);
% For regression, use RMSE as the validation error (not classification accuracy)
valError = sqrt(mean((YPredicted - Yval).^2));
Thanks in advance.
Jorge Calvo on 5 Oct 2021
I thought you would like to know that, in R2021b, we included an example for training long short-term memory (LSTM) networks using Bayesian optimization in Experiment Manager.
I hope you find it helpful!
Don Mathis on 10 May 2019
Here's an example using a convolutional network instead of an LSTM network. Your LSTM case should look very similar: https://www.mathworks.com/help/deeplearning/examples/deep-learning-using-bayesian-optimization.html
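Adapted to the LSTM regression setup in the question, the wiring from that example might look something like the sketch below. This is only an outline: the search ranges, epoch count, and the assumption that XTrain, YTrain, Xval, and Yval already exist in the workspace (as sequence data compatible with trainNetwork) are mine, not from the original post.

optimVars = [
    optimizableVariable('numHiddenUnits',[50 300],'Type','integer')
    optimizableVariable('miniBatchSize',[8 128],'Type','integer')
    optimizableVariable('initialLearnRate',[1e-4 1e-1],'Transform','log')
    optimizableVariable('l2Regularization',[1e-6 1e-2],'Transform','log')];

% bayesopt minimizes the objective, so return the validation RMSE
objFcn = @(vars) lstmObjective(vars,XTrain,YTrain,Xval,Yval);
results = bayesopt(objFcn,optimVars, ...
    'MaxObjectiveEvaluations',30, ...
    'IsObjectiveDeterministic',false);

function valError = lstmObjective(vars,XTrain,YTrain,Xval,Yval)
    % Build and train one network for this hyperparameter combination
    layers = [ ...
        sequenceInputLayer(3)
        lstmLayer(vars.numHiddenUnits,'OutputMode','sequence')
        fullyConnectedLayer(1)
        regressionLayer];
    options = trainingOptions('adam', ...
        'MaxEpochs',60, ...
        'MiniBatchSize',vars.miniBatchSize, ...
        'InitialLearnRate',vars.initialLearnRate, ...
        'L2Regularization',vars.l2Regularization, ...
        'Verbose',false);
    net = trainNetwork(XTrain,YTrain,layers,options);
    YPredicted = predict(net,Xval,'MiniBatchSize',1);
    valError = sqrt(mean((YPredicted - Yval).^2));  % validation RMSE
end

The best combination found is then available in results.XAtMinObjective.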
Jorge Calvo on 27 May 2021
If you have R2020b or later, you can use the Experiment Manager app to run Bayesian optimization to determine the best combination of hyperparameters. For more information, see https://www.mathworks.com/help/deeplearning/ug/experiment-using-bayesian-optimization.html.