I've built a simple neural network in MATLAB. It classifies MNIST handwritten digits using fully connected layers:
lgraph_2 = [ ...
imageInputLayer([28 28 1])
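(The rest of the layer array is omitted above; for context, a minimal fully connected MNIST network might look like the sketch below. Only the input layer comes from my code; the hidden-layer size and the rest of the stack are assumptions.)

```matlab
layers = [
    imageInputLayer([28 28 1])   % 28x28 grayscale MNIST images
    fullyConnectedLayer(100)     % assumed hidden-layer size
    reluLayer
    fullyConnectedLayer(10)      % one output per digit class
    softmaxLayer
    classificationLayer];
```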
The training options are:
miniBatchSize = 10;
valFrequency = 5;
options = trainingOptions('sgdm', ...
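(The full call is truncated above; a sketch of what it might look like is below. Only 'sgdm', miniBatchSize, and valFrequency come from my snippet; the remaining name-value pairs, including 'ExecutionEnvironment', are assumptions for illustration.)

```matlab
options = trainingOptions('sgdm', ...
    'MiniBatchSize', miniBatchSize, ...        % 10, from the snippet above
    'ValidationFrequency', valFrequency, ...   % 5, from the snippet above
    'ExecutionEnvironment', 'auto', ...        % assumed; 'auto' is the default
    'Plots', 'training-progress');
```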
I expected training to be much faster on a GPU. However, on my MacBook (single CPU) it takes about an hour for roughly 2500 iterations, and on my desktop with an RTX 2080 Ti it takes even longer. MATLAB detects the GPU properly (I checked the GPU information with gpuDevice), so I don't know how to accelerate the training process.
Thank you in advance.