This code provides hands-on examples of implementing convolutional neural networks (CNNs) for object recognition. Each of the three demos has an accompanying instructional video, providing a complete tutorial for understanding and implementing deep learning techniques.
The demos include:
- Training a neural network from scratch
- Using a pre-trained model (transfer learning)
- Using a neural network as a feature extractor
The corresponding videos for the demos are located here: https://www.mathworks.com/videos/series/deep-learning-with-MATLAB.html
The use of a GPU and Parallel Computing Toolbox™ is recommended when running the examples. Demo 3 requires Statistics and Machine Learning Toolbox™ in addition to the required products below.
Thank you for such a great video, but I have a question.
Can we use only 16 layers of AlexNet instead of all of its layers?
And if yes, how can we do that? Can you help me, please?
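One way to do this (a minimal, untested sketch; the cut-off index and the replacement layers are assumptions, not part of the demo) is to slice the pretrained network's Layers array and append new layers for your own task:

```matlab
% Sketch: keep only the first 16 layers of AlexNet and append new
% task-specific layers. Inspect net.Layers first to choose a sensible
% cut-off point; 16 is just an example.
net = alexnet;                       % requires the AlexNet support package
layersTransfer = net.Layers(1:16);   % keep only the first 16 layers
numClasses = 5;                      % placeholder: your number of classes
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
```

The resulting layer array can then be passed to trainNetwork as in the transfer learning demo.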
How can I change the code so that it downloads CIFAR-100 from the URL and prepares the .mat files as image folders, as it does for CIFAR-10? I tried on my own by modifying the image-folder saving code for the CIFAR-100 .mat files, but it didn't work. Please help.
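A rough, untested sketch of such an adaptation is below. The variable names inside the .mat files (data, fine_labels, fine_label_names) and the 0-based labels follow the CIFAR-100 "MATLAB version" layout described on the dataset page; verify them against your downloaded files.

```matlab
% Sketch: download CIFAR-100 and write each training image into a
% per-class folder, analogous to the CIFAR-10 preparation code.
url = 'https://www.cs.toronto.edu/~kriz/cifar-100-matlab.tar.gz';
websave('cifar-100-matlab.tar.gz', url);
untar('cifar-100-matlab.tar.gz');             % creates cifar-100-matlab/
load(fullfile('cifar-100-matlab','train.mat'), 'data', 'fine_labels');
load(fullfile('cifar-100-matlab','meta.mat'), 'fine_label_names');

% Each row of data is one 32x32x3 image, stored plane by plane (R, G, B).
for i = 1:size(data,1)
    img = reshape(data(i,:), 32, 32, 3);
    img = permute(img, [2 1 3]);              % row-major to MATLAB column-major
    folder = fullfile('cifar100Train', fine_label_names{fine_labels(i)+1});
    if ~exist(folder, 'dir'); mkdir(folder); end
    imwrite(img, fullfile(folder, sprintf('image%05d.png', i)));
end
```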
Downloading 174MB CIFAR-10 dataset...
Error using websave (line 106)
The error "Error copying data." occurred while communicating with URL
Error in DownloadCIFAR10 (line 13)
How can I correct this error?
Problem solved! - "Out of Memory on device...."
The trainNetwork() function takes an opts parameter.
I decreased the 'MiniBatchSize' in opts from 128 to 64.
Training also took more time.
When I ran the file Demo_TransferLearning.mlx, it crashed at the trainNetwork() function with the message "Out of Memory on device....".
I tried the gpuDevice(1) statement, but the error occurred again.
My GPU is a GeForce GTX 965M with 4 GB.
Should I configure some parameters in MATLAB?
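The fix described above can be sketched as follows; the option values besides MiniBatchSize are placeholders, not the demo's exact settings:

```matlab
% Sketch: lowering MiniBatchSize reduces GPU memory use at the cost of
% longer training time. Halve it again if out-of-memory errors persist.
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 64, ...          % was 128
    'InitialLearnRate', 1e-4, ...     % placeholder value
    'MaxEpochs', 20);                 % placeholder value
net = trainNetwork(imds_train, layers, opts);
```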
In the program demo_TrainingFromScratch:
fc1 = fullyConnectedLayer(64,'BiasLearnRateFactor',2);
In the above code, how are the 64 neurons in the fully connected layer chosen?
fc1.Weights = single(randn([64 576])*0.1);
Also, what does 576 indicate in the above code?
confMat = confusionmat(imds_test.Labels, labels);
confMat = confMat./sum(confMat,2);
When I run this part of the code I get this message: "Error using ./ Matrix dimensions must agree." What should I change to make it right?
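One likely cause (an assumption, since the MATLAB release isn't stated): on releases before R2016b, ./ does not implicitly expand a matrix against a column vector, which triggers exactly this error. bsxfun performs the same row-wise normalization on older releases:

```matlab
% Sketch: row-normalize the confusion matrix without relying on the
% implicit expansion introduced in R2016b.
confMat = confusionmat(imds_test.Labels, labels);
confMat = bsxfun(@rdivide, confMat, sum(confMat, 2));
```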
Very helpful. I hope there will be something on regression in the future.
Is there a minimum required version of MATLAB? I am getting errors saying "Undefined function"; maybe my MATLAB version doesn't have that function implemented.
I think you need to change convnet to net in Demo_FeatureExtraction.
Thanks for your video. I have one question: do I need a GPU on my computer to run this simulation?
Thanks for the excellent video. I have a question about the transfer learning demo. In MATLAB, all layers except the last three (or any other number) are extracted from the pretrained network, and the last three layers are replaced with new ones. There are several ways to fine-tune: fine-tune the whole network, fine-tune only the last classifier layer, or fine-tune from a specified layer onward. So I wonder which approach MATLAB uses: does trainNetwork retrain (fine-tune) the whole network, or only the last classifier layer?
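For what it's worth, trainNetwork updates the weights of every layer whose learn rate factors are nonzero, so by default the whole network is fine-tuned. A sketch of the two other strategies (layer indices, numClasses, and learn rate factor values are illustrative assumptions, not the demo's settings):

```matlab
% Sketch: to fine-tune only the new classifier layers, freeze the
% transferred layers by setting their learn rate factors to zero.
net = alexnet;
layersTransfer = net.Layers(1:end-3);
for i = 1:numel(layersTransfer)
    if isprop(layersTransfer(i), 'WeightLearnRateFactor')
        layersTransfer(i).WeightLearnRateFactor = 0;  % freeze weights
        layersTransfer(i).BiasLearnRateFactor = 0;    % freeze biases
    end
end
numClasses = 5;  % placeholder
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses, ...
        'WeightLearnRateFactor', 20, ...  % new layer learns faster
        'BiasLearnRateFactor', 20)
    softmaxLayer
    classificationLayer];
```

Leaving the learn rate factors at their defaults instead gives whole-network fine-tuning; setting them to zero for a prefix of the layers gives "fine-tune from a specified layer onward".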
I am sorry, THIS is the error I am getting:
Undefined function or variable 'convnet'.
Minor bug fix in the third file, "Demo_FeatureExtraction.mlx":
+ Fixed a typo in the code.