This example shows how to use transfer learning to retrain SqueezeNet, a pretrained convolutional neural network, to classify a new set of images. Try this example to see how simple it is to get started with deep learning in MATLAB®.
Transfer learning is commonly used in deep learning applications. You can take a pretrained network and use it as a starting point to learn a new task. Fine-tuning a network with transfer learning is usually much faster and easier than training a network with randomly initialized weights from scratch. You can quickly transfer learned features to a new task using a smaller number of training images.
In the workspace, extract the MathWorks Merch data set. This is a small data set containing 75 images of MathWorks merchandise, belonging to five different classes (cap, cube, playing cards, screwdriver, and torch).
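If you prefer working at the command line, the extraction step can be sketched as follows, assuming the MerchData.zip archive that ships with Deep Learning Toolbox is on the MATLAB path:

```matlab
% Extract the MathWorks Merch data set into the current folder.
% MerchData.zip ships with Deep Learning Toolbox.
unzip("MerchData.zip");
```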
Open Deep Network Designer.
Select SqueezeNet from the list of pretrained networks and click Open.
Deep Network Designer displays a zoomed-out view of the whole network.
Explore the network plot. To zoom in with the mouse, use Ctrl+scroll wheel. To pan, use the arrow keys, or hold down the scroll wheel and drag the mouse. Select a layer to view its properties. Deselect all layers to view the network summary in the Properties pane.
To load the data into Deep Network Designer, on the Data tab, click Import Data > Import Image Data. The Import Image Data dialog box opens.
In the Data source list, select Folder. Click Browse and select the extracted MerchData folder.
Divide the data into 70% training data and 30% validation data.
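For reference, the import and split steps that the app performs can be sketched programmatically; the variable names here are illustrative:

```matlab
% Load the images as an imageDatastore, labeling each image by the
% name of the folder that contains it.
imds = imageDatastore("MerchData", ...
    IncludeSubfolders=true, ...
    LabelSource="foldernames");

% Split each class into 70% training and 30% validation, at random.
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,"randomized");
```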
Specify augmentation operations to perform on the training images. Data augmentation helps prevent the network from overfitting and memorizing the exact details of the training images. For this example, apply a random reflection in the x-axis, a random rotation from the range [-90,90] degrees, and a random rescaling from the range [1,2].
Click Import to import the data into Deep Network Designer.
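The same augmentation settings can be sketched in code with imageDataAugmenter; here imdsTrain stands in for an assumed training image datastore:

```matlab
% Random x-reflection, rotation in [-90,90] degrees, and scaling in [1,2].
imageAugmenter = imageDataAugmenter( ...
    RandXReflection=true, ...
    RandRotation=[-90 90], ...
    RandScale=[1 2]);

% Apply the augmentation while resizing the images to the SqueezeNet
% input size of 227-by-227 pixels.
inputSize = [227 227 3];
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
    DataAugmentation=imageAugmenter);
```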
To retrain SqueezeNet to classify new images, replace the last 2-D convolutional layer and the final classification layer of the network. In SqueezeNet, these layers are named conv10 and ClassificationLayer_predictions.
On the Designer pane, drag a new convolution2dLayer onto the canvas. To match the original convolutional layer, set FilterSize to 1,1. Set NumFilters to the number of classes in the new data, in this example, 5.
Change the learning rates so that learning is faster in the new layer than in the transferred layers by setting WeightLearnRateFactor and BiasLearnRateFactor to 10.
Delete the last 2-D convolutional layer and connect your new layer instead.
Replace the output layer. Scroll to the end of the Layer Library and drag a new classificationLayer onto the canvas. Delete the original output layer and connect your new layer in its place.
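The same layer surgery can be sketched programmatically with replaceLayer on the SqueezeNet layer graph; the new layer names here are illustrative:

```matlab
lgraph = layerGraph(squeezenet);

% New 1-by-1 convolution with one filter per class, learning faster
% than the transferred layers.
numClasses = 5;
newConv = convolution2dLayer([1 1],numClasses, ...
    WeightLearnRateFactor=10, ...
    BiasLearnRateFactor=10, ...
    Name="new_conv");
lgraph = replaceLayer(lgraph,"conv10",newConv);

% New classification output layer; the class names are set
% automatically during training.
newOutput = classificationLayer(Name="new_classoutput");
lgraph = replaceLayer(lgraph,"ClassificationLayer_predictions",newOutput);
```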
To choose the training options, select the Training tab and click Training Options. Set the initial learn rate to a small value to slow down learning in the transferred layers. In the previous step, you increased the learning rate factors for the 2-D convolutional layer to speed up learning in the new final layers. This combination of learning rate settings results in fast learning only in the new layers and slower learning in the other layers.
For this example, set InitialLearnRate to 0.0001, ValidationFrequency to 5, and MaxEpochs to 8. As there are 55 training observations, set MiniBatchSize to 11 to divide the training data evenly and ensure that the whole training set is used during each epoch.
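Expressed in code, these settings correspond to a trainingOptions call like the following (a sketch; Export > Generate Code for Training produces the exact equivalent):

```matlab
options = trainingOptions("sgdm", ...
    InitialLearnRate=0.0001, ...
    ValidationFrequency=5, ...
    MaxEpochs=8, ...
    MiniBatchSize=11);
% Add ValidationData=augimdsValidation (an assumed validation
% datastore) to monitor validation accuracy during training.
% With a layer graph and datastores prepared as in the app, training
% would then be: net = trainNetwork(augimdsTrain,lgraph,options);
```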
To train the network with the specified training options, click Close and then click Train.
Deep Network Designer allows you to visualize and monitor the training progress. You can then edit the training options and retrain the network, if required.
To export the results from training, on the Training tab, select Export > Export Trained Network and Results. Deep Network Designer exports the trained network as the variable trainedNetwork_1 and the training information as the variable trainInfoStruct_1.
You can also generate MATLAB code, which recreates the network and the training options used. On the Training tab, select Export > Generate Code for Training. Examine the MATLAB code to learn how to programmatically prepare the data for training, create the network architecture, and train the network.
Load a new image to classify using the trained network.
I = imread("MerchDataTest.jpg");
Resize the test image to match the network input size.
I = imresize(I, [227 227]);
Classify the test image using the trained network.
[YPred,probs] = classify(trainedNetwork_1,I);
imshow(I)
label = YPred;
title(string(label) + ", " + num2str(100*max(probs),3) + "%");
References
[1] ImageNet. http://www.image-net.org.
[2] Iandola, Forrest N., Song Han, Matthew W. Moskewicz, Khalid Ashraf, William J. Dally, and Kurt Keutzer. "SqueezeNet: AlexNet-Level Accuracy with 50x Fewer Parameters and <0.5 MB Model Size." Preprint, submitted November 4, 2016. https://arxiv.org/abs/1602.07360.
[3] Iandola, Forrest N. "SqueezeNet." https://github.com/forresti/SqueezeNet.