
SeriesNetwork

Series network for deep learning

Description

A series network is a neural network for deep learning with layers arranged one after the other. It has a single input layer and a single output layer.

Creation

There are several ways to create a SeriesNetwork object:

- Load a pretrained network using a function such as alexnet.
- Import a pretrained network from Caffe using importCaffeNetwork.
- Train or fine-tune a network using trainNetwork.

Properties


Layers: Network layers, specified as a Layer array.
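
For example, you can index into the Layers property to inspect individual layers. A minimal sketch, assuming a SeriesNetwork object named net (for an image classification network such as AlexNet, the first layer is the image input layer and the last is the classification output layer):

net.Layers(1)      % first layer of the network
net.Layers(end)    % final layer of the network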

Object Functions

activations               Compute convolutional neural network layer activations
classify                  Classify data using a trained deep learning neural network
predict                   Predict responses using a trained deep learning neural network
predictAndUpdateState     Predict responses using a trained recurrent neural network and update the network state
classifyAndUpdateState    Classify data using a trained recurrent neural network and update the network state
resetState                Reset the state of a recurrent neural network
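
As a minimal usage sketch (assuming a trained classification network net, input data X sized to match the network input, and a layer named 'fc7' as in AlexNet):

YPred  = classify(net,X);            % predicted class labels
scores = predict(net,X);             % class scores (softmax output)
feat   = activations(net,X,'fc7');   % activations of the layer named 'fc7'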

Examples


Load a pretrained AlexNet convolutional neural network and examine the layers and classes.

Load the pretrained AlexNet network using alexnet. The output net is a SeriesNetwork object.

net = alexnet
net = 

  SeriesNetwork with properties:

    Layers: [25×1 nnet.cnn.layer.Layer]

Using the Layers property, view the network architecture. The network consists of 25 layers. There are 8 layers with learnable weights: 5 convolutional layers and 3 fully connected layers.

net.Layers
ans = 

  25x1 Layer array with layers:

     1   'data'     Image Input                   227x227x3 images with 'zerocenter' normalization
     2   'conv1'    Convolution                   96 11x11x3 convolutions with stride [4  4] and padding [0  0]
     3   'relu1'    ReLU                          ReLU
     4   'norm1'    Cross Channel Normalization   cross channel normalization with 5 channels per element
     5   'pool1'    Max Pooling                   3x3 max pooling with stride [2  2] and padding [0  0]
     6   'conv2'    Convolution                   256 5x5x48 convolutions with stride [1  1] and padding [2  2]
     7   'relu2'    ReLU                          ReLU
     8   'norm2'    Cross Channel Normalization   cross channel normalization with 5 channels per element
     9   'pool2'    Max Pooling                   3x3 max pooling with stride [2  2] and padding [0  0]
    10   'conv3'    Convolution                   384 3x3x256 convolutions with stride [1  1] and padding [1  1]
    11   'relu3'    ReLU                          ReLU
    12   'conv4'    Convolution                   384 3x3x192 convolutions with stride [1  1] and padding [1  1]
    13   'relu4'    ReLU                          ReLU
    14   'conv5'    Convolution                   256 3x3x192 convolutions with stride [1  1] and padding [1  1]
    15   'relu5'    ReLU                          ReLU
    16   'pool5'    Max Pooling                   3x3 max pooling with stride [2  2] and padding [0  0]
    17   'fc6'      Fully Connected               4096 fully connected layer
    18   'relu6'    ReLU                          ReLU
    19   'drop6'    Dropout                       50% dropout
    20   'fc7'      Fully Connected               4096 fully connected layer
    21   'relu7'    ReLU                          ReLU
    22   'drop7'    Dropout                       50% dropout
    23   'fc8'      Fully Connected               1000 fully connected layer
    24   'prob'     Softmax                       softmax
    25   'output'   Classification Output         crossentropyex with 'tench', 'goldfish', and 998 other classes

The ClassNames property of the classification output layer (the final layer) contains the names of the classes learned by the network. View the first 10 classes by selecting the first 10 elements.

net.Layers(end).ClassNames(1:10)
ans =

  1×10 cell array

  Columns 1 through 4

    'tench'    'goldfish'    'great white shark'    'tiger shark'

  Columns 5 through 9

    'hammerhead'    'electric ray'    'stingray'    'cock'    'hen'

  Column 10

    'ostrich'
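
With the pretrained network in memory, you can classify a new image after resizing it to the network input size. This is a minimal sketch: peppers.png is a sample image shipped with MATLAB and is used here only for illustration, and imresize requires Image Processing Toolbox.

I = imread('peppers.png');            % read a sample image
inputSize = net.Layers(1).InputSize;  % [227 227 3] for AlexNet
I = imresize(I,inputSize(1:2));       % resize to the network input size
label = classify(net,I)               % predict the class label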

Import network layers from Caffe.

Specify the example file 'digitsnet.prototxt' to import.

protofile = 'digitsnet.prototxt';

Import the network layers.

layers = importCaffeLayers(protofile)
layers = 

  1x7 Layer array with layers:

     1   'testdata'   Image Input             28x28x1 images
     2   'conv1'      Convolution             20 5x5x1 convolutions with stride [1  1] and padding [0  0]
     3   'relu1'      ReLU                    ReLU
     4   'pool1'      Max Pooling             2x2 max pooling with stride [2  2] and padding [0  0]
     5   'ip1'        Fully Connected         10 fully connected layer
     6   'loss'       Softmax                 softmax
     7   'output'     Classification Output   crossentropyex with 'class1', 'class2', and 8 other classes
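
importCaffeLayers imports only the layer definitions. If you also have the corresponding weights file, you can import the complete network as a SeriesNetwork object in one step using importCaffeNetwork. A minimal sketch, where 'digitsnet.caffemodel' is a placeholder name for the weights file:

protofile = 'digitsnet.prototxt';
datafile  = 'digitsnet.caffemodel';   % placeholder weights file name
net = importCaffeNetwork(protofile,datafile);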

Create and train a convolutional neural network for image classification.

Load the sample data.

[XTrain,TTrain] = digitTrain4DArrayData;

digitTrain4DArrayData loads the digit training set as 4-D array data. XTrain is a 28-by-28-by-1-by-5000 array, where 28 is the height and 28 is the width of the images. The number of channels is 1, and the number of synthetic images of handwritten digits is 5000. TTrain is a categorical vector containing the labels for each observation.
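
To get a feel for the data, you can display a few of the training images with their labels. This is an optional visualization sketch:

figure
perm = randperm(5000,20);             % choose 20 training images at random
for i = 1:20
    subplot(4,5,i)
    imshow(XTrain(:,:,1,perm(i)))     % display the image
    title(char(TTrain(perm(i))))      % show its label
end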

Construct the convolutional neural network architecture.

layers = [...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Set the options to the default settings for stochastic gradient descent with momentum (SGDM). Set 'Verbose' to false to suppress detailed output on the training progress.

options = trainingOptions('sgdm','Verbose',false);
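
If you want to control training more explicitly, trainingOptions accepts additional name-value arguments. As an optional variant (values chosen only for illustration, not required for this example):

options = trainingOptions('sgdm', ...
    'MaxEpochs',15, ...               % limit the number of training epochs
    'InitialLearnRate',0.01, ...      % initial learning rate for SGDM
    'Verbose',false);                 % suppress detailed progress output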

Train the network.

net = trainNetwork(XTrain,TTrain,layers,options);

Run the trained network on a test set and predict the image labels (digits).

[XTest,TTest] = digitTest4DArrayData;
YTest = classify(net,XTest);

Calculate the accuracy.

accuracy = sum(YTest==TTest)/numel(TTest)
accuracy = 0.9770
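
Beyond the overall accuracy, you can check how the network performs on each digit class. A minimal sketch using basic MATLAB functions:

classes = categories(TTest);                       % the digit classes
for i = 1:numel(classes)
    idx = (TTest == classes{i});                   % observations of this class
    fprintf('%s: %.4f\n',classes{i}, ...
        sum(YTest(idx) == TTest(idx))/sum(idx));   % per-class accuracy
end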

Introduced in R2016a
