MATLAB Examples

Classify an image using the pretrained deep convolutional neural network GoogLeNet.

Create and train a simple convolutional neural network for deep learning classification. Convolutional neural networks are essential tools for deep learning, and are especially suited to analyzing image data.

Use transfer learning to retrain AlexNet, a pretrained convolutional neural network, to classify a new set of images. Try this example to see how simple it is to get started with deep learning.
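As a hedged sketch of the transfer-learning workflow described above: replace the final classification layers of the pretrained network and retrain on new data. The folder name, class count, and training options below are illustrative assumptions, and the images are assumed to already be 227-by-227 RGB (otherwise use an augmentedImageDatastore to resize them).

```matlab
% Requires Deep Learning Toolbox and the AlexNet support package.
net = alexnet;                         % load the pretrained network
layersTransfer = net.Layers(1:end-3);  % keep everything except the last three layers
numClasses = 5;                        % assumed number of new classes

layers = [
    layersTransfer
    fullyConnectedLayer(numClasses, ...
        'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)  % learn faster in new layers
    softmaxLayer
    classificationLayer];

% Hypothetical image folder with one subfolder per class.
imds = imageDatastore('myImages', ...
    'IncludeSubfolders',true,'LabelSource','foldernames');

options = trainingOptions('sgdm', ...
    'MiniBatchSize',10, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',1e-4);          % small rate to avoid disturbing pretrained weights

netTransfer = trainNetwork(imds,layers,options);
```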

Illustrates how a self-organizing map neural network can cluster iris flowers into classes topologically, providing insight into the types of flowers and a useful tool for further analysis.

Demonstrates looking for patterns in gene expression profiles in baker's yeast using neural networks.

Fine-tune a pretrained AlexNet convolutional neural network to perform classification on a new collection of images.

Extract learned image features from a pretrained convolutional neural network, and use those features to train an image classifier. Feature extraction is the easiest and fastest way to use the representational power of pretrained deep networks.

Use transfer learning to retrain GoogLeNet, a pretrained convolutional neural network, to classify a new set of images.

Create a deep learning neural network with residual connections and train it on CIFAR-10 data. Residual connections are a popular element of convolutional neural network architectures.

Fit a regression model using convolutional neural networks to predict the angles of rotation of handwritten digits.
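A minimal sketch of a CNN regression architecture of the kind the entry above describes: the network ends with a fully connected layer of size one followed by a regression layer, instead of a softmax and classification layer. The input size and filter counts are assumptions for 28-by-28 grayscale digit images.

```matlab
% CNN for scalar regression (e.g., predicting a rotation angle per image).
layers = [
    imageInputLayer([28 28 1])                  % grayscale digit images
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    averagePooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(1)                      % one output: the predicted angle
    regressionLayer];                           % mean-squared-error loss

options = trainingOptions('sgdm', ...
    'MaxEpochs',15,'InitialLearnRate',1e-3);
% net = trainNetwork(XTrain,YTrain,layers,options);  % XTrain/YTrain assumed
```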

Classify images from a webcam in real time using the pretrained deep convolutional neural network AlexNet.

Create a simple directed acyclic graph (DAG) network for deep learning, and train the network to classify images of digits. The simple network in this example consists of a main branch of sequentially connected layers plus a shortcut connection.

Save checkpoint networks while training a convolutional neural network and resume training from a previously saved network.

You can use a user-defined layer in the same way as any other layer in Neural Network Toolbox. This section shows how to create and train a network for digit classification using the PReLU layer.

You can use a custom output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for classification using the custom output layer.

Check the layer validity of the example layer examplePreluLayer.

You can use your layer in the same way as any other layer in Neural Network Toolbox.

You can use a user-defined output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for regression using a custom output layer.

You can use a custom output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for regression using the custom output layer.

Visualize the features learned by convolutional neural networks.

Apply Bayesian optimization to deep learning and find optimal network parameters and training options for convolutional neural networks.

Define an output function that runs at each iteration during training of deep learning neural networks. If you specify output functions by using the 'OutputFcn' name-value pair of trainingOptions, then trainNetwork calls these functions during training.

Generate images using deepDreamImage with the pretrained convolutional neural network AlexNet.

When you train networks for deep learning, it is often useful to monitor the training progress. By plotting various metrics during training, you can learn how the training is progressing.

Feed an image to a convolutional neural network and display the activations of different layers of the network. Examine the activations and discover which features the network learns by comparing areas of activation with the original image.

Upload your data to an Amazon S3 bucket. Before you can perform deep learning training in the cloud, you need to upload your data to the cloud. The example shows how to download the CIFAR-10 data set to your computer and then upload it to an Amazon S3 bucket.

Send deep learning training batch jobs to a cluster so that you can continue working or close MATLAB during training. Training deep neural networks often takes a lot of time. This example shows how to offload training to a cluster as a batch job.

Use a parfor loop to perform a parameter sweep on a training option. Deep learning training often takes hours or days, and searching for good training options can be difficult. You can use a parfor loop to try many values in parallel.
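A minimal sketch of such a parfor parameter sweep, here over the initial learning rate. The candidate values are illustrative, and the training data (XTrain, YTrain), validation data, and layer array are assumed to exist in the workspace; a parallel pool is assumed to be available via Parallel Computing Toolbox.

```matlab
% Sweep initial learning rates in parallel and record validation accuracy.
learnRates = [1e-2 1e-3 1e-4];          % illustrative candidate values
accuracy = zeros(size(learnRates));

parfor i = 1:numel(learnRates)
    options = trainingOptions('sgdm', ...
        'InitialLearnRate',learnRates(i), ...
        'MaxEpochs',10, ...
        'Verbose',false);
    net = trainNetwork(XTrain,YTrain,layers,options);   % assumed data and layers
    YPred = classify(net,XValidation);
    accuracy(i) = mean(YPred == YValidation);
end

[bestAcc,idx] = max(accuracy);
bestRate = learnRates(idx);
```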

Use parfeval for a parameter sweep on the depth of the network architecture. Deep learning training often takes hours or days, and searching for good architectures can be difficult. You can use parfeval to train many networks in parallel without blocking MATLAB.

Train a convolutional neural network on CIFAR-10 using the built-in MATLAB support for parallel training. Deep learning training often takes hours or days. You can use parallel computing to speed up training.

Predict the remaining useful life (RUL) of engines by using deep learning.

Classify each time step of sequence data using a long short-term memory (LSTM) network.

Train a simple deep learning model that detects the presence of speech commands in audio. The example uses the Speech Commands Dataset [1] to train a convolutional neural network to recognize a given set of commands.

Classify sequence data using a long short-term memory (LSTM) network.
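As a hedged sketch of the sequence-to-label LSTM workflow above: the key choice is 'OutputMode','last', so the LSTM emits one prediction per sequence rather than per time step. The feature dimension, hidden-unit count, and class count are assumptions; XTrain is assumed to be a cell array of sequences and YTrain a categorical label vector.

```matlab
% LSTM network for sequence classification (sizes are illustrative).
inputSize = 12;        % features per time step (assumed)
numHiddenUnits = 100;
numClasses = 9;        % assumed number of label classes

layers = [
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits,'OutputMode','last')  % one output per sequence
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',30, ...
    'GradientThreshold',1, ...   % clip gradients for stability
    'Verbose',false);

net = trainNetwork(XTrain,YTrain,layers,options);  % assumed data
YPred = classify(net,XTest);
```

For per-time-step labels (as in the sequence-to-sequence entry above), 'OutputMode','sequence' would be used instead.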

Forecast time series data using a long short-term memory (LSTM) network.

Illustrates how a NARX (Nonlinear AutoRegressive with eXternal input) neural network can model a magnet levitation dynamical system.

Load the sample data.

Illustrates how a function fitting neural network can estimate body fat percentage based on anatomical measurements.

An LVQ network is trained to classify input vectors according to given targets.

This example was authored by the MathWorks community.

Illustrates how a pattern recognition neural network can classify wines by winery based on their chemical characteristics.

Illustrates how to train a neural network to perform simple character recognition.

Use Neural Network Toolbox™ autoencoder functionality to train a deep neural network to classify images of digits.

Illustrates using a neural network as a classifier to identify the sex of crabs from their physical dimensions.

Demonstrates using a neural network to detect cancer from mass spectrometry data on protein profiles.

A 2-input hard limit neuron fails to properly classify 5 input vectors because they are linearly non-separable.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories. Despite the fact that one input vector is much bigger than the others, training with LEARNPN is quick.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories. However, because 1 input vector is much larger than all of the others, training takes a long time.

Uses functions NEWPNN and SIM.

Uses the NEWRB function to create a radial basis network that approximates a function defined by a set of data points.
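A short sketch of the NEWRB workflow from the entry above: the function adds radial basis neurons until the mean-squared-error goal is met. The target function, error goal, and spread below are illustrative assumptions.

```matlab
% Approximate a noisy sine wave with a radial basis network.
X = -1:0.1:1;                         % input data points
T = sin(2*pi*X) + 0.1*randn(size(X)); % noisy targets (assumed example function)

goal = 0.02;   % mean-squared-error goal
spread = 0.5;  % spread of the radial basis functions

net = newrb(X,T,goal,spread);  % neurons are added until the goal is reached
Y = net(X);                    % simulate the trained network on the inputs
```

A spread that is too large or too small reproduces the failure modes described in the next two entries: neurons that all respond alike, or a network that needs one neuron per data point.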

A radial basis network is trained to respond to specific inputs with target outputs. However, because the spread of the radial basis neurons is too high, each neuron responds essentially the same, and the network cannot be designed properly.

A radial basis network is trained to respond to specific inputs with target outputs. However, because the spread of the radial basis neurons is too low, the network requires many neurons.

Uses functions NEWGRNN and SIM.

Neurons in a competitive layer learn to represent different regions of the input space where input vectors occur.

Neurons in a 2-D layer learn to represent different regions of the input space where input vectors occur. In addition, neighboring neurons learn to respond to similar inputs, thus the layer learns the topology of the input space.

As in one-dimensional problems, this self-organizing map will learn to represent different regions of the input space where input vectors occur. In this example, however, the neurons will arrange themselves in a two-dimensional grid rather than a line.

Illustrates how to design a linear neuron to predict the next value in a time series given the last five values.

Illustrates how an adaptive linear layer can learn to predict the next value in a signal, given the current and last four values.

Teaches how to use the Metropolis algorithm to simulate the Ising model of a ferromagnet in MATLAB.

The Center for Open Data in the Humanities launched the Japanese Classics Character Dataset in November 2016 [1]. This is a large dataset of various hand-written characters from classical documents.

x_s = sym('x_s');
y_s = 2/(1 + exp(-2*x_s)) - 1;  % equation of hyperbolic tangent, from apply_transfer
dy_s = diff(y_s, x_s);          % first derivative; put into apply_transfer of modified file
ddy_s = diff(dy_s, x_s);        % second derivative
