MATLAB Examples

Create and train a simple convolutional neural network for deep learning classification. Convolutional neural networks are essential tools for deep learning, and are especially suited for image recognition.

Use transfer learning to retrain AlexNet, a pretrained convolutional neural network, to classify a new set of images. Try this example to see how simple it is to get started with deep learning.
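
The retraining workflow above can be sketched as follows. This is a minimal outline, assuming the AlexNet support package is installed; `imds` is a placeholder for your own labeled `imageDatastore`.

```matlab
% Transfer learning sketch: replace AlexNet's final layers and retrain.
net = alexnet;
layersTransfer = net.Layers(1:end-3);        % keep all but the last 3 layers
numClasses = numel(categories(imds.Labels)); % classes in the new image set
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses)          % new task-specific layer
    softmaxLayer
    classificationLayer];
options = trainingOptions('sgdm', ...
    'MiniBatchSize',10, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',1e-4);                % small rate: earlier layers change little
netTransfer = trainNetwork(imds,layers,options);
```

A low initial learning rate keeps the pretrained features largely intact while the new final layers adapt to the new classes.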

A linear neuron is trained to find a non-unique solution to an underdetermined problem.

A linear neuron is designed to respond to specific inputs with target outputs.
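
Designing (rather than training) a linear neuron can be done in one call. A minimal sketch with hypothetical data points:

```matlab
% Design a linear neuron that maps inputs P to targets T.
P = [1 2 3];          % example inputs (hypothetical data)
T = [2 4.1 5.9];      % example targets
net = newlind(P,T);   % solves for weight and bias in the least-squares sense
Y = sim(net,P);       % outputs closely approximate T
```

NEWLIND computes the weights directly from the normal equations, so no iterative training is needed.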

A linear neuron is trained to find the minimum error solution for a problem with linearly dependent input vectors. If a linear dependence in the input vectors is not matched in the target vectors, the problem is nonlinear and has no exact linear solution.

A linear neuron is trained to find the minimum sum-squared error linear fit to a nonlinear input/output problem.

A linear neuron is allowed to adapt so that given one signal, it can predict a second signal.

A linear neuron is trained to respond to specific inputs with target outputs.

A linear neuron is trained to find the minimum error solution for a simple problem. The neuron is trained with a learning rate larger than the one suggested by MAXLINLR.

Illustrates how a self-organizing map neural network can cluster iris flowers into classes topologically, providing insight into the types of flowers and a useful tool for further analysis.

Demonstrates looking for patterns in gene expression profiles in baker's yeast using neural networks.

Visualize the features learned by convolutional neural networks.

Feed an image to a convolutional neural network and display the activations of different layers of the network. Examine the activations and discover which features the network learns by comparing areas of activation with the original image.

Generate images using deepDreamImage with the pretrained convolutional neural network AlexNet.
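
A minimal sketch of the deepDreamImage workflow, assuming the AlexNet support package is installed; the layer and channel choices here are arbitrary illustrations:

```matlab
% Visualize what units in AlexNet's final fully connected layer respond to.
net = alexnet;
layer = 'fc8';                 % final fully connected layer (class units)
channels = 1:5;                % first five class channels (arbitrary choice)
I = deepDreamImage(net,layer,channels, ...
    'PyramidLevels',1);        % fewer pyramid levels for a quick sketch
figure
montage(I)                     % one generated image per requested channel
```

Increasing 'PyramidLevels' produces larger, more detailed images at the cost of computation time.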

Fit a regression model using convolutional neural networks to predict the angles of rotation of handwritten digits.

Fine-tune a pretrained AlexNet convolutional neural network to perform classification on a new collection of images.

Classify images from a webcam in real time using the pretrained deep convolutional neural network AlexNet.

Apply Bayesian optimization to deep learning and find optimal network parameters and training options for convolutional neural networks.

Extract learned features from a pretrained convolutional neural network, and use those features to train an image classifier. Feature extraction is the easiest and fastest way to use the representational power of pretrained deep networks.

Classify sequence data using Long Short-Term Memory (LSTM) networks.
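
The LSTM classification setup can be sketched as below. The sizes are illustrative, and `XTrain`/`YTrain` are placeholders for your own cell array of sequences and categorical labels.

```matlab
% Sequence-to-label classification with a single LSTM layer.
inputSize = 12;        % features per time step (illustrative)
numHiddenUnits = 100;
numClasses = 9;
layers = [
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits,'OutputMode','last')  % one label per sequence
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam','MaxEpochs',20);
net = trainNetwork(XTrain,YTrain,layers,options);  % XTrain: cell array of sequences
```

With 'OutputMode','last', only the final hidden state feeds the classifier, which is what sequence-to-label classification needs.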

Save checkpoint networks while training a convolutional neural network and resume training from a previously saved network.
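
Checkpointing is enabled through a training option. A sketch, where the checkpoint filename shown is illustrative (the toolbox generates timestamped names), and `XTrain`, `YTrain`, and `layers` are your own data and architecture:

```matlab
% Save a checkpoint network to disk after each epoch.
options = trainingOptions('sgdm', ...
    'CheckpointPath',tempdir);            % directory for checkpoint .mat files
net = trainNetwork(XTrain,YTrain,layers,options);

% Later: load a saved checkpoint and resume training from its layers.
load('net_checkpoint__example.mat','net') % illustrative filename
net2 = trainNetwork(XTrain,YTrain,net.Layers,options);
```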

You can use a user-defined output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for classification using a user-defined classification output layer.

You can use a user-defined output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for regression using a user-defined regression output layer.

You can use a user-defined layer in the same way as any other layer in Neural Network Toolbox. This section shows how to create and train a network for digit classification using a user-defined layer.

You can use a user-defined layer in the same way as any other layer in Neural Network Toolbox. This section shows how to create and train a network for digit classification using the PReLU layer.

You can use a user-defined output layer in the same way as any other output layer in Neural Network Toolbox. This section shows how to create and train a network for regression using the user-defined regression output layer.

Illustrates how a NARX (Nonlinear AutoRegressive with eXternal input) neural network can model a magnet levitation dynamical system.
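
A minimal NARX sketch using the toolbox's maglev sample data; the delay and hidden-layer sizes below are illustrative choices:

```matlab
% Model the magnet levitation system with a NARX network.
[X,T] = maglev_dataset;                 % sample input/target time series
net = narxnet(1:2,1:2,10);              % input delays 1:2, feedback delays 1:2, 10 hidden neurons
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);  % shift data to fill the tapped delay lines
net = train(net,Xs,Ts,Xi,Ai);
Y = net(Xs,Xi,Ai);                      % open-loop predictions
```

PREPARETS handles the time shifting that the delay lines require, so the same data preparation works if the delays are changed.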

Load the sample data.

Illustrates how a function fitting neural network can estimate body fat percentage based on anatomical measurements.
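
A minimal sketch of the body fat regression workflow using the toolbox's sample dataset; the hidden-layer size is an arbitrary choice:

```matlab
% Fit body fat percentage from anatomical measurements.
[x,t] = bodyfat_dataset;     % measurements (inputs) and body fat (targets)
net = fitnet(10);            % one hidden layer with 10 neurons
net = train(net,x,t);
y = net(x);                  % estimated body fat percentages
perf = perform(net,t,y);     % mean squared error of the fit
```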

A Hopfield network is designed with target stable points. The behavior of the Hopfield network for different initial conditions is studied.

A Hopfield network with five neurons is designed to have four stable equilibria. However, unavoidably, it has other undesired equilibria.

A Hopfield network consisting of two neurons is designed with two stable equilibrium points and simulated using the above functions.
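
A two-neuron Hopfield sketch along these lines (the initial condition is a random illustration; the simulation call follows the documented cell-array form):

```matlab
% Design a Hopfield network with two stable equilibrium points.
T = [1 -1; -1 1]';              % target stable points as columns
net = newhop(T);
Ai = {rands(2,1)};              % random initial condition
[Y,Pf,Af] = net({1 20},{},Ai);  % simulate for 20 time steps
Y{end}                          % state settles near one column of T
```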

A Hopfield network is designed with target stable points. However, while NEWHOP finds a solution with the minimum number of unspecified stable points, such points do often occur. The Hopfield network designed here is shown to have an undesired equilibrium.

An LVQ network is trained to classify input vectors according to given targets.
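
A minimal LVQ sketch with hypothetical data; the number of hidden (competitive) neurons is an arbitrary choice:

```matlab
% Train an LVQ network to classify 2-D input vectors.
x = [0 1 2 3; 0 1 2 3];         % hypothetical input vectors (columns)
t = ind2vec([1 1 2 2]);         % class targets as one-hot vectors
net = lvqnet(4);                % 4 competitive neurons
net = train(net,x,t);
classes = vec2ind(net(x));      % predicted class indices
```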

This example was authored by the MathWorks community.

Illustrates how a pattern recognition neural network can classify wines by winery based on their chemical characteristics.

Illustrates how to train a neural network to perform simple character recognition.

Use Neural Network Toolbox™ autoencoders functionality for training a deep neural network to classify images of digits.
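
The stacked-autoencoder workflow can be sketched as below. The hidden sizes and epoch counts are illustrative; `xTrain` (images as columns) and `tTrain` (one-hot labels) are placeholders for your own data.

```matlab
% Greedy layer-wise pretraining, then stacking and fine-tuning.
autoenc1 = trainAutoencoder(xTrain,100,'MaxEpochs',100); % first feature layer
feat1 = encode(autoenc1,xTrain);
autoenc2 = trainAutoencoder(feat1,50,'MaxEpochs',100);   % second feature layer
feat2 = encode(autoenc2,feat1);
softnet = trainSoftmaxLayer(feat2,tTrain);               % classifier on deep features
deepnet = stack(autoenc1,autoenc2,softnet);              % one deep network
deepnet = train(deepnet,xTrain,tTrain);                  % fine-tune end to end
```

Each autoencoder is trained on the features produced by the previous one, and the final supervised fine-tuning pass usually improves accuracy noticeably.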

Illustrates using a neural network as a classifier to identify the sex of crabs from physical dimensions of the crab.

Demonstrates using a neural network to detect cancer from mass spectrometry data on protein profiles.

A 2-input hard limit neuron fails to properly classify 5 input vectors because they are linearly non-separable.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories. Despite the fact that one input vector is much bigger than the others, training with LEARNPN is quick.

A 2-input hard limit neuron is trained to classify 5 input vectors into two categories. However, because 1 input vector is much larger than all of the others, training takes a long time.

Uses functions NEWPNN and SIM.

Uses the NEWRB function to create a radial basis network that approximates a function defined by a set of data points.
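
A minimal NEWRB sketch approximating a hypothetical function; the error goal and spread values are illustrative:

```matlab
% Approximate a function with a radial basis network.
P = -1:.1:1;                   % sample input points
T = sin(P*pi);                 % function values to approximate (hypothetical)
goal = 0.01; spread = 1;
net = newrb(P,T,goal,spread);  % adds neurons until the MSE goal is met
Y = sim(net,P);                % approximated function values
```

NEWRB grows the network one neuron at a time, so the error goal directly controls the final network size.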

A radial basis network is trained to respond to specific inputs with target outputs. However, because the spread of the radial basis neurons is too high, each neuron responds in essentially the same way, and the network cannot be designed.

A radial basis network is trained to respond to specific inputs with target outputs. However, because the spread of the radial basis neurons is too low, the network requires many neurons.

Uses functions NEWGRNN and SIM.

Neurons in a competitive layer learn to represent different regions of the input space where input vectors occur.

Neurons in a 2-D layer learn to represent different regions of the input space where input vectors occur. In addition, neighboring neurons learn to respond to similar inputs, thus the layer learns the topology of the input space.
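
A minimal 2-D self-organizing map sketch with hypothetical random data; the grid size is an arbitrary choice:

```matlab
% Train an 8-by-8 self-organizing map on random 2-D inputs.
x = rands(2,400);              % 400 random 2-D input vectors (hypothetical)
net = selforgmap([8 8]);       % 8-by-8 grid of neurons
net = train(net,x);
plotsompos(net,x)              % neuron positions laid over the input space
```

After training, neighboring neurons on the grid respond to nearby regions of the input space, which is the topological ordering the example describes.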

As in DEMOSM1, this self-organizing map will learn to represent different regions of the input space where input vectors occur. In this example, however, the neurons will arrange themselves in a two-dimensional grid, rather than a line.

Illustrates how to design a linear neuron to predict the next value in a time series given the last five values.

Illustrates how an adaptive linear layer can learn to predict the next value in a signal, given the current and last four values.

Teaches how to use the Metropolis algorithm to simulate the Ising model of a ferromagnet in MATLAB.

    x_s = sym('x_s');
    y_s = 2/(1+exp(-2*x_s)) - 1;  % equation of the hyperbolic tangent, from apply_transfer
    dy_s = diff(y_s,x_s);         % put into apply_transfer of the modified file
    ddy_s = diff(dy_s,x_s);

The Center for Open Data in Humanities launched the Japanese Classics Character Dataset in November 2016 [1]. This is a large dataset of various hand-written characters from classical documents.
