Neural Network Toolbox Functions

Alphabetical List
activations Compute convolutional neural network layer activations
adapt Adapt neural network to data as it is simulated
adaptwb Adapt network with weight and bias learning rules
adddelay Add delay to neural network response
alexnet Pretrained AlexNet convolutional neural network
Autoencoder Autoencoder class
AveragePooling2DLayer Average pooling layer object
averagePooling2dLayer Average pooling layer object
boxdist Distance between two position vectors
cascadeforwardnet Cascade-forward neural network
catelements Concatenate neural network data elements
catsamples Concatenate neural network data samples
catsignals Concatenate neural network data signals
cattimesteps Concatenate neural network data timesteps
classificationLayer Create a classification output layer
ClassificationOutputLayer Classification output layer
classify Classify data using a trained convolutional neural network
closeloop Convert neural network open-loop feedback to closed loop
combvec Create all combinations of vectors
compet Competitive transfer function
competlayer Competitive layer
con2seq Convert concurrent vectors to sequential vectors
configure Configure network inputs and outputs to best match input and target data
confusion Classification confusion matrix
Convolution2DLayer Convolutional layer
convolution2dLayer Convolutional layer
convwf Convolution weight function
CrossChannelNormalizationLayer Channel-wise local response normalization layer
crossChannelNormalizationLayer Channel-wise local response normalization layer
crossentropy Neural network performance
decode Decode encoded data
deepDreamImage Visualize network features using deep dream
disp Neural network properties
display Name and properties of neural network variables
dist Euclidean distance weight function
distdelaynet Distributed delay network
divideblock Divide targets into three sets using blocks of indices
divideind Divide targets into three sets using specified indices
divideint Divide targets into three sets using interleaved indices
dividerand Divide targets into three sets using random indices
dividetrain Assign all targets to training set
dotprod Dot product weight function
DropoutLayer Dropout layer
dropoutLayer Dropout layer
elliot2sig Elliot 2 symmetric sigmoid transfer function
elliotsig Elliot symmetric sigmoid transfer function
elmannet Elman neural network
encode Encode input data
errsurf Error surface of single-input neuron
extendts Extend time series data to given number of timesteps
feedforwardnet Feedforward neural network
fitnet Function fitting neural network
fixunknowns Process data by marking rows with unknown values
formwb Form bias and weights into single vector
fromnndata Convert data from standard neural network cell array form
FullyConnectedLayer Fully connected layer
fullyConnectedLayer Fully connected layer
gadd Generalized addition
gdivide Generalized division
generateFunction Generate a MATLAB function to run the autoencoder
generateSimulink Generate a Simulink model for the autoencoder
genFunction Generate MATLAB function for simulating neural network
gensim Generate Simulink block for neural network simulation
getelements Get neural network data elements
getsamples Get neural network data samples
getsignals Get neural network data signals
getsiminit Get Simulink neural network block initial input and layer delays states
gettimesteps Get neural network data timesteps
getwb Get network weight and bias values as single vector
gmultiply Generalized multiplication
gnegate Generalized negation
gridtop Grid layer topology function
gsqrt Generalized square root
gsubtract Generalized subtraction
hardlim Hard-limit transfer function
hardlims Symmetric hard-limit transfer function
hextop Hexagonal layer topology function
ImageInputLayer Image input layer
imageInputLayer Image input layer
importCaffeLayers Import convolutional neural network layers from Caffe
importCaffeNetwork Import pretrained convolutional neural network models from Caffe
ind2vec Convert indices to vectors
init Initialize neural network
initcon Conscience bias initialization function
initlay Layer-by-layer network initialization function
initlvq LVQ weight initialization function
initnw Nguyen-Widrow layer initialization function
initwb By-weight-and-bias layer initialization function
initzero Zero weight and bias initialization function
isconfigured Indicate if network inputs and outputs are configured
Layer Network layer
layrecnet Layer recurrent neural network
learncon Conscience bias learning function
learngd Gradient descent weight and bias learning function
learngdm Gradient descent with momentum weight and bias learning function
learnh Hebb weight learning rule
learnhd Hebb with decay weight learning rule
learnis Instar weight learning function
learnk Kohonen weight learning function
learnlv1 LVQ1 weight learning function
learnlv2 LVQ2.1 weight learning function
learnos Outstar weight learning function
learnp Perceptron weight and bias learning function
learnpn Normalized perceptron weight and bias learning function
learnsom Self-organizing map weight learning function
learnsomb Batch self-organizing map weight learning function
learnwh Widrow-Hoff weight/bias learning function
linearlayer Linear layer
linkdist Link distance function
logsig Log-sigmoid transfer function
lvqnet Learning vector quantization neural network
lvqoutputs LVQ outputs processing function
mae Mean absolute error performance function
mandist Manhattan distance weight function
mapminmax Process matrices by mapping row minimum and maximum values to [-1 1]
mapstd Process matrices by mapping each row's means to 0 and deviations to 1
maxlinlr Maximum learning rate for linear layer
MaxPooling2DLayer Max pooling layer
maxPooling2dLayer Max pooling layer
meanabs Mean of absolute elements of matrix or matrices
meansqr Mean of squared elements of matrix or matrices
midpoint Midpoint weight initialization function
minmax Ranges of matrix rows
mse Mean squared normalized error performance function
narnet Nonlinear autoregressive neural network
narxnet Nonlinear autoregressive neural network with external input
negdist Negative distance weight function
netinv Inverse transfer function
netprod Product net input function
netsum Sum net input function
network Convert Autoencoder object into network object
network Create custom neural network
newgrnn Design generalized regression neural network
newlind Design linear layer
newpnn Design probabilistic neural network
newrb Design radial basis network
newrbe Design exact radial basis network
nncell2mat Combine neural network cell data into matrix
nncorr Cross-correlation between neural network time series
nndata Create neural network data
nndata2sim Convert neural network data to Simulink time series
nnsize Number of neural data elements, samples, timesteps, and signals
nnstart Neural network getting started GUI
nntraintool Neural network training tool
normc Normalize columns of matrix
normprod Normalized dot product weight function
normr Normalize rows of matrix
numelements Number of elements in neural network data
numfinite Number of finite values in neural network data
numnan Number of NaN values in neural network data
numsamples Number of samples in neural network data
numsignals Number of signals in neural network data
numtimesteps Number of time steps in neural network data
openloop Convert neural network closed-loop feedback to open loop
patternnet Pattern recognition network
perceptron Perceptron
perform Calculate network performance
plotconfusion Plot classification confusion matrix
plotep Plot weight-bias position on error surface
ploterrcorr Plot autocorrelation of error time series
ploterrhist Plot error histogram
plotes Plot error surface of single-input neuron
plotfit Plot function fit
plotinerrcorr Plot input to error time-series cross-correlation
plotpc Plot classification line on perceptron vector plot
plotperform Plot network performance
plotpv Plot perceptron input/target vectors
plotregression Plot linear regression
plotresponse Plot dynamic network time series response
plotroc Plot receiver operating characteristic
plotsomhits Plot self-organizing map sample hits
plotsomnc Plot self-organizing map neighbor connections
plotsomnd Plot self-organizing map neighbor distances
plotsomplanes Plot self-organizing map weight planes
plotsompos Plot self-organizing map weight positions
plotsomtop Plot self-organizing map topology
plottrainstate Plot training state values
plotv Plot vectors as lines from origin
plotvec Plot vectors with different colors
plotwb Plot Hinton diagram of weight and bias values
plotWeights Plot a visualization of the weights for the encoder of an autoencoder
pnormc Pseudonormalize columns of matrix
poslin Positive linear transfer function
predict Predict responses using a trained convolutional neural network
predict Reconstruct the inputs using trained autoencoder
preparets Prepare input and target time series data for network simulation or training
processpca Process columns of matrix with principal component analysis
prune Delete neural inputs, layers, and outputs with sizes of zero
prunedata Prune data for consistency with pruned network
purelin Linear transfer function
quant Discretize values as multiples of quantity
radbas Radial basis transfer function
radbasn Normalized radial basis transfer function
randnc Normalized column weight initialization function
randnr Normalized row weight initialization function
rands Symmetric random weight/bias initialization function
randsmall Small random weight/bias initialization function
randtop Random layer topology function
regression Linear regression
regressionLayer Create a regression output layer
RegressionOutputLayer Regression output layer
ReLULayer Rectified Linear Unit (ReLU) layer
reluLayer Rectified Linear Unit (ReLU) layer
removeconstantrows Process matrices by removing rows with constant values
removedelay Remove delay from neural network response
removerows Process matrices by removing rows with specified indices
roc Receiver operating characteristic
sae Sum absolute error performance function
satlin Saturating linear transfer function
satlins Symmetric saturating linear transfer function
scalprod Scalar product weight function
selforgmap Self-organizing map
separatewb Separate biases and weight values from weight/bias vector
seq2con Convert sequential vectors to concurrent vectors
SeriesNetwork Series network class
setelements Set neural network data elements
setsamples Set neural network data samples
setsignals Set neural network data signals
setsiminit Set neural network Simulink block initial conditions
settimesteps Set neural network data timesteps
setwb Set all network weight and bias values with single vector
sim Simulate neural network
sim2nndata Convert Simulink time series to neural network data
softmax Soft max transfer function
SoftmaxLayer Softmax layer for convolutional neural networks
softmaxLayer Softmax layer for convolutional neural networks
srchbac 1-D minimization using backtracking
srchbre 1-D interval location using Brent's method
srchcha 1-D minimization using Charalambous' method
srchgol 1-D minimization using golden section search
srchhyb 1-D minimization using a hybrid bisection-cubic search
sse Sum squared error performance function
stack Stack encoders from several autoencoders together
sumabs Sum of absolute elements of matrix or matrices
sumsqr Sum of squared elements of matrix or matrices
tansig Hyperbolic tangent sigmoid transfer function
tapdelay Shift neural network time series data for tap delay
timedelaynet Time delay neural network
tonndata Convert data to standard neural network cell array form
train Train neural network
trainAutoencoder Train an autoencoder
trainb Batch training with weight and bias learning rules
trainbfg BFGS quasi-Newton backpropagation
trainbr Bayesian regularization backpropagation
trainbu Batch unsupervised weight/bias training
trainc Cyclical order weight/bias training
traincgb Conjugate gradient backpropagation with Powell-Beale restarts
traincgf Conjugate gradient backpropagation with Fletcher-Reeves updates
traincgp Conjugate gradient backpropagation with Polak-Ribière updates
traingd Gradient descent backpropagation
traingda Gradient descent with adaptive learning rate backpropagation
traingdm Gradient descent with momentum backpropagation
traingdx Gradient descent with momentum and adaptive learning rate backpropagation
trainingOptions Options for training a neural network
TrainingOptionsSGDM Training options for stochastic gradient descent with momentum
trainlm Levenberg-Marquardt backpropagation
trainNetwork Train a convolutional neural network
trainoss One-step secant backpropagation
trainr Random order incremental training with learning functions
trainrp Resilient backpropagation
trainru Unsupervised random order weight/bias training
trains Sequential order incremental training with learning functions
trainscg Scaled conjugate gradient backpropagation
trainSoftmaxLayer Train a softmax layer for classification
tribas Triangular basis transfer function
tritop Triangle layer topology function
unconfigure Unconfigure network inputs and outputs
vec2ind Convert vectors to indices
vgg16 Pretrained VGG-16 convolutional neural network
vgg19 Pretrained VGG-19 convolutional neural network
view View autoencoder
view View neural network