
Perceptrons are simple single-layer binary classifiers that divide the input space with a linear decision boundary.

Perceptrons can learn to solve a narrow range of classification problems: those whose classes are linearly separable. They were among the first neural networks to reliably solve a well-defined class of problems, and their advantage is a simple learning rule.

perceptron(hardlimitTF,perceptronLF) takes these arguments:

hardlimitTF: Hard limit transfer function (default = 'hardlim')

perceptronLF: Perceptron learning rule (default = 'learnp')

and returns a perceptron.

In addition to the default hard limit transfer function, perceptrons can be created with the hardlims transfer function. The other option for the perceptron learning rule is learnpn.
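The two transfer-function options differ only in their negative-side output: hardlim maps to {0, 1}, while hardlims maps to {-1, 1}. A minimal NumPy sketch of these definitions (an illustration, not toolbox code):

```python
import numpy as np

def hardlim(n):
    """Default transfer function: 1 where n >= 0, else 0."""
    return np.where(np.asarray(n) >= 0, 1, 0)

def hardlims(n):
    """Symmetric variant: 1 where n >= 0, else -1."""
    return np.where(np.asarray(n) >= 0, 1, -1)

hardlim([-0.5, 0, 2])    # -> [0 1 1]
hardlims([-0.5, 0, 2])   # -> [-1 1 1]
```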


Neural Network Toolbox™ supports perceptrons for historical interest. For better results, you should instead use patternnet, which can solve nonlinearly separable problems. Sometimes the term “perceptrons” refers to feed-forward pattern recognition networks; but the original perceptron, described here, can solve only simple problems.


Solve Simple Classification Problem Using Perceptron

Use a perceptron to solve a simple classification problem: logical OR.

x = [0 0 1 1; 0 1 0 1];   % input vectors, one per column
t = [0 1 1 1];            % targets: logical OR of each input's elements
net = perceptron;         % create an untrained perceptron
net = train(net,x,t);     % train with the default learnp rule
y = net(x);               % classify the training inputs
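For readers outside MATLAB, the same OR problem can be reproduced with a short NumPy sketch of the classic perceptron loop (hardlim output plus the error-driven updates dw = e*p', db = e). This stands in for what train does here; it is an illustrative assumption, not the toolbox implementation:

```python
import numpy as np

def train_perceptron(X, t, epochs=20):
    """Hypothetical stand-in for train(net,x,t): X is (features, samples),
    t is (samples,). Applies the classic perceptron updates dw = e*p, db = e."""
    w = np.zeros(X.shape[0])
    b = 0.0
    for _ in range(epochs):
        for p, target in zip(X.T, t):
            a = 1.0 if w @ p + b >= 0 else 0.0  # hardlim transfer function
            e = target - a                       # error drives the update
            w += e * p
            b += e
    return w, b

X = np.array([[0, 0, 1, 1], [0, 1, 0, 1]], dtype=float)  # same x as above
t = np.array([0, 1, 1, 1], dtype=float)                  # same t as above
w, b = train_perceptron(X, t)
y = (w @ X + b >= 0).astype(int)  # -> [0 1 1 1] after convergence
```

Because OR is linearly separable, the loop converges within a few epochs to a weight vector and bias that classify all four inputs correctly.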

Introduced in R2010b
