Neural Network Toolbox
newp
Create a new perceptron
Syntax
net = newp
net = newp(PR,S,TF,LF)
Description
Perceptrons are used to solve simple (i.e., linearly separable) classification problems.
net = newp creates a new network with a dialog box.
net = newp(PR,S,TF,LF) takes these inputs:
PR -- R x 2 matrix of min and max values for R input elements
S -- Number of neurons
and returns a new perceptron.
The transfer function TF can be hardlim or hardlims. The learning function LF can be learnp or learnpn.
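The four-argument form selects non-default functions by name. For example, to get the symmetric hard-limit transfer function and the normalized perceptron rule (the input ranges and neuron count here are illustrative):

```matlab
% Two-element input with ranges [0 1] and [-2 2], one neuron,
% hardlims transfer function, learnpn learning function.
net = newp([0 1; -2 2],1,'hardlims','learnpn');
```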
Properties
Perceptrons consist of a single layer with the dotprod weight function, the netsum net input function, and the specified transfer function. The layer has a weight from the input and a bias. Weights and biases are initialized with initzero. Adaptation and training are done with trains and trainc, which both update weight and bias values with the specified learning function. Performance is measured with mae.
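These defaults can be inspected on a created network; the field names below follow the toolbox's network-object conventions (a sketch, not a verified transcript):

```matlab
net = newp([0 1; -2 2],1);
net.layers{1}.transferFcn        % 'hardlim' by default
net.inputWeights{1,1}.learnFcn   % 'learnp' by default
net.performFcn                   % 'mae'
```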
Examples
This code creates a perceptron layer with one two-element input (ranges [0 1] and [-2 2]) and one neuron. (Supplying only two arguments to newp results in the default perceptron learning function learnp being used.)
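The call described above would look like this (a sketch; the ranges and neuron count follow the prose):

```matlab
% Perceptron with one two-element input (element ranges [0 1] and [-2 2])
% and one neuron; TF and LF default to hardlim and learnp.
net = newp([0 1; -2 2],1);
```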
Here we simulate the network's response to a sequence of inputs P.
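A plausible input sequence for the AND-gate example that follows (the specific cell-array values are an assumption, reconstructed from the AND truth table, not a verified listing):

```matlab
% Four two-element input vectors, presented as a sequence (cell array).
P = {[0; 0] [0; 1] [1; 0] [1; 1]};
Y = sim(net,P)
```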
Here we define a sequence of targets T (together P and T define the operation of an AND gate), and then let the network adapt for 10 passes through the sequence. We then simulate the updated network.
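One way to express this step (the target values encode AND; setting adaptParam.passes is an assumption about how the 10 passes were specified):

```matlab
% Targets for AND: output is 1 only when both inputs are 1.
T = {0 0 0 1};
net.adaptParam.passes = 10;   % adapt for 10 passes through the sequence
net = adapt(net,P,T);
Y = sim(net,P)
```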
Now we define a new problem, an OR gate, with batch inputs P2 and targets T2.
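In batch (matrix) form the OR problem can be written as follows (the values follow from the OR truth table; the variable names P2 and T2 match the training code below):

```matlab
% Columns of P2 are the four input vectors; T2 holds the OR of each column.
P2 = [0 0 1 1; 0 1 0 1];
T2 = [0 1 1 1];
```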
Here we initialize the perceptron (resulting in new random weight and bias values), simulate its output, train for a maximum of 20 epochs, and then simulate it again.
net = init(net);
Y = sim(net,P2)
net.trainParam.epochs = 20;
net = train(net,P2,T2);
Y = sim(net,P2)
Notes
Perceptrons can classify linearly separable classes in a finite amount of time. If input vectors vary widely in length, learnpn can be faster than learnp.
See Also
sim, init, adapt, train, hardlim, hardlims, learnp, learnpn, trains, trainc
© 1994-2005 The MathWorks, Inc.