nnt2elm
Update NNT 2.0 Elman backpropagation network
Syntax
net = nnt2elm(PR,W1,B1,W2,B2,BTF,BLF,PF)
Description
nnt2elm(PR,W1,B1,W2,B2,BTF,BLF,PF) takes these arguments,

PR  -- R x 2 matrix of min and max values for R input elements
W1  -- S1 x (R+S1) weight matrix
B1  -- S1 x 1 bias vector
W2  -- S2 x S1 weight matrix
B2  -- S2 x 1 bias vector
BTF -- Backpropagation network training function, default = 'traingdx'
BLF -- Backpropagation weight/bias learning function, default = 'learngdm'
PF  -- Performance function, default = 'mse'

and returns an Elman network.
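For example, the following sketch (with hypothetical dimensions R = 1, S1 = 5, S2 = 1, and random matrices standing in for the old NNT 2.0 weight and bias values) converts the old values into a new network object:

PR = [0 1];              % R x 2 min/max values of the single input
W1 = rand(5,1+5);        % S1 x (R+S1) hidden weights (input plus recurrent)
B1 = rand(5,1);          % S1 x 1 hidden bias vector
W2 = rand(1,5);          % S2 x S1 output weights
B2 = rand(1,1);          % S2 x 1 output bias
net = nnt2elm(PR,W1,B1,W2,B2);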
The training function BTF can be any of the backpropagation training functions such as traingd, traingdm, traingda, and traingdx. Large step-size algorithms, such as trainlm, are not recommended for Elman networks.
The learning function BLF can be either of the backpropagation learning functions, learngd or learngdm.

The performance function can be any of the differentiable performance functions such as mse or msereg.
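For example, the update call can specify different training, learning, and performance functions (a sketch, assuming the values defined above):

net = nnt2elm(PR,W1,B1,W2,B2,'traingda','learngd','msereg');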
Once a network has been updated, it can be simulated, initialized, adapted, or trained with sim, init, adapt, and train.
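For example (a sketch, assuming the network net created above and a hypothetical 20-step input sequence with R = 1):

P = num2cell(rand(1,20));   % hypothetical input sequence
T = num2cell(rand(1,20));   % hypothetical target sequence
Y = sim(net,P);             % simulate the updated network
net = train(net,P,T);       % train it with the network's training function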
See Also
nnt2c | nnt2ff