Neural Network Toolbox
trains
Sequential order incremental training with learning functions
Syntax
[net,TR,Ac,El] = trains(net,Pd,Tl,Ai,Q,TS,VV,TV)
Description
trains is not called directly. Instead it is called by train for networks whose net.trainFcn property is set to 'trains'.
trains trains a network with weight and bias learning rules with sequential updates. The sequence of inputs is presented to the network with updates occurring after each time step. This incremental training algorithm is commonly used for adaptive applications.

trains(net,Pd,Tl,Ai,Q,TS,VV,TV) takes these inputs,

net -- Neural network
Pd -- Delayed inputs
Tl -- Layer targets
Ai -- Initial input conditions
Q -- Batch size
TS -- Time steps
VV -- Ignored
TV -- Ignored

and after training the network with its weight and bias learning functions returns

net -- Updated network
TR -- Training record (epoch and perf)
Ac -- Collective layer outputs
El -- Layer errors
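In practice trains is reached indirectly, through adapt. The following minimal sketch (assuming the 2005-era toolbox API, with newlin and adapt as documented on their own reference pages; the data values are illustrative) shows sequential adaptation of a linear layer, whose adapt function is trains:

```matlab
% Minimal sketch: a linear layer created with newlin adapts with trains,
% so weights and biases update after every time step of the sequence.
P = {0 -1 1 1 0 -1 1 0};          % input sequence (cell array = sequential mode)
T = {0 -1 0 2 1 -1 0 1};          % target sequence (illustrative values)
net = newlin([-1 1],1,[0 1],0.5); % 1 input, 1 neuron, delays [0 1], lr 0.5
[net,y,e] = adapt(net,P,T);       % incremental (sequential) updates via trains
```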
Training occurs according to trains's training parameter, shown here with its default value:

net.trainParam.passes -- 1 -- Number of times to train through sequence
Dimensions for these variables are

Pd -- No x Ni x TS cell array, each element Pd{i,j,ts} is a Zij x Q matrix
Tl -- Nl x TS cell array, each element Tl{i,ts} is a Vi x Q matrix or []
Ai -- Nl x LD cell array, each element Ai{i,k} is an Si x Q matrix
Ac -- Nl x (LD+TS) cell array, each element Ac{i,k} is an Si x Q matrix
El -- Nl x TS cell array, each element El{i,ts} is an Si x Q matrix or []
trains(code) returns useful information for each code string:

'pnames' -- Names of training parameters
'pdefaults' -- Default training parameters
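For example, the parameter names and defaults can be queried directly (a sketch assuming the code strings above):

```matlab
% Query sketch: training functions respond to information code strings.
pn = trains('pnames');     % cell array of training parameter names
pd = trains('pdefaults');  % structure of default training parameters
```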
Network Use
You can create a standard network that uses trains
for adapting by calling newp
or newlin
.
To prepare a custom network to adapt with trains:

1. Set net.adaptFcn to 'trains'. This sets net.adaptParam to trains's default parameters.
2. Set each net.inputWeights{i,j}.learnFcn to a learning function.
3. Set each net.layerWeights{i,j}.learnFcn to a learning function.
4. Set each net.biases{i}.learnFcn to a learning function. (Weight and bias learning parameters are automatically set to default values for the given learning function.)
See newp and newlin for adaptation examples.
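The preparation steps above can be sketched as follows; the single-layer linear architecture and the choice of learnwh (the Widrow-Hoff rule) are illustrative assumptions, not requirements:

```matlab
% Sketch: preparing a custom network to adapt with trains.
% The one-layer architecture and learnwh are assumed choices.
net = network(1,1,[1],[1],[0],[1]);          % 1 input, 1 layer, bias, input weight
net.inputs{1}.range = [-1 1];                % input range (illustrative)
net.layers{1}.size = 1;                      % one neuron
net.adaptFcn = 'trains';                     % adapt with sequential updates
net.inputWeights{1,1}.learnFcn = 'learnwh';  % Widrow-Hoff learning rule
net.biases{1}.learnFcn = 'learnwh';          % same rule for the bias
P = {0 -1 1 1};  T = {0 -1 0 2};             % example sequences
[net,y,e] = adapt(net,P,T);                  % weights update after each step
```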
Algorithm
Each weight and bias is updated according to its learning function after each time step in the input sequence.
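The per-time-step update can be illustrated in plain MATLAB. This conceptual sketch applies the Widrow-Hoff rule to a single linear layer; it is not the toolbox source, and the data and learning rate are assumed:

```matlab
% Conceptual sketch of sequential (incremental) training: the error at
% each time step updates the weights before the next input is presented.
P  = [1 -1 2 0; 0 1 1 -1];   % R x TS input sequence (illustrative data)
T  = [1 0 2 1];              % 1 x TS target sequence
lr = 0.1;                    % learning rate (assumed)
W  = zeros(1,size(P,1)); b = 0;
for ts = 1:size(P,2)
    p = P(:,ts);
    a = W*p + b;             % linear layer output
    e = T(ts) - a;           % error at this time step
    W = W + lr*e*p';         % Widrow-Hoff weight update
    b = b + lr*e;            % bias update
end
```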
See Also
newp, newlin, train, trainb, trainc, trainr
© 1994-2005 The MathWorks, Inc.