Neural Network Toolbox
learnhd

Hebb with decay weight learning rule

Syntax

[dW,LS] = learnhd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)

info = learnhd(code)

Description

learnhd is the Hebb with decay weight learning function.

learnhd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several inputs,

  W  - S x R weight matrix (or S x 1 bias vector)
  P  - R x Q input vectors (or ones(1,Q))
  Z  - S x Q weighted input vectors
  N  - S x Q net input vectors
  A  - S x Q output vectors
  T  - S x Q layer target vectors
  E  - S x Q layer error vectors
  gW - S x R gradient with respect to performance
  gA - S x Q output gradient with respect to performance
  D  - S x S neuron distances
  LP - learning parameters
  LS - learning state, initially should be = []

and returns,

  dW - S x R weight (or bias) change matrix
  LS - new learning state

Learning occurs according to learnhd's learning parameters, shown here with their default values.

  LP.dr - 0.01 - Decay rate
  LP.lr - 0.01 - Learning rate

learnhd(code) returns useful information for each code string:

  'pnames'    - Names of the learning parameters
  'pdefaults' - Default learning parameters
  'needg'     - Returns 1 if this function uses gW or gA
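For example, the default learning parameters can be queried like this (a minimal sketch):

    info = learnhd('pdefaults')   % structure holding the default dr and lr values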

Examples

Here we define a random input P, output A, and weights W for a layer with a two-element input and three neurons. We also define the decay and learning rates.
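A minimal sketch of such a setup (the random sizes follow the description above; the rate settings are illustrative, not necessarily the original example's values):

    p = rand(2,1);     % random two-element input vector
    a = rand(3,1);     % random output of the three-neuron layer
    w = rand(3,2);     % random 3x2 weight matrix
    lp.dr = 0.05;      % decay rate (illustrative value)
    lp.lr = 0.5;       % learning rate (illustrative value)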

Since learnhd only needs these values to calculate a weight change (see algorithm below), we will use them to do so.
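A sketch of the call, passing empty matrices for the arguments learnhd does not use:

    dW = learnhd(w,p,[],[],a,[],[],[],[],[],lp,[])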

Network Use

To prepare the weights and the bias of layer i of a custom network to learn with learnhd (a property-setting sketch follows this list):

  1. Set net.trainFcn to 'trainr'. (net.trainParam will automatically become trainr's default parameters.)
  2. Set net.adaptFcn to 'trains'. (net.adaptParam will automatically become trains's default parameters.)
  3. Set each net.inputWeights{i,j}.learnFcn to 'learnhd'. Set each net.layerWeights{i,j}.learnFcn to 'learnhd'. (Each weight learning parameter property will automatically be set to learnhd's default parameters.)
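A minimal sketch of these settings for a hypothetical network net, with i and j standing in for the relevant layer and weight indices:

    net.trainFcn = 'trainr';                      % trainr's defaults fill net.trainParam
    net.adaptFcn = 'trains';                      % trains's defaults fill net.adaptParam
    net.inputWeights{i,j}.learnFcn = 'learnhd';   % input weights learn with learnhd
    net.layerWeights{i,j}.learnFcn = 'learnhd';   % layer weights learn with learnhd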

To train the network (or enable it to adapt):

  1. Set net.trainParam (net.adaptParam) properties to desired values.
  2. Call train (adapt), as sketched below.
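A minimal sketch of both steps (the epochs setting and the P/T data are placeholders):

    net.trainParam.epochs = 100;   % or set net.adaptParam fields when adapting
    net = train(net,P,T);          % or: net = adapt(net,P,T);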

Algorithm

learnhd calculates the weight change dW for a given neuron from the neuron's input P, output A, decay rate DR, and learning rate LR according to the Hebb with decay learning rule (shown below in MATLAB notation for a single neuron):
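    dw = lr*a*p' - dr*w

Here a and p are the neuron's output and input, and w is its current weight vector; the first term is the standard Hebbian increase and the second term decays the existing weights.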

See Also

learnh, adapt, train
