learngd

Gradient descent weight and bias learning function

Syntax

[dW,LS] = learngd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)

[db,LS] = learngd(b,ones(1,Q),Z,N,A,T,E,gW,gA,D,LP,LS)

info = learngd(code)

Description

learngd is the gradient descent weight and bias learning function.

learngd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several inputs,

  W  - S x R weight matrix (or S x 1 bias vector)
  P  - R x Q input vectors (or ones(1,Q))
  Z  - S x Q weighted input vectors
  N  - S x Q net input vectors
  A  - S x Q output vectors
  T  - S x Q layer target vectors
  E  - S x Q layer error vectors
  gW - S x R gradient with respect to performance
  gA - S x Q output gradient with respect to performance
  D  - S x S neuron distances
  LP - Learning parameters
  LS - Learning state, initially should be = []

and returns,

  dW - S x R weight (or bias) change matrix
  LS - New learning state

Learning occurs according to learngd's learning parameter, shown here with its default value:

  LP.lr - 0.01 - Learning rate

learngd(code) returns useful information for each code string:

  'pnames'    - Names of learning parameters
  'pdefaults' - Default learning parameters
  'needg'     - Returns 1 if this function uses gW or gA
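For example, the default learning parameters can be queried at the command line (the exact display format may vary by toolbox version):

  lp = learngd('pdefaults')   % structure of defaults; lp.lr is 0.01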

Examples

Here we define a random gradient gW for a weight going to a layer with 3 neurons, from an input with 2 elements. We also define a learning rate of 0.5.
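  gW = rand(3,2);   % random gradient for a 3-neuron layer fed by a 2-element input
  lp.lr = 0.5;      % learning rate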

Since learngd only needs these values to calculate a weight change (see algorithm below), we will use them to do so.
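  % unused arguments can be passed as empty matrices
  dW = learngd([],[],[],[],[],[],[],gW,[],[],lp,[])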

Network Use

You can create a standard network that uses learngd with newff, newcf, or newelm. To prepare the weights and the bias of layer i of a custom network to adapt with learngd:

  1. Set net.adaptFcn to 'trains'. net.adaptParam will automatically become trains's default parameters.
  2. Set each net.inputWeights{i,j}.learnFcn to 'learngd'. Set each net.layerWeights{i,j}.learnFcn to 'learngd'. Set net.biases{i}.learnFcn to 'learngd'. Each weight and bias learning parameter property will automatically be set to learngd's default parameters.

To allow the network to adapt:

  1. Set net.adaptParam properties to desired values.
  2. Call adapt with the network.

See newff or newcf for examples.
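The following sketch puts these steps together. The network dimensions, training data, and learning rate here are made up for illustration; only the learnFcn and adaptFcn settings and the adapt call follow the steps above.

  % Hypothetical sequential data: four 2-element inputs with scalar targets
  P = {[0;1] [1;2] [2;3] [3;2]};
  T = {0 1 2 1};

  net = newff([0 3; 1 3],[3 1]);               % two-layer feed-forward network

  net.adaptFcn = 'trains';                     % step 1
  net.inputWeights{1,1}.learnFcn = 'learngd';  % step 2
  net.layerWeights{2,1}.learnFcn = 'learngd';
  net.biases{1}.learnFcn = 'learngd';
  net.biases{2}.learnFcn = 'learngd';

  net.inputWeights{1,1}.learnParam.lr = 0.2;   % optional: raise lr from the 0.01 default

  [net,Y,E] = adapt(net,P,T);                  % one pass of adaption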

Algorithm

learngd calculates the weight change dW for a given neuron from the neuron's input P and error E, and the weight (or bias) learning rate LR, according to gradient descent:

  dW = lr*gW
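Assuming the gW and lp defined in the example above, the returned weight change is simply the scaled gradient:

  dW = learngd([],[],[],[],[],[],[],gW,[],[],lp,[]);
  isequal(dW, lp.lr*gW)   % ans = 1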

See Also

learngdm, newff, newcf, adapt, train

