Neural Network Toolbox
srchbac

One-dimensional minimization using backtracking

Syntax

[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf)

Description

srchbac is a line search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique called backtracking.

srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf) takes these inputs:

  net     - Neural network.
  X       - Vector of current weight and bias values.
  Pd      - Delayed input vectors.
  Tl      - Layer target vectors.
  Ai      - Initial input delay conditions.
  Q       - Batch size.
  TS      - Time steps.
  dX      - Search direction vector.
  gX      - Gradient vector.
  perf    - Performance value at current X.
  dperf   - Slope of the performance value at current X in the direction of dX.
  delta   - Initial step size.
  TOL     - Tolerance on the search.
  ch_perf - Change in performance on the previous step.

and returns:

  a       - Step size that minimizes performance.
  gX      - Gradient at the new minimum point.
  perf    - Performance value at the new minimum point.
  retcode - Return code with three elements: the first two are the numbers of
            function evaluations in the two stages of the search; the third is
            the return code (0 = normal, 1 = minimum step taken,
            2 = maximum step taken, 3 = beta condition not met).
  delta   - New initial step size, based on the current step size.
  tol     - New tolerance on the search.

Parameters used for the backtracking algorithm are:

  alpha     - Scale factor that determines a sufficient reduction in perf.
  beta      - Scale factor that determines a sufficiently large step size.
  low_lim   - Lower limit on the change in step size.
  up_lim    - Upper limit on the change in step size.
  maxstep   - Maximum step length.
  minstep   - Minimum step length.
  scale_tol - Parameter that relates the tolerance tol to the initial step
              size delta; usually set to 20.

The defaults for these parameters are set in the training function that calls it. See traincgf, traincgb, traincgp, trainbfg, trainoss.
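
For example, assigning one of these training functions to an existing network net installs its defaults, which can then be inspected (the exact parameter list depends on the training function and toolbox version):

  net.trainFcn = 'traincgf';   % net.trainParam now holds traincgf's defaults
  net.trainParam               % display them, including the line search settings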

Dimensions for these variables are:

  Pd - No x Ni x TS cell array, each element P{i,j,ts} is a Dij x Q matrix.
  Tl - Nl x TS cell array, each element P{i,ts} is a Vi x Q matrix.
  Ai - Nl x LD cell array, each element Ai{i,k} is an Si x Q matrix.

where

  Ni  = net.numInputs
  Nl  = net.numLayers
  LD  = net.numLayerDelays
  Ri  = net.inputs{i}.size
  Si  = net.layers{i}.size
  Vi  = net.targets{i}.size
  Dij = Ri * length(net.inputWeights{i,j}.delays)

Examples

Here is a problem consisting of inputs p and targets t that we would like to solve with a network.
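
For instance, the problem might be specified as follows (these particular values are illustrative only):

  p = [0 1 2 3 4 5];    % inputs
  t = [0 0 0 1 1 1];    % targets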

Here a two-layer feed-forward network is created. The network's input ranges from 0 to 10. The first layer has two tansig neurons, and the second layer has one logsig neuron. The traincgf network training function and the srchbac search function are to be used.

Create and Test a Network
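
A sketch of the corresponding commands is shown below. It assumes the illustrative p and t above and uses only the network settings described in the preceding paragraph; the training parameter values are examples, not required settings.

  net = newff([0 10],[2 1],{'tansig','logsig'},'traincgf');
  a = sim(net,p)                          % test the untrained network

  net.trainParam.searchFcn = 'srchbac';   % use the backtracking line search
  net.trainParam.epochs = 50;             % example training settings
  net.trainParam.show = 10;
  net.trainParam.goal = 0.1;
  net = train(net,p,t);                   % train with traincgf and srchbac
  a = sim(net,p)                          % retest the trained network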

Network Use

You can create a standard network that uses srchbac with newff, newcf, or newelm.

To prepare a custom network to be trained with traincgf using the line search function srchbac:

  1. Set net.trainFcn to 'traincgf'. This will set net.trainParam to traincgf's default parameters.
  2. Set net.trainParam.searchFcn to 'srchbac', as shown in the snippet below.
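
For example, assuming net is an existing custom network:

  net.trainFcn = 'traincgf';              % step 1: installs traincgf's default parameters
  net.trainParam.searchFcn = 'srchbac';   % step 2: selects the backtracking line search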

The srchbac function can be used with any of the following training functions: traincgf, traincgb, traincgp, trainbfg, trainoss.

Algorithm

srchbac locates the minimum of the performance function in the search direction dX, using the backtracking algorithm described on pages 126 and 328 of Dennis and Schnabel's book, referenced below.
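
In outline, the basic idea can be sketched as follows. This is a simplified illustration of backtracking with a fixed contraction factor, not the toolbox implementation; perf_fcn stands for a hypothetical routine that evaluates network performance at a given weight vector, and alpha, dperf, and minstep are as described above.

  % Simplified backtracking (Armijo) line search along direction dX
  lambda = 1;                                        % start with the full step
  perf0  = perf_fcn(X);                              % performance at current weights
  while perf_fcn(X + lambda*dX) > perf0 + alpha*lambda*dperf
      lambda = lambda/2;                             % insufficient decrease: shrink the step
      if lambda < minstep, break, end                % stop at the minimum step length
  end
  a = lambda;                                        % accepted step size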

See Also

srchcha, srchgol, srchhyb

References

Dennis, J. E., and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Englewood Cliffs, NJ: Prentice-Hall, 1983.



