Neural Network Toolbox

srchgol
One-dimensional minimization using golden section search
Syntax
[a,gX,perf,retcode,delta,tol] = srchgol(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf)
Description
srchgol is a line search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique called the golden section search.
srchgol(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf)
takes these inputs,
net -- Neural network
X -- Vector containing current values of weights and biases
Pd -- Delayed input vectors
Tl -- Layer target vectors
Ai -- Initial input delay conditions
Q -- Batch size
TS -- Time steps
dX -- Search direction vector
gX -- Gradient vector
perf -- Performance value at current X
dperf -- Slope of performance value at current X in direction of dX
delta -- Initial step size
tol -- Tolerance on search
ch_perf -- Change in performance on previous step
and returns,
a -- Step size, which minimizes performance
gX -- Gradient at new minimum point
perf -- Performance value at new minimum point
retcode -- Return code, which has three elements. The first two elements correspond to the number of function evaluations in the two stages of the search. The third element is a return code. These have different meanings for different search algorithms, and some may not be used in this function.
delta -- New initial step size, based on the current step size
tol -- New tolerance on search
Parameters used for the golden section algorithm are:
alpha -- Scale factor that determines sufficient reduction in perf
scale_tol -- Parameter that relates the tolerance tol to the initial step size delta; usually set to 20
The defaults for these parameters are set in the training function that calls srchgol. See traincgf, traincgb, traincgp, trainbfg, and trainoss.
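The role of scale_tol can be illustrated with a short sketch (Python here, purely for illustration). One common reading, consistent with the related training-function documentation, is that the line-search tolerance is the initial step size divided by scale_tol; the numeric values below are assumptions, not defaults documented on this page.

```python
# Illustrative sketch: how scale_tol relates the line-search tolerance
# tol to the initial step size delta. Values are assumed for illustration.
delta = 0.01       # assumed initial step size
scale_tol = 20     # "usually set to 20", per the parameter description above
tol = delta / scale_tol
print(tol)
```

A larger scale_tol therefore tightens the tolerance on the line search relative to the same initial step size.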
Dimensions for these variables are:
Pd -- No x Ni x TS cell array, where each element Pd{i,j,ts} is a Dij x Q matrix
Tl -- Nl x TS cell array, where each element Tl{i,ts} is a Vi x Q matrix
Ai -- Nl x LD cell array, where each element Ai{i,k} is an Si x Q matrix
Examples
Here is a problem consisting of inputs p and targets t that we would like to solve with a network.

p = [0 1 2 3 4 5];
t = [0 0 0 1 1 1];

Here a two-layer feed-forward network is created. The network's input ranges from [0 to 5]. The first layer has two tansig neurons, and the second layer has one logsig neuron. The traincgf network training function and the srchgol search function are to be used.
net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
a = sim(net,p)
net.trainParam.searchFcn = 'srchgol';
net.trainParam.epochs = 50;
net.trainParam.show = 10;
net.trainParam.goal = 0.1;
net = train(net,p,t);
a = sim(net,p)
Network Use
You can create a standard network that uses srchgol with newff, newcf, or newelm.
To prepare a custom network to be trained with traincgf, using the line search function srchgol:
1. Set net.trainFcn to 'traincgf'. This will set net.trainParam to traincgf's default parameters.
2. Set net.trainParam.searchFcn to 'srchgol'.
The srchgol function can be used with any of the following training functions: traincgf, traincgb, traincgp, trainbfg, trainoss.
Algorithm
srchgol locates the minimum of the performance function in the search direction dX, using the golden section search. It is based on the algorithm as described on page 33 of Scales (see reference below).
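The golden section technique itself can be sketched in a few lines. The following is a generic, illustrative scalar version in Python, not MATLAB's srchgol implementation: srchgol additionally brackets the minimum along the search direction dX and evaluates network performance rather than an arbitrary function.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Illustrative golden section search for the minimum of a unimodal
    scalar function f on the bracketing interval [a, b]."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    c = b - inv_phi * (b - a)         # interior points placed so that one
    d = a + inv_phi * (b - a)         # can be reused after each shrink
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            # Minimum lies in [a, d]; reuse c as the new right interior point
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; reuse d as the new left interior point
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

xmin = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by the constant factor 1/phi while requiring only one new function evaluation, which is what makes the method attractive inside a training loop where each evaluation is a full network performance computation.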
See Also
srchbac, srchbre, srchcha, srchhyb
References
Scales, L. E., Introduction to Non-Linear Optimization, New York: Springer-Verlag, 1985.
© 1994-2005 The MathWorks, Inc.