Neural Network Toolbox

Prediction Example

Suppose that we want to use an adaptive filter to predict the next value of a stationary random process, p(t). We use the network shown below to do this.

The signal to be predicted, p(t), enters from the left into a tapped delay line. The previous two values of p(t) are available as outputs from the tapped delay line. The network uses adapt to change the weights on each time step so as to minimize the error e(t) on the far right. If this error is zero, then the network output a(t) is exactly equal to p(t), and the network has done its prediction properly.
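The behavior just described can be sketched in code. The following is a minimal illustration (not the toolbox implementation) of a two-tap linear predictor trained with the LMS rule, which is what adapt applies to a linear layer. The test signal (a sine wave plus a little noise), the learning rate, and the number of taps are all illustrative assumptions.

```python
# Minimal sketch of the two-tap adaptive predictor described above.
# Signal, learning rate, and tap count are illustrative assumptions,
# not values from the toolbox documentation.
import math
import random

random.seed(0)

lr = 0.1           # learning rate (assumed)
w = [0.0, 0.0]     # weights on the delayed inputs p(t-1), p(t-2)
b = 0.0            # bias

# A stationary-ish test signal: a sine wave plus a little noise.
p = [math.sin(0.3 * t) + 0.05 * random.gauss(0, 1) for t in range(500)]

errors = []
for t in range(2, len(p)):
    x = [p[t - 1], p[t - 2]]            # tapped delay line outputs
    a = w[0] * x[0] + w[1] * x[1] + b   # network output a(t)
    e = p[t] - a                        # prediction error e(t)
    # LMS update: adjust weights and bias to reduce the squared error
    w = [w[i] + 2 * lr * e * x[i] for i in range(2)]
    b = b + 2 * lr * e
    errors.append(e)

early = sum(abs(e) for e in errors[:50]) / 50
late = sum(abs(e) for e in errors[-50:]) / 50
print(f"mean |e| early: {early:.4f}, late: {late:.4f}")
```

Running this, the mean absolute error over the last 50 steps is much smaller than over the first 50, mirroring the adaptation described above: the network learns to predict p(t) from its two previous values.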

A detailed analysis of this network is not appropriate here, but we can state the main points. Given the autocorrelation function of the stationary random process p(t), one can calculate the error surface, the maximum learning rate, and the optimum values of the weights. Commonly, of course, one does not have detailed information about the random process, so these calculations cannot be performed. But this lack does not matter to the network. The network, once initialized and operating, adapts at each time step to minimize the error and in a relatively short time is able to predict the input p(t).
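As a hedged sketch of that calculation: for a two-tap linear predictor of a stationary process, the optimum weights solve the normal (Wiener-Hopf) equations R w = r, where R is the autocorrelation matrix of the delayed inputs and r is their cross-correlation with the target p(t). The example process p(t) = sin(0.3t), whose autocorrelation is c(k) = 0.5 cos(0.3k), is an illustrative assumption, not a value from this page.

```python
# Optimum weights of a two-tap predictor from the autocorrelation
# function, via the normal equations R w = r. The example process
# p(t) = sin(0.3 t) is an illustrative assumption.
import math

omega = 0.3

def c(k):
    # autocorrelation of p(t) = sin(omega * t)
    return 0.5 * math.cos(omega * k)

# Inputs are [p(t-1), p(t-2)]; target is p(t).
R = [[c(0), c(1)],
     [c(1), c(0)]]             # input autocorrelation matrix
r = [c(1), c(2)]               # cross-correlation with the target

# Solve the 2x2 system R w = r by Cramer's rule.
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
w = [(r[0] * R[1][1] - r[1] * R[0][1]) / det,
     (R[0][0] * r[1] - R[1][0] * r[0]) / det]

# The eigenvalues of R bound the stable learning rate
# (alpha < 1/lambda_max in the convention of [HDB96]).
lam_max = c(0) + abs(c(1))
print(f"optimum weights: {w[0]:.4f}, {w[1]:.4f}")
print(f"max stable learning rate: {1 / lam_max:.3f}")
```

For this sinusoidal example the optimum weights come out to 2 cos(0.3) and -1, which simply restates the trigonometric identity sin(0.3t) = 2 cos(0.3) sin(0.3(t-1)) - sin(0.3(t-2)): a two-tap filter can predict a pure sinusoid exactly.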

Chapter 10 of [HDB96] presents this problem, goes through the analysis, and shows the weight trajectory during training. The network finds the optimum weights on its own without any difficulty whatsoever.

You can also try the demonstration program nnd10nc to see an adaptive noise cancellation example in action. This demonstration lets you pick a learning rate and momentum (see Backpropagation), and shows the learning trajectory along with the original and cancellation signals versus time.



© 1994-2005 The MathWorks, Inc.