
Creating an Elman Network (newelm)

An Elman network with two or more layers can be created with the function newelm. The hidden layers commonly have tansig transfer functions, so that is the default for newelm. As shown in the architecture diagram, purelin is commonly the output-layer transfer function.

The default backpropagation training function is trainbfg. You might use trainlm instead, but it tends to proceed so rapidly that it does not necessarily do well on Elman networks. The default backpropagation weight/bias learning function is learngdm, and the default performance function is mse.
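If you do want a different training function, you can set it on the network object after creation. A minimal sketch, assuming net is an Elman network created with newelm (as in the example below):

    net.trainFcn = 'trainlm';   % replace the default trainbfg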

When the network is created, each layer's weights and biases are initialized with the Nguyen-Widrow layer initialization method implemented in the function initnw.
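If you later want to return a trained network to this initial state, init reapplies each layer's initialization function. A sketch:

    net = init(net);   % reinitializes weights and biases via initnw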

Now consider an example. Suppose that we have a sequence of single-element input vectors in the range from 0 to 1. Suppose further that we want five tansig neurons in the hidden layer and an output layer containing a single logsig neuron. The following code creates the desired network.
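A minimal sketch of the call, using the newelm arguments of this toolbox version (input range, layer sizes, and transfer functions):

    net = newelm([0 1],[5 1],{'tansig','logsig'});
    % One input ranging over [0,1]; five tansig hidden neurons;
    % one logsig output neuron.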

Simulation

Suppose that we want to find the response of this network to an input sequence of eight values, each either 0 or 1.
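One way to generate such a sequence, as a sketch (P is an ordinary row vector at this point):

    P = round(rand(1,8));   % eight random values, each 0 or 1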

Recall that a sequence to be presented to a network must be in cell array form. We can convert P to this form with con2seq.
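A sketch, where the name Pseq is illustrative:

    Pseq = con2seq(P);   % concurrent vector to cell-array sequence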

Now we can find the output of the network with the function sim.
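A sketch, continuing with the names above (Y is illustrative):

    Y = sim(net,Pseq);   % simulate the network on the input sequence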

We convert this back to concurrent form with seq2con.
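A sketch (z is an illustrative name):

    z = seq2con(Y);   % cell-array sequence back to a concurrent matrix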

Finally, we can display the output in concurrent form.
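For example:

    z{1,1}   % displays the 1-by-8 matrix of network outputs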

Thus, once the network is created and the input specified, one need only call sim.


