softmax
Graph and Symbol
Syntax
A = softmax(N)
info = softmax('code')
Description
softmax is a transfer function. Transfer functions calculate a layer's output from its net input.
softmax(N) takes one input argument, N, a matrix of net input (column) vectors, and returns output vectors whose elements lie between 0 and 1 and sum to 1, while preserving the relative ordering (size relations) of the corresponding net inputs.
softmax('code') returns information about this function.
softmax does not have a derivative function.
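For reference, softmax computes normalized exponentials of the net input. The sketch below is not part of the toolbox documentation; it assumes softmax implements the standard normalized exponential, and the variable names are illustrative.
n = [0; 1; -0.5; 0.5];            % example net input (column) vector
a = exp(n) ./ sum(exp(n));        % normalized exponentials: each element in (0,1), summing to 1
max(abs(a - softmax(n)))          % expected to be near zero if softmax matches this calculation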
Examples
Here we define a net input vector N, calculate the output, and plot both with bar graphs.
n = [0; 1; -0.5; 0.5];
a = softmax(n);
subplot(2,1,1), bar(n), ylabel('n')
subplot(2,1,2), bar(a), ylabel('a')
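As a quick check (not part of the original example), the outputs should sum to 1 and follow the ordering of the net inputs:
sum(a)                                % elements of a sum to 1
[sn, in] = sort(n); [sa, ia] = sort(a);
isequal(in, ia)                       % larger net inputs give larger outputs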
Network Use
To change a network so that a layer uses softmax, set net.layers{i}.transferFcn to 'softmax'.
Call sim to simulate the network with softmax. See newc or newpnn for simulation examples.
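A minimal sketch of that workflow, assuming a network net already exists, i indexes the layer being changed, and p is an input matrix (these names are placeholders, not from the toolbox examples):
net.layers{i}.transferFcn = 'softmax';   % layer i now uses softmax as its transfer function
a = sim(net, p);                         % simulate the modified network on input p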
See Also
sim, compet