Neural Network Toolbox
Introduction
Radial basis networks may require more neurons than standard feedforward backpropagation networks, but often they can be designed in a fraction of the time it takes to train standard feedforward networks. They work best when many training vectors are available.
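As a brief illustration of this fast design process, the following sketch (assuming the toolbox functions newrbe and sim, with made-up one-dimensional data) creates an exact radial basis network in a single call rather than through iterative training:

    % Hypothetical one-dimensional training data
    P = -1:0.1:1;                % input vectors
    T = sin(2*pi*P);             % target values
    spread = 0.5;                % spread constant of the radial basis neurons
    net = newrbe(P,T,spread);    % exact design: one neuron per training vector
    Y = sim(net,P);              % simulate the network on the training inputs

Because the design step solves for the output weights directly, it completes quickly, but the resulting network has as many hidden neurons as training vectors.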
You may want to consult the following paper on this subject:
Chen, S., C.F.N. Cowan, and P. M. Grant, "Orthogonal Least Squares Learning Algorithm for Radial Basis Function Networks," IEEE Transactions on Neural Networks, vol. 2, no. 2, March 1991, pp. 302-309.
This chapter discusses two variants of radial basis networks: generalized regression neural networks (GRNN) and probabilistic neural networks (PNN). You may want to read about them in P. D. Wasserman, Advanced Methods in Neural Computing, New York: Van Nostrand Reinhold, 1993, on pp. 155-161 and pp. 35-55, respectively.
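As a brief, hypothetical sketch of how the two variants differ (assuming the toolbox functions newgrnn, newpnn, ind2vec, vec2ind, and sim, with made-up data), a GRNN approximates a continuous target while a PNN assigns inputs to classes:

    % Hypothetical data for the two variants
    P  = [1 2 3 4 5 6 7 8];             % input vectors
    Tg = [0 1 2 3 2 1 2 1];             % continuous targets for the GRNN
    Tc = ind2vec([1 1 2 2 3 3 1 1]);    % class indices converted to target vectors for the PNN
    grnn = newgrnn(P,Tg,0.7);           % generalized regression network, spread 0.7
    pnn  = newpnn(P,Tc,0.7);            % probabilistic network, spread 0.7
    y    = sim(grnn,P);                 % function approximation output
    yc   = vec2ind(sim(pnn,P));         % predicted class indices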