Principal Component Analysis (prepca, trapca)
In some situations, the dimension of the input vector is large, but the components of the vectors are highly correlated (redundant). It is useful in this situation to reduce the dimension of the input vectors. An effective procedure for performing this operation is principal component analysis. This technique has three effects: it orthogonalizes the components of the input vectors (so that they are uncorrelated with each other); it orders the resulting orthogonal components (principal components) so that those with the largest variation come first; and it eliminates those components that contribute the least to the variation in the data set. The following code illustrates the use of prepca, which performs a principal component analysis.
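A representative call sequence, reconstructed from the function signatures described in this section (the sample input matrix p is hypothetical, chosen to have strongly correlated rows):

```matlab
% Hypothetical input data: four highly correlated components (rows),
% four sample vectors (columns).
p = [1.0  2.0  3.0  4.0
     1.1  2.1  3.1  4.1
     0.9  1.9  2.9  3.9
     1.0  2.0  3.0  4.0];

% Normalize the inputs to zero mean and unit variance.
[pn, meanp, stdp] = prestd(p);

% Keep only the principal components that contribute at least 2%
% to the total variation in the data set.
[ptrans, transMat] = prepca(pn, 0.02);
```

Because the rows of p are nearly perfect multiples of one another, most of the variation falls in a single principal component, so ptrans has far fewer rows than p.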
Note that we first normalize the input vectors, using prestd, so that they have zero mean and unit variance. This is a standard procedure when using principal components. In this example, the second argument passed to prepca is 0.02. This means that prepca eliminates those principal components that contribute less than 2% to the total variation in the data set. The matrix ptrans contains the transformed input vectors. The matrix transMat contains the principal component transformation matrix. After the network has been trained, this matrix should be used to transform any future inputs that are applied to the network. It effectively becomes a part of the network, just like the network weights and biases. If you multiply the normalized input vectors pn by the transformation matrix transMat, you obtain the transformed input vectors ptrans.
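The relationship between transMat, pn, and ptrans is a single matrix product (using the variable names from the discussion above):

```matlab
% The transformed inputs are the normalized inputs premultiplied by
% the transformation matrix: each row of transMat is a retained
% principal direction, so transMat projects pn onto those directions.
ptrans = transMat * pn;
```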
If prepca is used to preprocess the training set data, then whenever the trained network is used with new inputs, those inputs should be preprocessed with the transformation matrix that was computed for the training set. This can be accomplished with the routine trapca. In the following code, we apply a new set of inputs to a network we have already trained.
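A sketch of that step, assuming pnew is a hypothetical matrix of new input vectors, net is the already-trained network, and meanp, stdp, and transMat were returned by prestd and prepca on the training set:

```matlab
% Normalize the new inputs with the statistics of the TRAINING set,
% not statistics recomputed from the new data.
pnewn = trastd(pnew, meanp, stdp);

% Apply the principal component transformation computed for the
% training set to the normalized new inputs.
pnewtrans = trapca(pnewn, transMat);

% Simulate the trained network on the transformed inputs.
a = sim(net, pnewtrans);
```

Reusing meanp, stdp, and transMat is the essential point: the transformation is part of the trained network, so new data must pass through exactly the same preprocessing as the training data.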
© 1994-2005 The MathWorks, Inc.