Neural Network Toolbox
calcjejj
Calculate Jacobian performance vector
Syntax
[je,jj,normgX] = calcjejj(net,PD,BZ,IWZ,LWZ,N,Ac,El,Q,TS,MR)
Description
This function calculates two values (related to the Jacobian of a network) that are required to calculate the network's Hessian, in a memory-efficient way.
Two values needed to calculate the Hessian of a network are J*E (Jacobian times errors) and J'J (Jacobian squared). However, the Jacobian J can take up a lot of memory. This function calculates J*E and J'J by dividing the training vectors into groups, calculating the partial Jacobian Ji and its associated values Ji*Ei and Ji'Ji for each group, and then summing the partial values into the full J*E and J'J values.
This allows the J*E and J'J values to be calculated with a series of smaller Ji matrices instead of one larger J matrix.
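The grouping idea can be sketched with plain matrix code. This is an illustrative sketch only, not the actual calcjejj implementation; mathematically the accumulated quantities are J'*E and J'*J, summed over row groups of J.

```matlab
% Illustrative sketch of memory reduction, not the toolbox code.
J = rand(100,10);               % full Jacobian (100 training vectors, 10 weights)
E = rand(100,1);                % errors
MR = 4;                         % memory reduction: split rows into 4 groups
groups = reshape(1:100,[],MR);  % column i holds the row indices of group i
je = zeros(10,1);
jj = zeros(10,10);
for i = 1:MR
    Ji = J(groups(:,i),:);      % partial Jacobian for group i
    Ei = E(groups(:,i));        % matching errors
    je = je + Ji'*Ei;           % accumulate J'*E
    jj = jj + Ji'*Ji;           % accumulate J'*J
end
% je now equals J'*E and jj equals J'*J (up to round-off), so the full
% J never has to be held in memory at once.
```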
[je,jj,normgX] = calcjejj(net,PD,BZ,IWZ,LWZ,N,Ac,El,Q,TS,MR)
takes,
net — Neural network
PD — Delayed inputs
BZ — Concurrent biases
IWZ — Weighted inputs
LWZ — Weighted layer outputs
N — Net inputs
Ac — Combined layer outputs
El — Layer errors
Q — Concurrent size
TS — Time steps
MR — Memory reduction factor
and returns,
je — Jacobian times errors
jj — Jacobian transposed times the Jacobian
normgX — Norm of gradient
Examples
Here we create a linear network with a single input element ranging from 0 to 1, two neurons, and a tap delay on the input with taps at zero, two, and four time steps. The network is also given a recurrent connection from layer 1 to itself with tap delays of [1 2].
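The original code for this step was not preserved; a plausible reconstruction in the toolbox's syntax of this era (the learning-rate argument to newlin is an assumption):

```matlab
% Linear network: one input ranging 0 to 1, two neurons,
% input tap delays at 0, 2, and 4 time steps.
net = newlin([0 1],2,[0 2 4],0.01);
% Recurrent connection from layer 1 to itself, tap delays [1 2].
net.layerConnect(1,1) = 1;
net.layerWeights{1,1}.delays = [1 2];
```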
Here is a single (Q = 1) input sequence P with five time steps (TS = 5), along with the four initial input delay conditions Pi, the combined inputs Pc, and the delayed inputs Pd.
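The code for this step is missing; a sketch with illustrative input values (calcpd is the toolbox helper that forms the delayed inputs):

```matlab
P = {0 0.1 0.3 0.6 0.4};    % one sequence (Q = 1), five time steps (TS = 5)
Pi = {0.2 0.3 0.4 0.1};     % four initial input delay conditions
Pc = [Pi P];                % combined inputs
Pd = calcpd(net,5,1,Pc);    % delayed inputs
```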
Here the two initial layer delay conditions for each of the two neurons are defined, along with the layer targets for the two neurons over five time steps.
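A sketch of this step (the specific condition and target values are illustrative):

```matlab
Ai = {[0.5; 0.1] [0.6; 0.5]};   % two initial layer delay conditions, two neurons
Tl = {[0.1; 0.2] [0.3; 0.1] [0.5; 0.6] [0.8; 0.9] [0.5; 0.1]};  % targets, TS = 5
```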
Here the network's weight and bias values are extracted, and the network's performance and other signals are calculated.
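Assuming the toolbox's getx and calcperf functions, this step might look like:

```matlab
X = getx(net);   % network weight and bias values as a single vector
[perf,El,Ac,N,BZ,IWZ,LWZ] = calcperf(net,X,Pd,Tl,Ai,1,5);
```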
Finally we can use calcjejj to calculate the Jacobian times error, the Jacobian squared, and the norm of the Jacobian times error, using a memory reduction of 2.
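The call itself was not preserved; assuming the signals computed in the steps above, it might look like:

```matlab
[je,jj,normje] = calcjejj(net,Pd,BZ,IWZ,LWZ,N,Ac,El,1,5,2);  % MR = 2
```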
The results should be the same whichever memory reduction is used. Here a memory reduction of 3 is used.
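The same call with the final argument (the memory reduction factor) changed:

```matlab
[je,jj,normje] = calcjejj(net,Pd,BZ,IWZ,LWZ,N,Ac,El,1,5,3);  % MR = 3
```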
See Also
calcgx | calcjx
© 1994-2005 The MathWorks, Inc.