5.5. The Backpropagation Algorithm
We will write down the updating formulas for weights in a two-layer neural
network. The generalization to more than two layers is just a matter of notation.
Let $v_{ji}$ be the weight of the link connecting hidden neuron $i$ to output neuron $j$, and $w_{ik}$ the weight of the link connecting input node $k$ to hidden neuron $i$.
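To fix ideas, here is a minimal NumPy sketch of this two-layer architecture. Everything in it (the sigmoid choice for $f$, the dimensions, the variable names) is an illustrative assumption, not part of the text; it only mirrors the indexing above, with $x_0 = 1$ acting as the bias input to the hidden layer.

```python
import numpy as np

# Assumption: sigmoid activation; the text leaves f generic.
def f(a):
    return 1.0 / (1.0 + np.exp(-a))

def f_prime(a):
    s = f(a)
    return s * (1.0 - s)

n, m, p = 4, 3, 2                  # inputs, hidden neurons, output neurons (illustrative)
rng = np.random.default_rng(0)
W = rng.normal(size=(m, n + 1))    # W[i, k]: input node k -> hidden neuron i (k = 0 is the bias)
V = rng.normal(size=(p, m))        # V[j, i]: hidden neuron i -> output neuron j

def forward(x):
    """Forward pass for one pattern x of length n+1 with x[0] = 1."""
    net_hidden = W @ x             # sum_k w_ik x_k
    z = f(net_hidden)              # z_i = f_i(.), hidden outputs
    net_out = V @ z                # sum_k v_jk z_k
    o = f(net_out)                 # o_j, network outputs
    return net_hidden, z, net_out, o
```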
First, the updating rule for the $v_{ji}$'s is the same as in the delta rule, viewing the hidden layer now as an input layer. For output neuron $j$ and hidden neuron $i$, the weight $v_{ji}$ is updated using
\[
\Delta v_{ji} \;=\; \sum_{q=1}^{N} \Delta_q v_{ji} \;=\; -\eta \sum_{q=1}^{N} \frac{\partial E_q}{\partial v_{ji}} \;=\; \sum_{q=1}^{N} \left( -\eta\, \delta_j^q\, z_i^q \right),
\]
where $z_i^q$ is the output of hidden neuron $i$,
\[
z_i^q = f_i\!\left( \sum_{k=0}^{n} w_{ik}\, x_k^q \right),
\]
and
\[
\delta_j^q = \left( o_j^q - y_j^q \right) f_j'\!\left( \sum_{k=1}^{m} v_{jk}\, z_k^q \right).
\]
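Under the same illustrative assumptions (sigmoid $f$, the `forward` helper above, a hypothetical learning rate $\eta = 0.1$), the per-pattern update $\Delta_q v_{ji} = -\eta\,\delta_j^q z_i^q$ is a few lines of NumPy:

```python
def delta_v(x, y, eta=0.1):
    """Per-pattern output-layer update: Delta_q v_ji = -eta * delta_j * z_i,
    with delta_j = (o_j - y_j) * f'(net_j) as in the text."""
    net_hidden, z, net_out, o = forward(x)
    delta_out = (o - y) * f_prime(net_out)   # delta_j^q
    return -eta * np.outer(delta_out, z)     # Delta_q V, shape (p, m)
```

Summing `delta_v(x, y)` over all patterns $q = 1, \dots, N$ gives the batch update $\Delta v_{ji}$ above.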
For a hidden neuron $i$ and input node $k$, the weight $w_{ik}$ is updated as follows:
\[
\Delta_q w_{ik} = -\eta\, \frac{\partial E_q}{\partial w_{ik}}
\qquad\text{with}\qquad
\frac{\partial E_q}{\partial w_{ik}} = \frac{\partial E_q}{\partial o_i^q}\, \frac{\partial o_i^q}{\partial w_{ik}},
\]
where $o_i^q$ is the output of the hidden neuron $i$,
\[
o_i^q = f_i\!\left( \sum_{\ell=0}^{n} w_{i\ell}\, x_\ell^q \right) = z_i^q.
\]
We have
\[
\frac{\partial o_i^q}{\partial w_{ik}} = f_i'\!\left( \sum_{\ell=0}^{n} w_{i\ell}\, x_\ell^q \right) x_k^q.
\]
Let $\delta_i^q = \frac{\partial E_q}{\partial o_i^q}$, the corresponding error term for the hidden layer. How do we compute this? Observe that
\[
\delta_i^q = \frac{\partial E_q}{\partial o_i^q} = \sum_{j=1}^{p} \frac{\partial E_q}{\partial o_j^q}\, \frac{\partial o_j^q}{\partial o_i^q}.
\]
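The derivation continues beyond this excerpt; to make the sketch self-contained, note that with the squared error $E_q = \frac{1}{2}\sum_j (o_j^q - y_j^q)^2$ (which the formula for $\delta_j^q$ presupposes) one has $\frac{\partial E_q}{\partial o_j^q} = o_j^q - y_j^q$ and $\frac{\partial o_j^q}{\partial o_i^q} = f_j'\!\left(\sum_k v_{jk} z_k^q\right) v_{ji}$, hence $\delta_i^q = \sum_j \delta_j^q v_{ji}$. A sketch of the resulting hidden-layer update, under the same assumptions as above:

```python
def delta_w(x, y, eta=0.1):
    """Per-pattern hidden-layer update: Delta_q w_ik = -eta * delta_i * f'(net_i) * x_k,
    where delta_i = dE_q/do_i = sum_j delta_j * v_ji (one chain-rule step past the excerpt)."""
    net_hidden, z, net_out, o = forward(x)
    delta_out = (o - y) * f_prime(net_out)         # delta_j^q, as for the output layer
    delta_hidden = V.T @ delta_out                 # delta_i^q = sum_j delta_j^q v_ji
    grad_net = delta_hidden * f_prime(net_hidden)  # delta_i * f'(net_i) = dE_q/dnet_i
    return -eta * np.outer(grad_net, x)            # Delta_q W, shape (m, n+1)
```

A full backpropagation step for pattern $q$ would apply `delta_v` and `delta_w` together, summed over patterns in batch mode or applied immediately in incremental mode.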