5.5 The Backpropagation Algorithm

We will write down the updating formulas for the weights in a two-layer neural network. The generalization to more than two layers is just a matter of notation. Let $v_{ji}$ be the weight of the link connecting the hidden neuron $i$ to the output neuron $j$, and $w_{ik}$ the weight of the link connecting the input node $k$ to the hidden neuron $i$.

First, the updating rule for the $v_{ji}$'s is the same as in the delta rule, by viewing the hidden layer now as an input layer. For output neuron $j$ and hidden neuron $i$, the weight $v_{ji}$ is updated using


$$\Delta v_{ji} = \sum_{q=1}^{N} \Delta_q v_{ji} = -\eta \sum_{q=1}^{N} \frac{\partial E_q}{\partial v_{ji}} = \sum_{q=1}^{N} \left( -\eta\, \delta_j^q z_i^q \right)$$

where $z_i^q$ is the output of the hidden neuron $i$,

$$z_i^q = f_i\!\left( \sum_{k=0}^{n} w_{ik} x_k^q \right)$$

and

$$\delta_j^q = \left( o_j^q - y_j^q \right) f_j'\!\left( \sum_{k=1}^{m} v_{jk} z_k^q \right)$$
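
In code, this output-layer update is a few lines of linear algebra. The sketch below is a minimal NumPy illustration, not the book's implementation: it assumes a sigmoid activation (so $f'(s) = f(s)(1 - f(s))$), the quadratic error $E_q = \frac{1}{2}\sum_j \left( o_j^q - y_j^q \right)^2$, and hypothetical names (`V`, `W`, `X`, `Y`, `eta`).

```python
import numpy as np

def f(s):
    # Sigmoid activation; its derivative is f(s) * (1 - f(s)).
    return 1.0 / (1.0 + np.exp(-s))

def update_output_weights(V, W, X, Y, eta=0.1):
    """One batch update of the hidden-to-output weights v_ji.

    X : (N, n+1) input patterns x^q, with bias component x_0 = 1
    Y : (N, p)   target outputs y_j^q
    W : (m, n+1) input-to-hidden weights w_ik
    V : (p, m)   hidden-to-output weights v_ji
    """
    Z = f(X @ W.T)                   # z_i^q = f_i(sum_k w_ik x_k^q), shape (N, m)
    O = f(Z @ V.T)                   # o_j^q = f_j(sum_k v_jk z_k^q), shape (N, p)
    delta = (O - Y) * O * (1.0 - O)  # delta_j^q = (o_j^q - y_j^q) f_j'(.), (N, p)
    # Delta v_ji = sum_q (-eta * delta_j^q * z_i^q), as one matrix product
    return V - eta * (delta.T @ Z)
```

The sum over the training patterns $q$ collapses into the single matrix product `delta.T @ Z`.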

For a hidden neuron $i$ and input node $k$, the weight $w_{ik}$ is updated as follows:


4 qwik=−η
∂Eq
∂wik

with
∂Eq
∂wik


=

∂Eq
∂oqi

∂oqi
∂wik

where $o_i^q$ is the output of the hidden neuron $i$,

$$o_i^q = f_i\!\left( \sum_{\ell=0}^{n} w_{i\ell} x_\ell^q \right) = z_i^q$$

We have

$$\frac{\partial o_i^q}{\partial w_{ik}} = f_i'\!\left( \sum_{\ell=0}^{n} w_{i\ell} x_\ell^q \right) x_k^q$$

Let $\delta_i^q = \partial E_q / \partial o_i^q$ for the hidden layer. How do we compute this? Observe that

$$\delta_i^q = \frac{\partial E_q}{\partial o_i^q} = \sum_{j=1}^{p} \frac{\partial E_q}{\partial o_j^q}\, \frac{\partial o_j^q}{\partial o_i^q}$$
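
Carrying this chain rule one step further, using $\partial o_j^q / \partial o_i^q = f_j'\!\left( \sum_k v_{jk} z_k^q \right) v_{ji}$, gives the standard backpropagation identity $\delta_i^q = \sum_{j=1}^p \delta_j^q v_{ji}$: the hidden deltas are the output deltas propagated back through the weights $v_{ji}$. Continuing the NumPy sketch above, again under the sigmoid assumption and with the same hypothetical names, the hidden-layer update might look like:

```python
def update_hidden_weights(V, W, X, Y, eta=0.1):
    """One batch update of the input-to-hidden weights w_ik (shapes as above)."""
    Z = f(X @ W.T)                       # hidden outputs z_i^q, shape (N, m)
    O = f(Z @ V.T)                       # network outputs o_j^q, shape (N, p)
    delta_out = (O - Y) * O * (1.0 - O)  # output deltas delta_j^q, shape (N, p)
    delta_hid = delta_out @ V            # delta_i^q = sum_j delta_j^q v_ji, (N, m)
    # Delta_q w_ik = -eta * delta_i^q * f_i'(net_i^q) * x_k^q, summed over q;
    # for the sigmoid, f_i'(net_i^q) = z_i^q (1 - z_i^q).
    return W - eta * (delta_hid * Z * (1.0 - Z)).T @ X
```

A short usage example, with made-up dimensions:

```python
# One batch step: both gradients are evaluated at the current weights,
# so V is replaced only after the hidden-layer gradient has been taken.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((4, 1)), rng.normal(size=(4, 2))])  # N = 4 patterns, n = 2
Y = rng.random((4, 1))                                     # p = 1 output neuron
W = rng.normal(size=(3, 3))                                # m = 3 hidden neurons
V = rng.normal(size=(1, 3))
V, W = update_output_weights(V, W, X, Y), update_hidden_weights(V, W, X, Y)
```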