Foundations of Cognitive Psychology


where \(t_e = 1\) if event \(e\) occurs, and 0 otherwise. This has the effect of changing the
weight more radically when the unit into which it feeds is uncommitted (in the
middle of its activation range). Applying the delta rule to change the weights
from the hidden units to the expectation units:


\[ \Delta w_{he} = \epsilon \, a_h \, \delta_e, \]

where \(a_h\) is the activation of hidden unit \(h\) and \(w_{he}\) is the weight on the link
from hidden unit \(h\) to expectation unit \(e\).
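The delta-rule update above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation; the array values and the variable names (`a_h`, `t_e`, `delta_w_he`, etc.) are illustrative, and the expectation-unit error is assumed to be the target-minus-activation difference scaled by the logistic slope \(a(1-a)\), as described in the surrounding text.

```python
import numpy as np

eps = 0.1  # learning rate (epsilon)

a_h = np.array([0.2, 0.9, 0.5])  # hidden-unit activations (illustrative)
a_e = np.array([0.7, 0.3])       # expectation-unit activations
t_e = np.array([1.0, 0.0])       # targets: 1 if event e occurs, 0 otherwise

# Error signal on each expectation unit, scaled by the logistic slope a(1 - a)
delta_e = (t_e - a_e) * a_e * (1.0 - a_e)

# Delta rule: weight change from hidden unit h to expectation unit e
delta_w_he = eps * np.outer(a_h, delta_e)  # shape (hidden, expectation)
```

Note how the outer product applies the rule to every hidden/expectation pair at once: entry \((h, e)\) of `delta_w_he` is exactly \(\epsilon\, a_h\, \delta_e\).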
The delta rule offers no guidance on how to change the weights from the input
units to the hidden units, because the error signal on the hidden units
is undefined. The solution, commonly known as backpropagation (Rumelhart,
Hinton, & Williams, 1986), has dramatically broadened the scope of neural net
models in recent years. In the context of the present model, each hidden unit
inherits the error of each expectation unit connected to it, weighted by the link
between them, and sums these weighted errors. This sum is again scaled by the
slope of the logistic function at the hidden unit's current activation level. Thus
the error signal on hidden unit \(h\) is:


\[ \delta_h = \Bigl( \sum_e w_{he} \, \delta_e \Bigr) \frac{da_h}{d\,\mathrm{net}_h}. \]


The delta rule is then applied to change the weights on the links from the input
units to the hidden units:


\[ \Delta w_{ih} = \epsilon \, a_i \, \delta_h. \]
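Putting the pieces together, one forward-and-backward pass through such a two-layer logistic network can be sketched as follows. This is a schematic NumPy version under stated assumptions, not the published model: logistic units throughout, random initial weights, and illustrative layer sizes and names (`w_ih`, `w_he`).

```python
import numpy as np

def logistic(net):
    # Logistic function relating net input and activation (see figure 19.9)
    return 1.0 / (1.0 + np.exp(-net))

rng = np.random.default_rng(0)
eps = 0.1  # learning rate (epsilon)

a_i = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])  # input pattern (illustrative)
t_e = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.0])  # target expectation pattern

w_ih = rng.normal(scale=0.5, size=(6, 4))  # input-to-hidden weights
w_he = rng.normal(scale=0.5, size=(4, 6))  # hidden-to-expectation weights

# Forward pass through logistic units
a_h = logistic(a_i @ w_ih)
a_e = logistic(a_h @ w_he)

# Error on expectation units, scaled by the logistic slope a(1 - a)
delta_e = (t_e - a_e) * a_e * (1.0 - a_e)

# Each hidden unit inherits the summed, link-weighted expectation errors,
# again scaled by the logistic slope at the hidden unit
delta_h = (w_he @ delta_e) * a_h * (1.0 - a_h)

# Delta-rule weight changes on both layers
w_he += eps * np.outer(a_h, delta_e)
w_ih += eps * np.outer(a_i, delta_h)
```

After one such update, the expectation produced for the same input lies closer to the target; repeating the cycle over many input-target pairs is what trains the network.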
This model was used to learn sequences of chord functions, using a temporal
composite of invariant pitch chord function for input and expectation (Bharucha
& Todd, 1989). Figure 19.8 shows six units, representing, in a major key,
the tonic, supertonic, mediant, subdominant, dominant, and submediant. Fifty
sequences, of seven successive chords each, were generated at random using
a priori transition probabilities estimated from Piston's (1978, p. 21) table of


Figure 19.9
Logistic function relating the net input and activation of a unit.


Neural Nets, Temporal Composites, and Tonality 473