The Cognitive Neuroscience of Music


For the tempo, pattern, and meter discriminations, there was a tendency for activation to be detected in nonmusicians that was not present for musicians. For tempo discrimination, nonmusicians showed activation in right medial frontal cortex (BA 9) and bilateral posterior lateral cerebellum, but this activation was very much weaker or absent in musicians (Figures 17.8 and 17.12). For pattern discrimination, nonmusicians showed activation in midbrain and bilateral posterior lateral cerebellum; this activation was weaker or absent in musicians (Figures 17.9 and 17.12). For meter discrimination, nonmusicians showed activation in right inferior frontal cortex (BA 47) and bilateral posterior lateral cerebellum (Figures 17.10 and 17.12), with little or no such activation in these areas for musicians. By contrast, during duration discrimination musicians showed activation in bilateral medial frontal cortex (BA 9) and bilateral inferior parietal cortex (BA 40), and strong activation in bilateral posterior lateral cerebellum (Figures 17.11 and 17.12). These activations were absent or very much weaker in nonmusicians.
As with pitch discrimination, the strong cerebellar activations during rhythm discriminations are not likely due to motoric processing per se but to a role in supporting nonmotor sensory or cognitive processing.38-45 In addition, in the meter, tempo, and pattern discriminations, cerebellar activity is much stronger for nonmusicians than for musicians, but the reverse is true for the duration discrimination. Interestingly, musicians found discriminating the duration of rhythmic sequences more novel than discriminating meter, tempo, and pattern. By contrast, the nonmusicians found discriminating the overall duration of rhythmic sequences less novel than discriminating other rhythm features. Thus, the


  257

Figure 17.7 Activations in superior and middle temporal cortex (A, B, D, and E) and in cerebellum (C and F) as nonmusicians and musicians discriminate pitch sequences.49 These images show group mean PET data for each task (contrasted with rest) overlaid on MRIs. PET data are z-scores displayed on a colour scale ranging from 1.96 (yellow; p < 0.1) to 4.0 (red; p < 0.0001). (See Plate 11 in colour section.)
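The caption reports the colour-scale endpoints as z-scores paired with p-values. As a rough check on that mapping, the following minimal sketch (not from the chapter; it assumes an ordinary standard normal conversion via scipy and is purely illustrative, since the chapter's exact thresholding and tail convention are not stated here) converts each endpoint z-score to one- and two-tailed probabilities.

    # Illustrative sketch only: standard normal z-to-p conversion.
    # Assumption: the caption's endpoints behave like ordinary z-scores;
    # the chapter's actual tail convention and corrections are not stated here.
    from scipy.stats import norm

    for z in (1.96, 4.0):  # colour-scale endpoints quoted in the caption
        p_one = norm.sf(z)        # one-tailed: P(Z > z)
        p_two = 2 * norm.sf(z)    # two-tailed: P(|Z| > z)
        print(f"z = {z:.2f}: one-tailed p = {p_one:.5f}, two-tailed p = {p_two:.5f}")

For z = 1.96 this gives p of about 0.025 one-tailed (0.05 two-tailed), and for z = 4.0 about 0.00003 one-tailed (0.00006 two-tailed).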
