for music or for any other cognitive function of interest and then compare the results of
these meta-analyses. Although such meta-analyses are being performed for some aspects of
language processing, such as language production^53 or prelexical and lexical processes in
language comprehension,^54 data in the neuroimaging of music are still too scarce for such
an enterprise. Moreover, even if such meta-analyses were performed for music as well,
it would remain extremely difficult to compare the results of experiments that were not directly
designed to compare language and music processing. Indeed, leaving aside the theoretical
problem of which level of processing in language is best compared with which level of
processing in music, the choice of the task to be performed on the stimuli, its difficulty, and
experimental factors such as the mode of presentation (blocked vs mixed), the rate of stimulus
presentation, stimulus repetition, and the method of data analysis (e.g. subtraction or
correlational analyses) have all been shown to exert a strong influence on the results obtained. With these
remarks in mind, it is nevertheless interesting to mention some of the results found for
language and music to determine the extent to which the brain structures that are activated
are similar or different.
Few experiments have been designed to directly compare language and music using
brain imaging methods. Binder et al.^55 compared tone and word processing in an fMRI
study. Results showed that several brain structures, including the left superior temporal
sulcus, middle temporal gyrus, angular gyrus, and lateral frontal lobe, showed stronger
activation for words than for tones. However, both types of stimuli activated Heschl’s gyrus and
the superior temporal plane, including the planum temporale. The investigators concluded
that whereas the planum temporale is similarly involved in the auditory processing of
words and tones, other broadly distributed areas are specifically involved in word processing.
Gandour et al.^56 conducted a PET study in which both Thai and English participants
were required to discriminate pitch patterns and Thai lexical tones derived from
filtered Thai words. Results of the tone-minus-pitch subtraction indicated that only native
Thai speakers showed activation of the left frontal operculum (BA 44/45). This finding was
taken as evidence that Thai lexical tones are meaningful for native Thai speakers but not
for English speakers. However, for our purposes, it is also interesting that for both Thai and
English speakers, several structures, including the left anterior cingulate gyrus (BA 32),
the left and right superior temporal gyri (BA 22), and the right cerebellum, were activated
in both pitch and tone tasks.
More generally, results have shown that primary auditory regions (BA 41 and BA 42)
respond in similar ways to speech and music.^57 Secondary auditory regions (BA 22) are
activated by hearing and understanding words^58 as well as by listening to scales,^59 auditory
imagery for sounds,^60 and access to melodic representations.^61 The supramarginal gyrus
(BA 40) seems to be involved in understanding the symbolism of language^58 and in the reading of
musical scores.^59 Broca’s area is known to be involved in motor activity related to language
and was also shown to be active when playing music^59 and when musicians were engaged
in a rhythmic task.^61 The supplementary motor areas (BA 6) and the right cerebellum are
also active when playing and when imagining playing music.^59,62 Although this list is far from
exhaustive, it nevertheless suffices to show that some of the most important language areas
are clearly involved in music processing as well. Some brain structures also seem to be
specifically or preferentially involved in language processing,^63 and the converse is true for


     277
Free download pdf