Music Listening, Music Therapy, Phenomenology and Neuroscience


Appendix 3.03 Survey of The Neurosciences and Music III Conference 2008

Disorders and Plasticity


Columns: Title, Category | Aim | Mus. Material, Cultural Ref. | Technology & Procedure | Main focus of interest | Conclusion

47S. Boso et al. (332-335)
Title, Category: Musical taste in persons with severe autism (Italy). Cat. 11: Disorder; Cat. 19: Preference
Aim: To explore whether autistic persons judge pleasant and unpleasant music in an unusual manner.
Mus. Material, Cultural Ref.: Recorded music: 1) 3 popular songs, often presented on radio and TV, versus environmental sounds (sea, wind, fire). 2) 3 excerpts of pleasant versus 3 excerpts of unpleasant music. CR: Western
Technology & Procedure: 11 ASD adults, mean age 27 years. Absence of language. Controls: 1) 6 subjects. 2) 5 subjects. Mean age 29 years.
Main focus of interest: Time spent 1) in the familiar music condition versus the environmental sounds condition, 2) in the pleasant music condition versus the unpleasant music condition.
Conclusion: Both groups preferred the musical task and the pleasant music condition. No difference detected. Autistics share with healthy people the preference for pleasant music.

48S. Dellacherie et al. (336-341)
Title, Category: Musical emotion and intracranial recordings. Cat. 11: Deficit; Cat. 2: Consonance / Dissonance
Aim: Single-case study: To investigate event-related potentials (ERPs) recorded by intracranial electroencephalography (EEG) in an epileptic patient.
Mus. Material, Cultural Ref.: Organ or synthesizer sounds: 48 3-tone dissonant chords, composed of minor and major seconds. 48 consonant sounds: 24 major, 24 minor triads. CR: Western
Technology & Procedure: One 35-year-old epileptic patient, nonmusician. Recording of ERPs in 1) auditory areas, 2) orbitofrontal cortex, 3) amygdala and anterior cingulate gyrus.
Main focus of interest: ERPs: Differences between consonant and dissonant chords, and between major and minor chords.
Conclusion: Sequential involvement of brain structures in implicit emotional judgment of musical dissonance. ERPs: 1) 200 msec. 2) 500-1000 msec. 3) 1200-1400 msec. Major/minor changes induced ERPs only in orbitofrontal cortex.

49S. Drapeau et al. (342-345)
Title, Category: Recognition of emotions in Dementia of the Alzheimer Type (DAT). Cat. 11: Deficit; Cat. 19: Emotion
Aim: To assess emotional recognition from nonverbal mediums of communication (face, voice, music) in mild DAT.
Mus. Material, Cultural Ref.: Recorded music: 56 novel instrumental clips from the film genre, intended to induce happiness, sadness, fear, and peacefulness. CR: Western, W. popular
Technology & Procedure: 7 patients with mild DAT. 16 elderly matched controls. Musical Task: Rate expressed emotion on a 10-point scale. Also Facial Expressions Task and Prosody Task.
Main focus of interest: Preservation of emotional recognition from face, voice, and music.
Conclusion: In DAT patients, emotional recognition from voice and music was well preserved. Only emotional recognition from the face was impaired.

50S. Egermann et al. (346-350)
Title, Category: Social feedback on emotion. Cat. 7: Culture; Cat. 19: Emotion
Aim: To investigate whether music-induced emotional effects can be manipulated by social feedback.
Mus. Material, Cultural Ref.: Recorded music: 23 music excerpts selected to represent four emotional characters: negative or positive valence, low or high arousal. CR: Western
Technology & Procedure: 3315 participants on the web. All listened to 5 randomly chosen excerpts. Task: After each excerpt, rate induced emotions by moving sliders on screen between 1) "unpleasant" and "pleasant", 2) "calming" and "arousing".
Main focus of interest: Visitors were randomly directed to 1) a control group or 2) a group presented with manipulated social feedback during music listening: display of information about preceding ratings.
Conclusion: Feedback significantly influenced participants' ratings in the manipulated direction compared to the group without feedback.