Appendix 3.03 Survey of The Neurosciences and Music III Conference 2008
Disorders and Plasticity
47. S. Boso et al. (332-335)
Title: Musical taste in persons with severe autism (Italy)
Categories: Cat. 11: Disorder; Cat. 19: Preference
Aim: To explore whether autistic persons judge pleasant and unpleasant music in an unusual manner
Musical Material, Cultural Reference: Recorded music: 1) 3 popular songs, often presented on radio and TV, versus environmental sounds (sea, wind, fire); 2) 3 excerpts of pleasant versus 3 excerpts of unpleasant music. CR: Western
Technology & Procedure: 11 ASD adults, mean age 27 years; absence of language. Controls: 1) 6 subjects; 2) 5 subjects, mean age 29 years
Main focus of interest: Time spent 1) in the familiar music condition versus the environmental sounds condition; 2) in the pleasant music condition versus the unpleasant music condition
Conclusion: Both groups preferred the musical task and the pleasant music condition; no difference was detected. Autistic persons share with healthy people the preference for pleasant music

48. S. Dellacherie et al. (336-341)
Title: Musical emotion and intracranial recordings
Categories: Cat. 11: Deficit; Cat. 2: Consonance/Dissonance
Aim: Single-case study: to investigate event-related potentials (ERPs) recorded by intracranial electroencephalography (EEG) in an epileptic patient
Musical Material, Cultural Reference: Organ or synthesizer sounds: 48 three-tone dissonant chords composed of minor and major seconds; 48 consonant sounds: 24 major and 24 minor triads. CR: Western
Technology & Procedure: One 35-year-old epileptic patient, nonmusician. Recording of ERPs in 1) auditory areas, 2) orbitofrontal cortex, 3) amygdala and anterior cingulate gyrus
Main focus of interest: ERPs: differences between consonant and dissonant chords, and between major and minor chords
Conclusion: Sequential involvement of brain structures in implicit emotional judgment of musical dissonance. ERPs at 1) 200 msec, 2) 500-1000 msec, 3) 1200-1400 msec. Major/minor changes induced ERPs only in the orbitofrontal cortex

49. S. Drapeau et al. (342-345)
Title: Recognition of emotions in Dementia of the Alzheimer Type (DAT)
Categories: Cat. 11: Deficit; Cat. 19: Emotion
Aim: To assess emotional recognition from nonverbal mediums of communication (face, voice, music) in mild DAT
Musical Material, Cultural Reference: Recorded music: 56 novel instrumental clips in the film genre, intended to induce happiness, sadness, fear, and peacefulness. CR: Western, Western popular
Technology & Procedure: 7 patients with mild DAT; 16 elderly matched controls. Musical Task: rate expressed emotion on a 10-point scale. Also a Facial Expressions Task and a Prosody Task
Main focus of interest: Preservation of emotional recognition from face, voice, and music
Conclusion: In DAT patients, emotional recognition from voice and music was well preserved; only emotional recognition from the face was impaired

50. S. Egermann et al. (346-350)
Title: Social feedback on emotion
Categories: Cat. 7: Culture; Cat. 19: Emotion
Aim: To investigate whether music-induced emotional effects can be manipulated by social feedback
Musical Material, Cultural Reference: Recorded music: 23 music excerpts selected to represent four emotional characters: negative or positive valence, low or high arousal. CR: Western
Technology & Procedure: 3315 participants on the web, each listening to 5 randomly chosen excerpts. Task: after each excerpt, rate induced emotions by moving on-screen sliders between 1) "unpleasant" and "pleasant", 2) "calming" and "arousing"
Main focus of interest: Visitors were randomly directed to 1) a control group or 2) a group presented with manipulated social feedback during music listening: a display of information about preceding ratings
Conclusion: Feedback significantly influenced participants' ratings in the manipulated direction compared to the group without feedback