7th Annual Meeting of the International Multisensory Research Forum

INTEGRATION OF FACIAL EXPRESSIONS AND EMOTIONAL VOCALIZATIONS TAKES PLACE IN UNIMODAL VISUAL AREAS
Poster Presentation

Hanneke Meeren
Cognitive and Affective Neuroscience Laboratory, Tilburg University, Tilburg, The Netherlands

Corne van Heijnsbergen
Cognitive and Affective Neuroscience Laboratory, Tilburg University, Tilburg, The Netherlands

Beatrice de Gelder
Cognitive and Affective Neuroscience Laboratory, Tilburg University, Tilburg, The Netherlands

     Abstract ID Number: 60
     Full text: Not available
     Last modified: March 15, 2006
     Presentation date: 06/20/2006 10:00 AM in Hamilton Building, Foyer

Abstract
Facial expressions, emotional vocalizations, and emotional body language are the primary tools of human emotional communication, yet how they are combined is not well understood. Observers readily recognize the correspondence between facial and vocal expressions, but how these visual and auditory channels are integrated at the neural level remains poorly understood. Traditionally, multisensory integration has been assumed to be a higher-order process that occurs in multimodal regions only after sensory signals have been extensively processed through a hierarchy of unisensory cortical regions. Recent findings, however, challenge this assumption and suggest a role for “unimodal” sensory areas. We recorded event-related potentials in 15 subjects who watched video clips of angry and happy facial expressions accompanied by congruent or incongruent emotional vocalizations. We show that the early visual P1 component is already sensitive to audiovisual integration at 107 ms after stimulus onset: P1 was enhanced when face and vocalization did not match compared with when they formed a unified emotional percept. Importantly, these effects were not caused by low-level properties of the stimuli, since they were absent for the summed unimodal conditions. Our findings demonstrate that audiovisual integration of dynamic emotional signals already takes place during early stages of processing in unimodal visual cortex.
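A minimal sketch of the type of comparison the abstract describes: contrasting congruent and incongruent audiovisual ERPs in a P1-latency window, and running the same contrast on summed unimodal responses as a low-level control. All array names, channel selections, and time windows here are illustrative assumptions, not the authors' actual analysis pipeline.

```python
import numpy as np
from scipy import stats

def p1_amplitude(erp, times, window=(0.090, 0.125)):
    """Mean amplitude per subject in an assumed P1 window around ~107 ms.

    erp: array of shape (n_subjects, n_timepoints), e.g. mean occipital signal.
    times: 1-D array of time points in seconds relative to stimulus onset.
    """
    mask = (times >= window[0]) & (times <= window[1])
    return erp[:, mask].mean(axis=1)

def compare_conditions(erp_a, erp_b, times):
    """Paired t-test on P1 amplitudes between two conditions (hypothetical)."""
    return stats.ttest_rel(p1_amplitude(erp_a, times),
                           p1_amplitude(erp_b, times))

# Hypothetical usage with per-subject ERPs for each condition:
# t_av, p_av = compare_conditions(erp_av_incongruent, erp_av_congruent, times)
#
# Low-level control: the same contrast on the summed unimodal (face + voice)
# responses should show no difference if the audiovisual effect reflects
# genuine integration rather than stimulus properties.
# t_sum, p_sum = compare_conditions(erp_face_incon + erp_voice_incon,
#                                   erp_face_con + erp_voice_con, times)
```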
