4th Annual Meeting of the International Multisensory Research Forum

Visual information relieves speech processing in the human auditory cortex
Poster

Julien Besle
INSERM - U280, Lyon, France

Alexandra Fort
University Laboratory of Physiology, Oxford, UK

Marie-Hélène Giard
INSERM - U280, Lyon, France

     Abstract ID Number: 77
     Full text: Not available
     Last modified: May 20, 2003

Abstract
Event-related potentials to auditory (A), visual (V), or bimodal audiovisual (AV) speech syllables were recorded from 36 scalp electrodes while subjects performed an auditory discrimination task among four different syllables. Crossmodal interactions were estimated with an additive model [AV - (A + V)] in the first 200 ms after stimulus onset.
The cumulative distribution function of reaction times violated the race-model inequality, showing that the processing of auditory and visual information interacts to speed up bimodal speech perception. Electrophysiological analyses revealed a significant interaction pattern at the latency of the auditory N1 wave. Comparison of scalp current density (SCD) topographies and inverse dipole modeling suggested a reduction of activity in the generators of the auditory N1 wave.
Visual speech information may thus relieve the processing of auditory speech information in the auditory cortex. This observation contrasts with the decrease in visual activity (amplitude of the N185 wave) found during audio-visual object recognition, and may reflect the auditory dominance of speech processing, as opposed to the general visual dominance of object recognition. The latency of this interaction also corresponds to the stage of phonemic categorization, but it does not allow us to confirm that visual and auditory cues are integrated before this stage, as has been proposed.
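
The additive-model estimate described above compares the bimodal response with the sum of the two unimodal responses. A minimal sketch of that computation, assuming hypothetical NumPy arrays of trial-averaged ERPs (erp_a, erp_v, erp_av, one row per electrode) rather than the authors' actual analysis code, might look like this:

    import numpy as np

    def crossmodal_interaction(erp_a, erp_v, erp_av):
        # Interaction term of the additive model: AV - (A + V)
        return erp_av - (erp_a + erp_v)

    # Placeholder data: 36 electrodes x 200 time points (first 200 ms at 1000 Hz)
    rng = np.random.default_rng(0)
    erp_a = rng.normal(size=(36, 200))
    erp_v = rng.normal(size=(36, 200))
    erp_av = rng.normal(size=(36, 200))

    interaction = crossmodal_interaction(erp_a, erp_v, erp_av)
    print(interaction.shape)  # (36, 200): deviation from additivity per electrode and time point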
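The race-model test, likewise, compares the cumulative distribution of bimodal reaction times with the bound given by the sum of the two unimodal distributions. The following sketch uses simulated reaction times; the variable names and numbers are illustrative assumptions, not the study's data:

    import numpy as np

    def ecdf(rts, t_grid):
        # Empirical cumulative distribution of reaction times at each latency in t_grid
        rts = np.sort(np.asarray(rts))
        return np.searchsorted(rts, t_grid, side="right") / rts.size

    def race_model_violation(rt_a, rt_v, rt_av, t_grid):
        # Positive values mark latencies where F_AV(t) exceeds min(F_A(t) + F_V(t), 1)
        bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
        return ecdf(rt_av, t_grid) - bound

    rng = np.random.default_rng(1)
    t_grid = np.arange(200, 801, 10)    # latencies from 200 to 800 ms
    rt_a = rng.normal(480, 60, 200)     # simulated auditory-only reaction times (ms)
    rt_v = rng.normal(520, 60, 200)     # simulated visual-only reaction times (ms)
    rt_av = rng.normal(430, 50, 200)    # simulated bimodal reaction times (ms), faster on average

    violation = race_model_violation(rt_a, rt_v, rt_av, t_grid)
    print(t_grid[violation > 0])        # latencies at which the race-model bound is violated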

