Integration of auditory and visual communication information in the primate prefrontal cortex.

Lizabeth Romanski, University of Rochester

Abstract
Social communication relies on the integration of auditory and visual information. To understand how these two modalities are integrated during social communication, we recorded from the primate ventrolateral prefrontal cortex (VLPFC) during the simultaneous and separate presentation of vocalizations and corresponding facial gestures. The stimuli consisted of short video clips of vocalizing conspecific macaques, which were deconstructed into their audio and visual components and presented both separately and simultaneously. The single units we encountered responded robustly to auditory, visual, and combined face-vocalization stimuli. Multimodal neurons made up one-third of the task-responsive population and exhibited significantly enhanced or suppressed responses to bimodal stimuli. Combining face or movie stimuli with incongruent vocalizations produced significant changes in neuronal firing compared to congruent stimuli. Moreover, altering the temporal onset of the auditory-visual stimuli also significantly changed neuronal firing when the onset of the auditory stimulus preceded the visual motion stimulus. Our results suggest that VLPFC neurons in the rhesus monkey integrate communication-relevant auditory and visual information. Analysis of multimodal processing in the VLPFC of non-human primates may help us understand social communication in the human brain, in which the integration of multimodal sensory information is crucial.

