Associative Learning in Multisensory Integration

Helen Bates, Psychology, Trinity College Dublin, Ireland

Abstract
A number of previous investigations have shown that perceptual learning can occur for a wide range of visual tasks, including orientation discrimination, motion direction, spatial position detection, and object recognition (for a review, see Fine and Jacobs, 2002, Journal of Vision, 2(2), 190-203). The present study investigates whether associative learning plays a key role in the perception of bistable (Ternus) patterns when auditory information is integrated with the visual display. Our approach is situated within a Bayesian framework of perception, involving model-based matching of sensory data to stored “priors”, i.e. data-independent knowledge. Associative learning mechanisms are hypothesized to provide perceptual systems with “prior” estimates of signal/modality reliability, which are used to guide optimal multimodal binding. Following repeated presentation of visual information (Ternus displays) together with spatiotemporally coincident auditory information (high/low frequency tones), our data suggest that increasing the probability of association between multimodal sources of information has a systematic effect on perceptual grouping phenomena.
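For context (the abstract does not state the specific model), the normative Bayesian cue-combination rule commonly assumed in such frameworks weights each modality's estimate by its relative reliability; the symbols below (visual and auditory estimates $\hat{s}_V$, $\hat{s}_A$ with variances $\sigma_V^2$, $\sigma_A^2$) are generic placeholders rather than quantities reported in this study:

$\hat{s} = w_V \hat{s}_V + w_A \hat{s}_A$, where $w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2}$ and $w_A = 1 - w_V$.

On this reading, associative learning would supply the “prior” reliability estimates that set these weights, and hence would govern how the auditory tones are bound to, and bias the grouping of, the Ternus display.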

