Audiovisual interactions in detection of motion-in-depth

Matthew Ballesteros, School of Psychology, University of Sydney

Abstract
The aim of the current study was to evaluate how auditory and visual motion signals interact in the detection of motion in depth. Detection thresholds for motion in depth were measured with a two-interval forced-choice (2IFC) paradigm, separately for looming and receding motions in both vision and audition. Trajectories ranged from 35° left to 35° right of the observer’s midline, all radiating from the observer’s head. In both audition and vision, background noise was fixed at a constant level and signal amplitude was varied to find detection threshold. Auditory motion signals were presented in virtual auditory space; visual motion signals were projected from above onto a flat surface and viewed at a shallow incident angle so that trajectories lay at head level. In bimodal conditions, auditory and visual signals were combined along a given trajectory, and ‘same’ versus ‘opposite’ directions were compared. There was no evidence of a superadditive interaction between spatiotemporally coincident audiovisual motions; the results are therefore discussed in relation to probability summation, linear summation and maximum-likelihood models. There was also no evidence of heightened motion sensitivity for looming relative to receding motions.
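As a point of reference for the models named above, the probability-summation benchmark for two independent detection channels can be sketched as follows. This is a minimal illustration only; the detection probabilities used are assumed example values, not data from the study:

```python
# Probability summation for two independent channels: the bimodal
# stimulus is missed only when BOTH unimodal detectors miss it.
# A measured bimodal detection rate clearly above this prediction
# would suggest a superadditive audiovisual interaction; a rate at
# or below it would not.

def probability_summation(p_auditory: float, p_visual: float) -> float:
    """Predicted bimodal detection probability under independence."""
    return 1.0 - (1.0 - p_auditory) * (1.0 - p_visual)

# Illustrative (hypothetical) near-threshold unimodal detection rates:
p_a = 0.60  # auditory-alone detection probability
p_v = 0.70  # visual-alone detection probability

p_av = probability_summation(p_a, p_v)
print(f"Probability-summation prediction: {p_av:.2f}")  # 0.88
```

The prediction exceeds either unimodal rate even with no neural interaction at all, which is why a bimodal improvement by itself is not evidence of superadditivity.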
