7th Annual Meeting of the International Multisensory Research Forum

Visual and auditory cues for localization combine in a statistically optimal way.
Single Paper Presentation

Laurence Harris
Psychology, York University

Bahar Salavati
Psychology, York University

Phil Jaekl
Psychology, York University

     Abstract ID Number: 189
     Full text: Not available
     Last modified: March 19, 2006
     Presentation date: 06/19/2006 10:00 AM in Hamilton Building, Foyer

Abstract
To assess how different senses contribute to our ability to localize events, we presented two stimuli separated by 0.5 s and asked subjects to judge whether the first stimulus was above or below the second; the separation between them was varied by the method of constant stimuli. Stimuli were lights, sounds, or bimodal light-sound pairs. Lights were made more difficult to localize by spatially smearing them with a Gaussian distribution; sounds were made more difficult by requiring subjects to localize them vertically. Each comparison was subject to sources of noise that could combine either additively, if the contributing events were independent (as the first and second stimuli were), or in a statistically optimal way, if both cues contributed to a single estimate. The noise in comparisons involving bimodal stimuli was accurately predicted by a statistically optimal combination of visual and auditory information, implying a multimodal localization mechanism.
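The "statistically optimal" combination referred to above is standardly modeled as maximum-likelihood integration under independent Gaussian noise: each cue is weighted by its reliability (inverse variance), and the combined estimate has lower variance than either cue alone. The sketch below illustrates that standard model with made-up variance values; it is not the authors' analysis code, and the numbers are purely illustrative.

```python
# Maximum-likelihood (statistically optimal) cue combination under
# independent Gaussian noise. The bimodal estimate is a reliability-
# weighted average of the unimodal estimates, and its variance is
# var_v * var_a / (var_v + var_a), which is always <= min(var_v, var_a).

def optimal_combination(mu_v, var_v, mu_a, var_a):
    """Combine a visual and an auditory location estimate.

    Weights are proportional to reliability (inverse variance).
    Returns the combined mean and its predicted variance.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
    w_a = 1.0 - w_v
    mu = w_v * mu_v + w_a * mu_a
    var = (var_v * var_a) / (var_v + var_a)
    return mu, var

# Hypothetical example: a Gaussian-smeared light (variance 4 deg^2)
# and a vertically localized sound (variance 1 deg^2).
mu, var = optimal_combination(0.0, 4.0, 2.0, 1.0)
print(mu, var)  # combined estimate is weighted toward the more reliable cue
```

Comparing the measured bimodal discrimination noise against this predicted variance is the test reported in the abstract: agreement with the prediction is what implies a shared multimodal localization mechanism.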



