4th Annual Meeting of the International Multisensory Research Forum

Human MT+ and the resolution of visual structure from motion by tactile perception
Poster

Thomas James
Vanderbilt Vision Research Center, Vanderbilt University

Kenith Sobel
Vanderbilt Vision Research Center, Vanderbilt University

Randolph Blake
Vanderbilt Vision Research Center, Vanderbilt University

     Abstract ID Number: 81
     Full text: Not available
     Last modified: May 20, 2003

Abstract
Although vision is the dominant sensory modality for humans and other primates, vision and touch often interact during the perception of 3-D structure. Using psychophysics and fMRI, we show that unambiguous tactile input partially resolves ambiguous shape from visual motion, and that this neural integration may arise within the middle temporal area (MT). The ambiguous shape was a rotating visual globe (VG) defined solely by moving dots. In the absence of depth cues, its direction of rotation (CW or CCW) was ambiguous and bistable. The tactile stimulus was a Styrofoam globe (TG) covered with tiny bumps to simulate the dots. Tracking of the two bistable rotational directions of the VG was strongly (but not completely) biased by the rotation of the TG: intervals with consistent VG/TG rotation were longer than control intervals, and intervals with inconsistent rotation were shorter. The VG was not influenced when TG exposure preceded VG onset (adaptation durations of 5 and 60 s). Both the VG and the TG (with eyes closed) reliably engaged area MT+, although signal change was much lower for the TG. Conclusion: Tactile input helps disambiguate the structure of moving visual patterns, and putative visual area MT+ may integrate both visual and tactile inputs.
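
Why the rotation direction of a dot-defined globe is ambiguous follows directly from orthographic projection: once the depth coordinate is dropped, clockwise rotation of the dot cloud produces exactly the same 2-D image sequence as counter-clockwise rotation of its depth-mirrored counterpart. The short Python/NumPy sketch below illustrates this geometric point only; it is not the stimulus code used in the study, and all names in it are illustrative.

    import numpy as np

    # Minimal sketch (assumption: illustration only, not the authors' stimulus code).
    # Random dots on a transparent unit sphere, rotated about the vertical axis
    # and projected orthographically (the depth coordinate z is simply dropped).

    rng = np.random.default_rng(0)
    n_dots = 200

    # Uniform random points on the unit sphere.
    pts = rng.normal(size=(n_dots, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)

    def project(points, theta):
        """Rotate points about the y (vertical) axis by theta, then drop z."""
        c, s = np.cos(theta), np.sin(theta)
        x = points[:, 0] * c + points[:, 2] * s
        y = points[:, 1]
        return np.column_stack([x, y])

    theta = np.deg2rad(5.0)                  # one frame's worth of rotation
    mirrored = pts * np.array([1, 1, -1])    # same dots with depth reversed

    # Rotating the original dots one way and the depth-mirrored dots the other
    # way yields identical 2-D images: with no depth cues, rotation direction
    # is undefined, hence the bistable percept that the tactile globe biases.
    assert np.allclose(project(pts, +theta), project(mirrored, -theta))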

