Visual-haptic integration during tool use
Chie Takahashi, Simon J Watt
Last modified: 2010-05-03
Abstract
To integrate signals from vision and haptics appropriately, the brain must solve a “correspondence problem” so that it combines only information referring to the same object. Bayesian cue-integration models suggest this could be achieved by considering the similarity of the signals in terms of their spatial coincidence and magnitude (Ernst, 2007; Körding et al., 2007). Tools complicate this, however, because they systematically change the relationship between the location of the visual object and that of the hand, and between the size of the visual object and the hand opening. We have shown previously that the brain integrates visual and haptic information appropriately during tool use, suggesting that the correspondence problem is not solved on ‘raw’ haptic signals; instead, the haptic estimate is remapped, taking the tool’s geometry into account. Here we tested this directly by measuring perceived size from haptics while using pliers that (i) preserved the normal 1:1 relationship between visual and haptic signals, (ii) minified the hand opening, or (iii) magnified it. We found that perceived size from haptics was rescaled according to the type of tool. This suggests that the brain dynamically scales haptic estimates during tool use, allowing the visual-haptic correspondence problem to be solved in common ‘object coordinates’.
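One way to make the proposed remapping concrete (a minimal sketch, assuming the standard reliability-weighted cue-combination model; the tool gain g and the variance symbols below are illustrative and not taken from the abstract) is to rescale the haptic hand-opening estimate into object coordinates before combining it with the visual estimate:

    \hat{S}_{H} = g \, \hat{O}_{H}                          % hand-opening estimate remapped by tool gain g (g = 1 for the 1:1 pliers)
    \sigma_{H'}^{2} = g^{2} \, \sigma_{O}^{2}               % variance of the remapped haptic estimate
    \hat{S} = w_{V} \hat{S}_{V} + w_{H} \hat{S}_{H}         % reliability-weighted visual-haptic combination
    w_{V} = \frac{1/\sigma_{V}^{2}}{1/\sigma_{V}^{2} + 1/\sigma_{H'}^{2}}, \qquad w_{H} = 1 - w_{V}

On this kind of account, the same hand opening yields different size estimates with the minifying and magnifying pliers because g differs across tools, which is consistent with the rescaling of perceived size from haptics reported above.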