
Released

Journal Article

Object feature validation using visual and haptic similarity ratings

MPS-Authors

Cooke, T (/persons/resource/persons83865)
Kannengiesser, S (/persons/resource/persons84001)
Wallraven, C (/persons/resource/persons84298)
Bülthoff, H (/persons/resource/persons83839)

All authors: Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society; Max Planck Institute for Biological Cybernetics, Max Planck Society

Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Cooke, T., Kannengiesser, S., Wallraven, C., & Bülthoff, H. (2006). Object feature validation using visual and haptic similarity ratings. ACM Transactions on Applied Perception, 3(3), 239-261. doi:10.1145/1166087.1166093.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-CFC3-A
Abstract
The perceived similarity between objects may well vary according to the sensory modality/modalities in which they are experienced, an important consideration for the design of multimodal interfaces. In this study, we present a similarity-based method for comparing the perceptual importance of object properties in touch and in vision, and show how the method can also be used to validate computational measures of object properties. Using either vision or touch, human subjects judged the similarity between novel 3D objects that varied parametrically in shape and texture. Similarities were also computed using a set of state-of-the-art 2D and 3D computational measures. Two resolutions of 2D and 3D object data were used for these computations in order to test for scale dependencies. Multidimensional scaling (MDS) was then performed on all similarity data, yielding maps of the stimuli in both perceptual and computational spaces, as well as the relative weight of shape and texture dimensions. For this object set, we found that visual subjects accorded more importance to shape than texture, while haptic subjects weighted them roughly evenly. Fit errors between human and computational maps were then calculated to assess each feature's perceptual validity. Shape-biased features provided good overall fits to the human visual data; however, no single feature yielded a good overall fit to the haptic data, in which we observed large individual differences. This work demonstrates how MDS techniques can be used to evaluate computational object features using the criterion of perceptual similarity. It also demonstrates a way of assessing how the perceptual validity of a feature varies as a function of parameters such as the target modality and the resolution of object data. Potential applications of this method for the design of unimodal and multimodal human-machine interfaces are discussed.
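The MDS step described in the abstract — turning a matrix of pairwise similarity ratings into a low-dimensional map of the stimuli — can be illustrated with a minimal sketch. This is classical (Torgerson) MDS in Python with NumPy, shown here only as a generic example of the technique; the paper's actual analysis pipeline (e.g. non-metric MDS variants, dimension weighting, and the fit-error computation) is not specified here, and the toy dissimilarity matrix below is purely hypothetical.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed n points in k dimensions
    from an n-by-n symmetric dissimilarity matrix d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n       # double-centering matrix
    b = -0.5 * j @ (d ** 2) @ j               # Gram matrix of centered coords
    vals, vecs = np.linalg.eigh(b)            # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]        # take the k largest
    scale = np.sqrt(np.clip(vals[order], 0.0, None))
    return vecs[:, order] * scale             # coordinates, one row per point

# toy dissimilarities: three stimuli lying on a line at 0, 1, 2
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
x = classical_mds(d, k=1)
# the recovered 1D coordinates reproduce the original pairwise
# distances up to translation and sign
```

In a study like this one, each subject's (or each computational feature's) similarity ratings would yield such a map, and maps could then be compared — e.g. after Procrustes alignment — to quantify how well a computational feature predicts the perceptual arrangement.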