
Released

Conference Paper

Visual and Auditory Information Specifying an Impending Collision of an Approaching Object

Citation

Zhou, L., Yan, J., Liu, Q., Li, H., Xie, C., Wang, Y., et al. (2007). Visual and Auditory Information Specifying an Impending Collision of an Approaching Object. In J. Jacko (Ed.), Human-Computer Interaction. Interaction Platforms and Techniques 12th International Conference, HCI International 2007, Beijing, China, July 22-27, 2007 (pp. 720-729). Berlin, Germany: Springer.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CC4B-E
Abstract
Information about the impending collision of an approaching object can be specified by visual and auditory means. We examined discrimination thresholds for vision, audition, and vision and audition combined in the processing of time-to-collision (TTC) of an approaching object. The stimulus was a computer-simulated car approaching the participant over flat ground, which disappeared at a certain point before collision. After two approach movements were presented in succession, participants pressed a button to indicate in which of the two the car would have collided with the viewpoint sooner after the moment it disappeared. The results demonstrated that most participants were sensitive to TTC information provided by a visual source, but not by an auditory source. That said, auditory information provided effective static distance information. When both sources of information were combined, participants based their judgments on the more accurate source.