Poster

Boundary extension as a function of viewpoint in a virtual scene

MPS-Authors

Thornton, IM
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Christou, C., & Thornton, I. (2002). Boundary extension as a function of viewpoint in a virtual scene. Poster presented at Second Annual Meeting of the Vision Sciences Society (VSS 2002), Sarasota, FL, USA.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DE71-9
Abstract
Subjectively, human perception of complex scenes appears fast and accurate, but recent demonstrations of change blindness show that scene perception is not as complete as we think. Memory distortions have also been reported, suggesting that the apparent richness of our experience rests on perceptual schemas that allow fast perception of a scene without everything in it having to be accurately encoded. One such demonstration is Boundary Extension (BE; Intraub & Richardson, 1989; JEP:LMC, Vol. 15, pp. 179–187), in which observers appear to remember a greater expanse of a scene than was actually shown. For instance, if they are shown a close-up photograph of a child sitting on the stairs, they will later identify a wider-angle view of the scene as the original, suggesting the use of top-down extrapolation. The wider field of view apparent in BE is also consistent with an overestimation of the viewing distance to the main subject of a scene. In our experiments we use interactive computer graphics to determine the origins of BE in 3D scenes. Subjects fixate one of 12 computer-generated 3D representations of common objects that can be situated within the richly decorated setting of a 3D virtual room. Initial 1 s presentations depict these objects from one of three viewing distances (close-up, middle-distance, and long-distance). After a 5 s retention interval, subjects are allowed to recreate their original view of the objects using a 3D joystick. Analysis of the settings suggests that a perceptual schema explanation is too restrictive. Subjects' recreated views are a function of the original simulated viewing distance, with underestimation when the original view is long-distance and overestimation (BE) when it is close-up. Furthermore, results are affected to differing degrees depending on whether the objects are initially viewed within the 3D setting and whether the setting is visible during subjects' recreation of the original view.