Abstract:
Looking and seeing are separate processes that typically overlap in time. To examine their underlying neural bases, we recorded event-related potentials (ERPs) during a visual search task (Zhaoping and Guyader 2007) that enables us to separate the two processes. Observers searched for a uniquely oriented bar among many identically shaped bars in each of many images; each image was of type A-simple, B-simple, A, or B. In A-simple images, all bars are oriented 45 degrees clockwise or anticlockwise from vertical, and the target is uniquely oriented. A images are made from A-simple images by adding to each bar an intersecting horizontal or vertical bar, turning it into an 'X'. All the 'X's are identical in shape, differing only by rotation; this confuses normal observers and prolongs their response times to report the target. In this confusion, gaze often reaches the target during search but then abandons it to continue searching elsewhere. B images are made from A images by tilting the target bar to just 20 degrees from its intersecting horizontal/vertical bar, making the resulting 'X' distinctly thinner and eliminating the confusion. Removing all the horizontal/vertical bars from B images gives B-simple images. Observers' gaze movements and electroencephalography (EEG) were recorded. We aim to identify ERP components for looking and seeing, typically associated with peripheral and central vision respectively. These can be examined by, for example, contrasting the EEG waves between A and B images (since A and B share the same target saliency to guide looking but differ in the confusion caused by seeing), and by contrasting the EEG waves for targets near versus far from the fixation position at stimulus onset. Preliminary behavioral and EEG data suggest that our experimental design, with further data collection and analysis, should provide substantial information that we plan to report at the conference.
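The four image types above can be summarized as a minimal sketch of the bar orientations composing each search item. This is not the authors' stimulus code; the specific conventions (which way the target is tilted, horizontal rather than vertical intersecting bars) are illustrative assumptions, chosen only to reproduce the described geometry: a 45-degree 'X' everywhere in A images, and a distinctly thinner 20-degree target 'X' in B images.

```python
def make_item(image_type: str, is_target: bool) -> list:
    """Return bar orientations (degrees clockwise from vertical) for one item.

    Assumed conventions, for illustration only:
    - background oblique bars tilt +45 deg; the target tilts -45 deg
      in A-simple and A images (any unique orientation would do)
    - added intersecting bars are horizontal (90 deg); vertical (0 deg)
      would serve equally
    - in B and B-simple images the target bar is 20 deg from the
      horizontal, i.e. 70 deg from vertical, so the target 'X' in B
      images is distinctly thinner (20 deg between arms vs. 45 deg)
    """
    if image_type in ("A-simple", "A"):
        oblique = -45 if is_target else 45
    elif image_type in ("B-simple", "B"):
        oblique = 70 if is_target else 45
    else:
        raise ValueError(f"unknown image type: {image_type}")

    bars = [oblique]
    if image_type in ("A", "B"):
        bars.append(90)  # intersecting horizontal bar forms the 'X'
    return bars
```

Under these assumptions, `make_item("A", True)` and `make_item("A", False)` give 'X's with the same 45-degree arm separation (only their rotation differs), whereas `make_item("B", True)` gives a 20-degree 'X', matching the saliency-versus-confusion contrast described above.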