

Poster

Examining the neural bases of looking and seeing in visual search using event-related potentials

MPS-Authors

Liang, J, Department of Sensory and Sensorimotor Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society

Zhaoping, L, Department of Sensory and Sensorimotor Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society
Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Liang, J., & Zhaoping, L. (2024). Examining the neural bases of looking and seeing in visual search using event-related potentials. Poster presented at 46th European Conference on Visual Perception (ECVP 2024), Aberdeen, UK.


Cite as: https://hdl.handle.net/21.11116/0000-000F-C13B-5
Abstract
Looking and seeing are separate processes that typically overlap in time. To examine their underlying neural bases, we use event-related potentials (ERPs) in a visual search task (Zhaoping and Guyader 2007) that enables us to separate the two processes. Observers searched for a uniquely oriented bar among many identically shaped bars in each of many images, each of which could be of type A-simple, B-simple, A, or B. In A-simple images, all bars are oriented 45 degrees clockwise or anti-clockwise from vertical; the target is uniquely oriented. A images are made from A-simple images by adding to each original bar an intersecting horizontal or vertical bar to form an 'X'. All the 'X's are identical to one another apart from rotation, which confuses normal observers and prolongs their response times to report the target. In this confusion, gaze position during search often reaches, but then abandons, the target to continue searching elsewhere. B images are made from A images by tilting the target bar so that its orientation is just 20 degrees from that of the intersecting horizontal/vertical bar, making the resulting 'X' distinctly thinner and eliminating the confusion. Removing all the horizontal/vertical bars from B images gives B-simple images. Observers' gaze movements and electroencephalography (EEG) were recorded. We aim to identify ERP components for looking and seeing, typically associated with peripheral and central vision, respectively. These can be examined by, for example, contrasting the EEG waves between A and B images (since A and B share the same target saliency effects that guide looking but differ in the confusion caused by seeing), and by contrasting the EEG waves for targets near versus far from the initial fixation location before stimulus onset. Preliminary behavioral and EEG data suggest that our experimental design, with further data collection and analysis, should provide substantial information that we plan to report at the conference.
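
The four stimulus types can be summarized by the per-item bar orientations alone. Below is a minimal, hypothetical Python sketch of that construction as we read it from the abstract; the grid size, random target placement, and function name are our assumptions, not details taken from the poster.

```python
import numpy as np

def make_stimulus(image_type, n_rows=10, n_cols=10, rng=None):
    """Return bar orientations (degrees from vertical) for one search
    image of type 'A-simple', 'A', 'B', or 'B-simple'.

    Hypothetical illustration only: grid size and target placement
    are assumptions, not the authors' actual stimulus parameters.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Background bars all share one oblique tilt (+/-45 deg from
    # vertical); the target bar has the opposite tilt, making it
    # uniquely oriented (A-simple and A images).
    bg_tilt = rng.choice([-45.0, 45.0])
    bars = np.full((n_rows, n_cols), bg_tilt)
    target = (rng.integers(n_rows), rng.integers(n_cols))
    bars[target] = -bg_tilt

    # Intersecting horizontal (90 deg from vertical) or vertical
    # (0 deg) bars that turn every item into an 'X' in A and B images.
    cross = rng.choice([0.0, 90.0], size=(n_rows, n_cols))

    if image_type in ('B', 'B-simple'):
        # In B images the target bar lies only 20 deg from its
        # intersecting bar, making the target 'X' distinctly thinner;
        # B-simple keeps this target orientation but drops all crosses.
        bars[target] = cross[target] + rng.choice([-20.0, 20.0])

    if image_type in ('A-simple', 'B-simple'):
        cross = None  # simple images contain single bars only

    return bars, cross, target
```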
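
For the A-versus-B contrast mentioned above, the standard computation is a difference wave between condition-averaged ERPs. The sketch below uses MNE-Python for illustration; the epochs file name and the event labels 'A' and 'B' are placeholders we invented, not the authors' pipeline.

```python
import mne

# Load epoched EEG data (hypothetical file name), locked to stimulus
# onset and labeled by image type.
epochs = mne.read_epochs('search_task-epo.fif')

# Condition-averaged ERPs for A and B images.
evoked_A = epochs['A'].average()
evoked_B = epochs['B'].average()

# A - B difference wave: A and B share the same target saliency that
# guides looking, so this contrast should isolate activity related to
# the confusion in seeing.
diff = mne.combine_evoked([evoked_A, evoked_B], weights=[1, -1])
diff.plot_joint()
```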