Poster

Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color

MPS-Authors

Bannert, M.
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bartels, A.
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Bannert, M., & Bartels, A. (2018). Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Poster presented at 18th Annual Meeting of the Vision Sciences Society (VSS 2018), St. Pete Beach, FL, USA.


Cite as: https://hdl.handle.net/21.11116/0000-0001-7DE6-0
Abstract
Among the multitude of elements making up visual experience, color stands out in that it can specify both subjective experience and objective properties of the outside world. Whereas most neuroimaging research on human color vision has focused on external stimulation, the present study addressed this duality by investigating how externally elicited color vision is linked to subjective color experience induced by object imagery. We recorded fMRI activity while showing our participants abstract color stimuli that were either red, green, or yellow in half of the runs (“real-color runs”) and asked them to produce mental images of colored objects from the same three color categories in the remaining half (“imagery runs”). To ensure engagement in visual imagery, participants performed a 1-back same/different color judgment task on the imagined objects. Using multivariate pattern analysis (MVPA), we trained color classifiers to distinguish between fMRI responses to the three color stimuli and cross-validated them on data from real-color or imagery runs. Although real-color percepts could be predicted from all retinotopically mapped visual areas, only color decoders trained on hV4 responses could additionally predict the color category of an imagined object. This suggests that sensory-driven and self-induced colors share a common neural code in hV4. Using a hierarchical drift diffusion model, we furthermore demonstrated that decoding accuracy in hV4 predicted performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of perceived and imagined object color, together with the behavioral modeling evidence, identifies area hV4 as a “perceptual bridge” linking externally triggered color vision with color in self-generated object imagery.
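
For readers who want a concrete picture of the cross-decoding logic summarized above, the sketch below illustrates the general approach in Python with scikit-learn. It is not the authors' analysis code: the data shapes, the logistic-regression classifier, and all variable names are illustrative assumptions, and the random arrays stand in for trial-wise fMRI response patterns from a region of interest such as hV4.

# Minimal cross-decoding sketch (illustrative, not the authors' code):
# train a color classifier on voxel patterns from "real-color" runs and
# test it on patterns from "imagery" runs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300   # hypothetical trial and voxel counts

# Hypothetical voxel patterns: rows are trials, columns are voxels.
X_real = rng.standard_normal((n_trials, n_voxels))   # real-color runs
X_imag = rng.standard_normal((n_trials, n_voxels))   # imagery runs
y_real = rng.integers(0, 3, n_trials)                # 0=red, 1=green, 2=yellow
y_imag = rng.integers(0, 3, n_trials)

# Standardize voxel responses, then fit a linear classifier on
# sensory-driven (real-color) data only.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_real, y_real)

# Test on imagery data: with real data, above-chance accuracy here is
# the signature of a shared pattern code for perceived and imagined
# color; with this random data it will hover around chance (1/3).
print("imagery decoding accuracy:", clf.score(X_imag, y_imag))

The complementary within-condition analysis (training and testing on real-color runs, e.g. with leave-one-run-out cross-validation) corresponds to the prediction of real-color percepts from the retinotopically mapped visual areas mentioned in the abstract.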