
Released

Poster

Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color

MPG Authors
Bannert, M
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bartels, A
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe.
Supplementary material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Bannert, M., & Bartels, A. (2018). Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Poster presented at 18th Annual Meeting of the Vision Sciences Society (VSS 2018), St. Pete Beach, FL, USA.


Citation link: https://hdl.handle.net/21.11116/0000-0001-7DE6-0
Abstract
Among the multitude of elements making up visual experience, color stands out in that it can specify both subjective experience and objective properties of the outside world. Whereas most neuroimaging research on human color vision has focused on external stimulation, the present study addressed this duality by investigating how externally elicited color vision is linked to subjective color experience induced by object imagery. We recorded fMRI activity while showing our participants abstract color stimuli that were either red, green, or yellow in half of the runs (“real-color runs”) and asked them to produce mental images of colored objects corresponding to the same three categories in the remaining half (“imagery runs”). To make sure that participants were engaged in visual imagery, they performed a 1-back same/different color judgment task on the imagined objects. We trained color classifiers using MVPA to distinguish between fMRI responses to the three color stimuli and cross-validated them on data from real-color or imagery runs. Although real-color percepts could be predicted from all retinotopically mapped visual areas, only color decoders trained on hV4 responses could additionally predict the color category of an object that was being imagined. This suggests that sensory-driven and self-induced colors share a common neural code in hV4. Using a hierarchical drift diffusion model, we furthermore demonstrated that the decoding accuracy in hV4 was predictive of performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of perceived and imagined object color, in combination with the behavioral modeling evidence, hence identifies area hV4 as a “perceptual bridge” linking externally triggered color vision with color in self-generated object imagery.
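The cross-decoding logic described in the abstract (train color classifiers on "real-color" runs, test on "imagery" runs) can be illustrated with a minimal sketch. This is not the authors' analysis code: the data below are synthetic stand-ins for fMRI voxel patterns, the classifier choice (logistic regression) and all parameters are assumptions, and a real MVPA pipeline would add run-wise cross-validation and permutation testing.

```python
# Hypothetical sketch of cross-decoding between perception and imagery.
# Synthetic "voxel" patterns stand in for fMRI responses; the shared
# per-color prototype mimics a common neural code across conditions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_voxels = 50
colors = [0, 1, 2]  # red, green, yellow

# One mean pattern per color, shared by both conditions (assumption).
prototypes = rng.normal(size=(3, n_voxels))

def simulate_runs(n_trials_per_color, noise):
    """Draw noisy trials around each color prototype."""
    X, y = [], []
    for c in colors:
        X.append(prototypes[c]
                 + noise * rng.normal(size=(n_trials_per_color, n_voxels)))
        y += [c] * n_trials_per_color
    return np.vstack(X), np.array(y)

X_real, y_real = simulate_runs(40, noise=1.0)  # "real-color" runs
X_imag, y_imag = simulate_runs(40, noise=2.0)  # noisier "imagery" runs

# Train on externally driven color responses, test on imagery responses.
clf = LogisticRegression(max_iter=1000).fit(X_real, y_real)
acc = accuracy_score(y_imag, clf.predict(X_imag))
print(f"cross-decoding accuracy: {acc:.2f} (chance = 0.33)")
```

Above-chance accuracy on the held-out imagery trials is the signature the study reports for hV4: a classifier that has only ever seen stimulus-driven color responses can still read out the imagined color category.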