
Released

Poster

Decoding the direction of implied motion in human early visual cortex

MPS-Authors

Altan, G. (/persons/resource/persons215763)
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bartels, A. (/persons/resource/persons83797)
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society
Citation

Altan, G., & Bartels, A. (2018). Decoding the direction of implied motion in human early visual cortex. Poster presented at 48th Annual Meeting of the Society for Neuroscience (Neuroscience 2018), San Diego, CA, USA.


Cite as: https://hdl.handle.net/21.11116/0000-0002-6326-4
Abstract
Implied motion perception is a striking case of our capacity to infer motion features from static pictures that imply movement. At a higher, cognitive level, the mere configuration of an object (such as a snapshot of a walking human) can imply motion in a directional way. Previous studies have shown that implied motion processing recruits direction-selective neurons and activates cortical motion processing regions. However, it is unknown whether object-processing regions or early visual regions are involved in implied motion processing. In the present study, we used fMRI and multivariate pattern classification to examine which human brain regions differentiate implicit direction information in static images of implied motion. We hence examined BOLD activity patterns within independently defined early visual (V1-V3), motion (V5+/MT+) and object-processing (LO1, LO2) regions while participants viewed still images with directional implied motion (rightward vs. leftward). The stimuli contained both animate (birds) and inanimate (airplanes, cars) objects as sources of implied motion. The objects were presented at the center of the visual field on a horizontally blurred background in the periphery. We found that response patterns in visual areas V2, V3, the human motion complex V5+/MT+, and the object-responsive region LO2 coded for the direction of the implied motion stimuli significantly better than chance. Decoding in visual areas V1 and LO1 was at chance level. We then examined decoding in retinotopically defined foveal and peripheral representations of V1-V3. Only the foveal representation was stimulated by the foreground objects; the periphery was stimulated only by the blurred background. We found that peripheral V1-V3 allowed decoding of implied motion directions, while foveal representations did not. Hence, high-level information about implied motion directionality is represented in peripheral V1-V3, i.e., regions that were never given this information through bottom-up stimulation.
This suggests that higher-level cognitive processes (potentially based in LO2, V5+/MT+) detect implied motion direction based on object configuration and feed it back to cover the peripheral context in early visual cortex, potentially encoding expected background-motion. The results provide direct evidence for information in early visual cortex originating from feedback, compatible with predictive coding theory.
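The decoding approach described above — training a multivariate classifier on voxel response patterns within an ROI and testing whether implied motion direction (leftward vs. rightward) can be predicted above chance — can be sketched as below. This is a minimal illustrative sketch, not the authors' actual pipeline: the classifier choice (linear SVM), leave-one-run-out cross-validation scheme, and all data dimensions (`n_runs`, `n_voxels`, simulated patterns) are assumptions for demonstration.

```python
# Illustrative sketch of ROI-based MVPA decoding with leave-one-run-out
# cross-validation. All numbers and the simulated data are assumptions,
# not the study's real data or parameters.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_runs, trials_per_run, n_voxels = 8, 20, 200
# Simulated voxel patterns: one row per trial, one column per voxel.
X = rng.standard_normal((n_runs * trials_per_run, n_voxels))
y = np.tile([0, 1], n_runs * trials_per_run // 2)    # 0 = leftward, 1 = rightward
runs = np.repeat(np.arange(n_runs), trials_per_run)  # run labels used as CV groups

# Inject a weak direction-dependent signal into a subset of voxels,
# standing in for genuine direction information in an ROI.
X[y == 1, :20] += 0.5

# Train on all runs but one, test on the held-out run, repeat over runs.
clf = LinearSVC(C=1.0)
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

Leave-one-run-out cross-validation keeps training and test trials in separate scanner runs, which avoids temporal leakage between them; an ROI is then said to "code" direction when the mean cross-validated accuracy exceeds the 50% chance level, as reported here for V2, V3, V5+/MT+, LO2, and peripheral V1-V3.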