
Released

Poster

The left prefrontal cortex controls information integration by combining bottom-up inputs and top-down predictions

MPS-Authors

Gau,  R
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Noppeney,  U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Gau, R., & Noppeney, U. (2013). The left prefrontal cortex controls information integration by combining bottom-up inputs and top-down predictions. Poster presented at 43rd Annual Meeting of the Society for Neuroscience (Neuroscience 2013), San Diego, CA, USA.


Cite as: https://hdl.handle.net/21.11116/0000-0001-4E27-D
Abstract
In the natural environment our senses are bombarded with many different signals. To form a coherent percept, the brain should integrate signals originating from a common source and segregate signals from different sources. This psychophysics-fMRI study investigated how the human brain combines bottom-up inputs (i.e. congruent vs. incongruent signals) and top-down prior predictions (i.e. a common-source prior) to infer whether sensory signals should be integrated or segregated. Sixteen participants were shown audio-visual movies of congruent (e.g. visual «Ti» with auditory /Ti/), incongruent (e.g. visual «Ti» with auditory /Pi/) and McGurk syllables (e.g. visual «Ki» with auditory /Pi/, which can be fused into the illusory percept “Ti”). Critically, we manipulated participants' top-down predictions (i.e. the common-source prior) by presenting the McGurk stimuli in a series of congruent or incongruent syllables. On each trial, participants reported their syllable percept in a forced-choice procedure with six response options. At the behavioural level, participants were more likely to fuse the auditory and visual signals of a McGurk trial into an illusory percept in congruent relative to incongruent contexts. This response profile indicates that participants' prior top-down predictions (i.e. the common-source prior) influence whether or not they integrate sensory signals into a coherent percept. At the neural level, incongruent relative to congruent bottom-up inputs increased activations in a widespread left-lateralised fronto-parietal network. The left prefrontal activations also increased for McGurk trials when participants selectively reported their auditory percept and did not fuse the auditory and visual McGurk signals into a unified percept. Critically, this effect was enhanced in incongruent contexts, when participants expected sensory signals to be incongruent and to require segregation.
Collectively, our results demonstrate that the left inferior frontal sulcus determines whether sensory signals should be integrated or segregated by combining (i) top-down predictions generated from prior incongruent trials with (ii) bottom-up information about sensory conflict in the incoming signals. Furthermore, it exerts top-down control that enables participants to process sensory signals independently and to selectively report their percept in a single sensory modality (here, audition).
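The abstract does not specify a computational model, but the decision it describes (combining a common-source prior with bottom-up evidence about signal conflict) is often formalized as Bayesian causal inference with Gaussian likelihoods. A minimal sketch, in which the function, all parameter values, and the zero-mean source prior are hypothetical illustrations rather than the authors' model:

```python
import math

def common_cause_posterior(xa, xv, sa, sv, sp, p_common):
    """Posterior probability that auditory (xa) and visual (xv) signals
    share a common cause, given sensory noise (sa, sv), a source prior
    width sp, and a common-source prior probability p_common.
    All values here are hypothetical; the poster does not report a model."""
    va, vv, vp = sa**2, sv**2, sp**2
    # Marginal likelihood under a common cause (C=1): both signals are
    # noisy readings of one latent source s ~ N(0, sp^2).
    d1 = va * vv + va * vp + vv * vp
    l1 = math.exp(-0.5 * ((xa - xv)**2 * vp + xa**2 * vv + xv**2 * va) / d1) \
         / (2 * math.pi * math.sqrt(d1))
    # Marginal likelihood under independent causes (C=2): each signal
    # arises from its own latent source.
    l2 = math.exp(-0.5 * (xa**2 / (va + vp) + xv**2 / (vv + vp))) \
         / (2 * math.pi * math.sqrt((va + vp) * (vv + vp)))
    return l1 * p_common / (l1 * p_common + l2 * (1 - p_common))

# Same conflicting (McGurk-like) signals, different top-down priors:
p_congruent_context = common_cause_posterior(1.0, -1.0, 1.0, 1.0, 2.0, p_common=0.8)
p_incongruent_context = common_cause_posterior(1.0, -1.0, 1.0, 1.0, 2.0, p_common=0.2)
```

Under this sketch, identical conflicting signals yield a higher posterior probability of a common cause (and hence more fusion) when the common-source prior is high, mirroring the behavioural finding that congruent contexts increased McGurk fusion.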