Free keywords:
-
Abstract:
Our brains are continuously confronted with the problem of how to interpret the sensory signals with which they are bombarded. For example, I can hear a bird and I can see a bird, but is it one bird singing on the branch, or are there two birds: one sitting on the branch and the other singing in the bush? How should the brain combine signals into a veridical percept of the environment without knowing whether they pertain to the same or different events? Combining Bayesian modelling with fMRI and EEG multivariate decoding, we investigated how the brain solves this so-called Causal Inference problem. We demonstrate that the human brain integrates sensory signals into spatial representations in line with Bayesian Causal Inference by simultaneously encoding multiple spatial estimates along the cortical hierarchy. Critically, only at the top of the hierarchy, in anterior intraparietal sulcus, is the uncertainty about the world’s causal structure taken into account: sensory signals are integrated, weighted by their bottom-up sensory reliability and top-down task relevance, into spatial priority maps, as predicted by Bayesian Causal Inference. Characterizing the computational operations of multisensory interactions in human neocortex reveals the hierarchical nature of multisensory perception.
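To make the computation concrete, the following is a minimal numerical sketch of Bayesian Causal Inference for one auditory and one visual spatial cue, in the style of the standard formulation (Körding et al., 2007). It is not the study's actual model: all parameter values, function names, and the choice of model averaging as the decision strategy are illustrative assumptions.

```python
import math

# Illustrative sketch (not the authors' code): Bayesian Causal Inference for
# one auditory (x_a) and one visual (x_v) spatial cue. All values hypothetical.

def fused_estimate(x_a, x_v, var_a, var_v, var_p):
    """Reliability-weighted estimate under the common-cause hypothesis,
    with a zero-mean Gaussian spatial prior of variance var_p."""
    w = [1 / var_a, 1 / var_v, 1 / var_p]   # precisions = reliabilities
    return (x_a * w[0] + x_v * w[1]) / sum(w)  # prior mean is 0

def p_common(x_a, x_v, var_a, var_v, var_p, prior_c=0.5):
    """Posterior probability that both signals share a single cause."""
    d1 = var_a * var_v + var_a * var_p + var_v * var_p
    like1 = math.exp(-0.5 * ((x_a - x_v) ** 2 * var_p
                             + x_a ** 2 * var_v
                             + x_v ** 2 * var_a) / d1) \
            / (2 * math.pi * math.sqrt(d1))
    like2 = math.exp(-0.5 * (x_a ** 2 / (var_a + var_p)
                             + x_v ** 2 / (var_v + var_p))) \
            / (2 * math.pi * math.sqrt((var_a + var_p) * (var_v + var_p)))
    return like1 * prior_c / (like1 * prior_c + like2 * (1 - prior_c))

# Model averaging: the final auditory estimate mixes the fused (common-cause)
# and segregated (independent-causes) estimates, weighted by the posterior
# probability of each causal structure.
x_a, x_v = 2.0, 3.0                      # noisy auditory / visual locations
var_a, var_v, var_p = 4.0, 1.0, 100.0    # audition less reliable than vision
pc = p_common(x_a, x_v, var_a, var_v, var_p)
seg_a = x_a * var_p / (var_p + var_a)    # auditory-only, prior-shrunk estimate
s_hat = pc * fused_estimate(x_a, x_v, var_a, var_v, var_p) + (1 - pc) * seg_a
```

With these hypothetical parameters the two cues are close together, so the posterior favours a common cause and the auditory estimate is pulled toward the more reliable visual cue, which is the signature behaviour the abstract attributes to anterior intraparietal sulcus.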