  Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition

Díaz, B., Blank, H., & von Kriegstein, K. (2018). Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition. NeuroImage, 178, 721-734. doi:10.1016/j.neuroimage.2018.05.032.

Authors:
Díaz, Begoña 1, 2, 3
Blank, Helen 2, 4
von Kriegstein, Katharina 2, 5

Affiliations:
1 Center for Brain and Cognition, University Pompeu Fabra, Barcelona, Spain
2 Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society, Leipzig, Germany
3 Department of Basic Sciences, Faculty of Medicine and Health Sciences, International University of Catalonia, Barcelona, Spain
4 University Medical Center Hamburg-Eppendorf, Germany
5 Faculty of Psychology, TU Dresden, Germany


Free keywords: Functional MRI; Lateral geniculate nucleus; Lipreading; Speech
 Abstract: The cerebral cortex modulates early sensory processing via feedback connections to sensory pathway nuclei. The functions of this top-down modulation for human behavior are poorly understood. Here, we show that top-down modulation of the visual sensory thalamus (the lateral geniculate body, LGN) is involved in visual-speech recognition. In two independent functional magnetic resonance imaging (fMRI) studies, LGN response increased when participants processed fast-varying features of articulatory movements required for visual-speech recognition, as compared to temporally more stable features required for face identification with the same stimulus material. The LGN response during the visual-speech task correlated positively with visual-speech recognition scores across participants. In addition, the task-dependent modulation was present for speech movements but did not occur for control conditions involving non-speech biological movements. In face-to-face communication, visual-speech recognition is used to enhance or even enable understanding of what is said. Speech recognition is commonly explained in frameworks focusing on cerebral cortex areas. Our findings suggest that task-dependent modulation at subcortical sensory stages plays an important role in communication: together with similar findings in the auditory modality, these results imply that task-dependent modulation of the sensory thalami is a general mechanism for optimizing speech recognition.


Language(s): eng - English
 Dates: 2018-04-12, 2017-09-21, 2018-05-12, 2018-05-14, 2018-09
 Publication Status: Published in print
 Rev. Type: Peer
 Identifiers: DOI: 10.1016/j.neuroimage.2018.05.032
PMID: 29772380
Other: Epub 2018

Project information

Project name : -
Grant ID : -
Funding program : Max Planck Research Group Grant
Funding organization : -
Project name : The tiny and the fast: The role of subcortical sensory structures in human communication / SENSOCOM
Grant ID : 647051
Funding program : Horizon 2020
Funding organization : European Commission (EC)
Project name : -
Grant ID : JCI-2012-12678
Funding program : Juan de la Cierva fellowship
Funding organization : -
Project name : People Programme (Marie Curie Actions)
Grant ID : 32867
Funding program : Seventh Framework Programme (FP7)
Funding organization : European Commission (EC)

Source 1

Title: NeuroImage
Source Genre: Journal
Publ. Info: Orlando, FL : Academic Press
Volume / Issue: 178 / -
Start / End Page: 721 - 734
Identifier: ISSN 1053-8119
CoNE: https://pure.mpg.de/cone/journals/resource/954922650166