Record


Released

Journal Article

Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition

MPG Authors

Díaz, Begoña
Center for Brain and Cognition, University Pompeu Fabra, Barcelona, Spain;
Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Department of Basic Sciences, Faculty of Medicine and Health Sciences, International University of Catalonia, Barcelona, Spain;


Blank, Helen
Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
University Medical Center Hamburg-Eppendorf, Germany;


von Kriegstein, Katharina
Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Faculty of Psychology, TU Dresden, Germany;

External Resources
No external resources are available
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)

Díaz_2018.pdf
(Preprint), 2MB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available
Citation

Díaz, B., Blank, H., & von Kriegstein, K. (2018). Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition. NeuroImage, 178, 721-734. doi:10.1016/j.neuroimage.2018.05.032.


Citation link: https://hdl.handle.net/21.11116/0000-0001-AB5B-9
Abstract
The cerebral cortex modulates early sensory processing via feedback connections to sensory pathway nuclei. The functions of this top-down modulation for human behavior are poorly understood. Here, we show that top-down modulation of the visual sensory thalamus (the lateral geniculate body, LGN) is involved in visual-speech recognition. In two independent functional magnetic resonance imaging (fMRI) studies, the LGN response increased when participants processed the fast-varying features of articulatory movements required for visual-speech recognition, as compared to temporally more stable features required for face identification with the same stimulus material. The LGN response during the visual-speech task correlated positively with visual-speech recognition scores across participants. In addition, the task-dependent modulation was present for speech movements but did not occur for control conditions involving non-speech biological movements. In face-to-face communication, visual-speech recognition is used to enhance or even enable understanding of what is said. Speech recognition is commonly explained in frameworks that focus on cerebral cortex areas. Our findings suggest that task-dependent modulation at subcortical sensory stages plays an important role in communication: together with similar findings in the auditory modality, they imply that task-dependent modulation of the sensory thalami is a general mechanism for optimizing speech recognition.