  Speech comprehension aided by multiple modalities: Behavioural and neural interactions

McGettigan, C., Faulkner, A., Altarelli, I., Obleser, J., Baverstock, H., & Scott, S. K. (2012). Speech comprehension aided by multiple modalities: Behavioural and neural interactions. Neuropsychologia, 50(5), 762-776. doi:10.1016/j.neuropsychologia.2012.01.010.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-000F-0253-6
Version Permalink: http://hdl.handle.net/21.11116/0000-0004-5C20-1
Genre: Journal Article

Files

McGettigan_2012_Speech.pdf (Publisher version), 2MB
Name: McGettigan_2012_Speech.pdf
Visibility: Private
MIME-Type: application/pdf


Creators

McGettigan, Carolyn (1), Author
Faulkner, Andrew (2), Author
Altarelli, Irene (2, 3), Author
Obleser, Jonas (4), Author
Baverstock, Harriet (5), Author
Scott, Sophie K. (1), Author
Affiliations:
(1) Institute of Cognitive Neuroscience, University College London, United Kingdom
(2) Department of Speech, Hearing & Phonetic Sciences, University College London, United Kingdom
(3) Laboratoire de Sciences Cognitives et Psycholinguistique, École normale supérieure, Paris, France
(4) Max Planck Research Group Auditory Cognition, MPI for Human Cognitive and Brain Sciences, Max Planck Society
(5) School of Psychological Sciences, University of Manchester, United Kingdom

Content

Free keywords: Speech; fMRI; Auditory cortex; Individual differences
 Abstract: Speech comprehension is a complex human skill, the performance of which requires the perceiver to combine information from several sources – e.g. voice, face, gesture, linguistic context – to achieve an intelligible and interpretable percept. We describe a functional imaging investigation of how auditory, visual and linguistic information interact to facilitate comprehension. Our specific aims were to investigate the neural responses to these different information sources, alone and in interaction, and further to use behavioural speech comprehension scores to address sites of intelligibility-related activation in multifactorial speech comprehension. In fMRI, participants passively watched videos of spoken sentences, in which we varied Auditory Clarity (with noise-vocoding), Visual Clarity (with Gaussian blurring) and Linguistic Predictability. Main effects of enhanced signal with increased auditory and visual clarity were observed in overlapping regions of posterior STS. Two-way interactions of the factors (auditory × visual, auditory × predictability) in the neural data were observed outside temporal cortex, where positive signal change in response to clearer facial information and greater semantic predictability was greatest at intermediate levels of auditory clarity. Overall changes in stimulus intelligibility by condition (as determined using an independent behavioural experiment) were reflected in the neural data by increased activation predominantly in bilateral dorsolateral temporal cortex, as well as inferior frontal cortex and left fusiform gyrus. Specific investigation of intelligibility changes at intermediate auditory clarity revealed a set of regions, including posterior STS and fusiform gyrus, showing enhanced responses to both visual and linguistic information. 
Finally, an individual differences analysis showed that greater comprehension performance in the scanning participants (measured in a post-scan behavioural test) was associated with increased activation in left inferior frontal gyrus and left posterior STS. The current multimodal speech comprehension paradigm demonstrates recruitment of a wide comprehension network in the brain, in which posterior STS and fusiform gyrus form sites for convergence of auditory, visual and linguistic information, while left-dominant sites in temporal and frontal cortex support successful comprehension.

Details

Language(s): eng - English
 Dates: 2012-01-08, 2012-01-16, 2012-04
 Publication Status: Published in print
 Rev. Method: Peer
 Identifiers: DOI: 10.1016/j.neuropsychologia.2012.01.010
PMID: 22266262
PMC: PMC4050300

Source 1

Title: Neuropsychologia
Source Genre: Journal
Volume / Issue: 50 (5)
Start / End Page: 762 - 776
Identifier: ISSN: 0028-3932
CoNE: https://pure.mpg.de/cone/journals/resource/954925428258