  Neural correlates of phonetic adaptation as induced by lexical and audiovisual context

Ullas, S., Hausfeld, L., Cutler, A., Eisner, F., & Formisano, E. (2020). Neural correlates of phonetic adaptation as induced by lexical and audiovisual context. Journal of Cognitive Neuroscience, 32(11), 2145-2158. doi:10.1162/jocn_a_01608.


Files

Ullas_etal_2020_Neural correlates of phonetic adaptation....pdf (Publisher version), 1023KB
Name: Ullas_etal_2020_Neural correlates of phonetic adaptation....pdf
Description: -
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Creators

Creators:
Ullas, Shruti (1), Author
Hausfeld, Lars (1), Author
Cutler, Anne (2, 3), Author
Eisner, Frank (4), Author
Formisano, Elia (1), Author

Affiliations:
1. Maastricht University, Maastricht, The Netherlands, ou_persistent22
2. Emeriti, MPI for Psycholinguistics, Max Planck Society, ou_2344699
3. MARCS Institute, Western Sydney University, Sydney, Australia, ou_persistent22
4. Radboud University Nijmegen, Nijmegen, The Netherlands, ou_persistent22

Content

Free keywords: -
 Abstract: When speech perception is difficult, one way listeners adjust is by reconfiguring phoneme category boundaries, drawing on contextual information. Both lexical knowledge and lipreading cues are used in this way, but it remains unknown whether these two differing forms of perceptual learning are similar at a neural level. This study compared phoneme boundary adjustments driven by lexical or audiovisual cues, using ultra-high-field 7-T fMRI. During imaging, participants heard exposure stimuli and test stimuli. Exposure stimuli for lexical retuning were audio recordings of words, and those for audiovisual recalibration were audio–video recordings of lip movements during utterances of pseudowords. Test stimuli were ambiguous phonetic strings presented without context, and listeners reported what phoneme they heard. Reports reflected phoneme biases in preceding exposure blocks (e.g., more reported /p/ after /p/-biased exposure). Analysis of corresponding brain responses indicated that both forms of cue use were associated with a network of activity across the temporal cortex, plus parietal, insula, and motor areas. Audiovisual recalibration also elicited significant occipital cortex activity despite the lack of visual stimuli. Activity levels in several ROIs also covaried with strength of audiovisual recalibration, with greater activity accompanying larger recalibration shifts. Similar activation patterns appeared for lexical retuning, but here, no significant ROIs were identified. Audiovisual and lexical forms of perceptual learning thus induce largely similar brain response patterns. However, audiovisual recalibration involves additional visual cortex contributions, suggesting that previously acquired visual information (on lip movements) is retrieved and deployed to disambiguate auditory perception.

Details

Language(s): eng - English
Dates: 2020-07-14, 2020-10
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1162/jocn_a_01608
 Degree: -

Source 1

Title: Journal of Cognitive Neuroscience
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: Cambridge, MA : MIT Press Journals
Pages: -
Volume / Issue: 32 (11)
Sequence Number: -
Start / End Page: 2145 - 2158
Identifier: ISSN: 0898-929X
CoNE: https://pure.mpg.de/cone/journals/resource/991042752752726