  Speech sound categorization: The contribution of non-auditory and auditory cortical regions

Preisig, B., Riecke, L., & Hervais-Adelman, A. (2022). Speech sound categorization: The contribution of non-auditory and auditory cortical regions. NeuroImage, 258: 119375. doi:10.1016/j.neuroimage.2022.119375.


Files

Preisig_etal_2022suppl_Speech sound categorization.docx (Supplementary material), 193KB
Name: figures and table
Description: -
OA-Status:
Visibility: Public
MIME-Type / Checksum: application/vnd.openxmlformats-officedocument.wordprocessingml.document / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -
Preisig_etal_2022_Speech sound categorization.pdf (Publisher version), 2MB
Name: Preisig_etal_2022_Speech sound categorization.pdf
Description: -
OA-Status:
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: 2022
Copyright Info: © 2022 The Authors. Published by Elsevier Inc. This article is available under the Creative Commons CC-BY-NC-ND license and permits non-commercial use of the work as published, without adaptation or alteration, provided the work is fully attributed.

Creators

Creators:
Preisig, Basil (1, 2, 3), Author
Riecke, Lars (4), Author
Hervais-Adelman, Alexis (3), Author
Affiliations:
1. Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society, ou_792551
2. Donders Institute for Brain, Cognition and Behaviour, External Organizations, ou_55236
3. University of Zurich, Zurich, Switzerland, ou_persistent22
4. Maastricht University, Maastricht, The Netherlands, ou_persistent22

Content

Free keywords: -
Abstract: Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and to identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners’ syllable report irrespective of stimulus acoustics. Most of these regions lie outside what are traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
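
The decoding approach described in the abstract (searchlight multi-voxel pattern analysis of trial-wise BOLD patterns, classifying the listener's syllable report) can be illustrated with a minimal sketch. This is not the authors' pipeline: the nilearn/scikit-learn tooling, file names, searchlight radius, and cross-validation scheme below are assumptions chosen for illustration only.

```python
# Minimal searchlight-decoding sketch (not the authors' pipeline; file names,
# radius, and cross-validation scheme are illustrative assumptions).
import numpy as np
from nilearn.decoding import SearchLight
from nilearn.image import load_img
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import LinearSVC

# Hypothetical inputs: a 4D image of trial-wise BOLD estimates, a brain mask,
# the listener's syllable report per trial (/da/ vs. /ga/), and run labels.
bold_imgs = load_img("sub-01_trialwise_betas.nii.gz")   # assumed file name
mask_img = load_img("sub-01_brain_mask.nii.gz")         # assumed file name
syllable_report = np.loadtxt("sub-01_reports.txt")      # 0 = /da/, 1 = /ga/
runs = np.loadtxt("sub-01_runs.txt")                    # fMRI run per trial

# Searchlight: at every voxel, train and test a classifier on the local sphere
# of activity patterns and store its cross-validated accuracy.
sl = SearchLight(
    mask_img,
    radius=6.0,                 # sphere radius in mm (assumption)
    estimator=LinearSVC(),
    cv=LeaveOneGroupOut(),      # leave one run out per fold
    scoring="accuracy",
    n_jobs=-1,
)
sl.fit(bold_imgs, syllable_report, groups=runs)

# 3D map of decoding accuracies; above-chance voxels mark candidate regions
# whose local patterns distinguish the reported syllable.
accuracy_map = sl.scores_
```

Running such a scheme once with trials labelled by syllable report and once with trials labelled by the disambiguating acoustic feature is one way to realise the comparison the abstract describes for separating perceptual from acoustic stimulus representations.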

Details

Language(s): eng - English
 Dates: 2022-06-11, 2022
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1016/j.neuroimage.2022.119375
 Degree: -

Source 1

Title: NeuroImage
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: 258
Sequence Number: 119375
Start / End Page: -
Identifier: ISSN: 1053-8119
CoNE: https://pure.mpg.de/cone/journals/resource/954922650166