  Interleaved lexical and audiovisual information can retune phoneme boundaries

Ullas, S., Formisano, E., Eisner, F., & Cutler, A. (2020). Interleaved lexical and audiovisual information can retune phoneme boundaries. Attention, Perception & Psychophysics, 82, 2018-2026. doi:10.3758/s13414-019-01961-8.

Files

Name: Ullas2020_Article_InterleavedLexicalAndAudiovisu.pdf (Publisher version), 748 KB
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]

Creators

Ullas, Shruti (1), Author
Formisano, Elia (1), Author
Eisner, Frank (2), Author
Cutler, Anne (3, 4), Author
Affiliations:
(1) Maastricht University, Maastricht, The Netherlands, ou_persistent22
(2) Donders Institute for Brain, Cognition and Behaviour, External Organizations, ou_55236
(3) Emeriti, MPI for Psycholinguistics, Max Planck Society, ou_2344699
(4) MARCS Institute and ARC Centre of Excellence for the Dynamics of Language, Western Sydney University, Penrith, Australia, ou_persistent22

Content

Free keywords: -
Abstract: To adapt to situations in which speech perception is difficult, listeners can adjust boundaries between phoneme categories using perceptual learning. Such adjustments can draw on lexical information in surrounding speech, or on visual cues via speech-reading. In the present study, listeners proved able to flexibly adjust the boundary between two plosive/stop consonants, /p/-/t/, using both lexical and speech-reading information, given the same experimental design for both cue types. Videos of a speaker pronouncing pseudo-words and audio recordings of Dutch words were presented in alternating blocks of either stimulus type. Listeners were able to switch between cues to adjust phoneme boundaries, and the resulting effects were comparable to results from listeners receiving only a single source of information. Overall, audiovisual cues (i.e., the videos) produced the stronger effects, commensurate with their applicability for adapting to noisy environments. Lexical cues were able to induce effects with fewer exposure stimuli and a changing phoneme bias, in a design unlike most prior studies of lexical retuning. While lexical retuning effects were relatively weaker than audiovisual recalibration, this discrepancy could reflect how lexical retuning may be more suitable for adapting to speakers than to environments. Nonetheless, the presence of the lexical retuning effects suggests that lexical retuning can be invoked at a faster rate than previously seen. In general, this technique has further illuminated the robustness of adaptability in speech perception, and offers the potential to enable further comparisons across differing forms of perceptual learning.

Details

Language(s): eng - English
Dates: 2020-01-22, 2020
Publication Status: Issued
Rev. Type: Peer
Identifiers: DOI: 10.3758/s13414-019-01961-8

Source 1

Title: Attention, Perception & Psychophysics
Abbreviation: Atten Percept Psychophys
Source Genre: Journal
Publ. Info: Psychonomic Society
Volume: 82
Start / End Page: 2018 - 2026
Identifier: ISSN: 1943-3921
CoNE: https://pure.mpg.de/cone/journals/resource/1943-3921