
Released

Journal Article

Beat that word: How listeners integrate beat gesture and focus in multimodal speech discourse

MPG Authors

Chu, Mingyuan
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society
University of Aberdeen


Ozyurek, Asli
Radboud University
Research Associates, MPI for Psycholinguistics, Max Planck Society
Multimodal Language and Cognition, Radboud University Nijmegen, External Organizations


Hagoort, Peter
Donders Institute for Brain, Cognition and Behaviour, External Organizations
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society

External Resources
No external resources are available.
Full Texts (restricted access)
No full texts are currently available for your IP range.
Full Texts (freely accessible)

Dimitrova_etal_JOCN_2016.pdf
(publisher version), 886 KB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Dimitrova, D. V., Chu, M., Wang, L., Ozyurek, A., & Hagoort, P. (2016). Beat that word: How listeners integrate beat gesture and focus in multimodal speech discourse. Journal of Cognitive Neuroscience, 28(9), 1255-1269. doi:10.1162/jocn_a_00963.


Citation link: https://hdl.handle.net/11858/00-001M-0000-002A-05FC-1
Abstract
Communication is facilitated when listeners allocate their attention to important information (focus) in the message, a process called "information structure." Linguistic cues like the preceding context and pitch accent help listeners to identify focused information. In multimodal communication, relevant information can also be emphasized by nonverbal cues like beat gestures, rhythmic nonmeaningful hand movements. Recent studies have found that linguistic and nonverbal attention cues are processed independently in single sentences. However, it is possible that these two cues interact when information is embedded in context, because context allows listeners to predict what information is important. In an ERP study, we tested this hypothesis and asked listeners to view videos capturing a dialogue. In the critical sentence, focused and nonfocused words were accompanied by beat gestures, grooming hand movements, or no gestures. ERP results showed that focused words are processed more attentively than nonfocused words, as reflected in N1 and P300 components. Hand movements also captured attention and elicited a P300 component. Importantly, beat gesture and focus interacted in a late time window of 600-900 msec relative to target word onset, giving rise to a late positivity when nonfocused words were accompanied by beat gestures. Our results show that listeners integrate beat gesture with the focus of the message and that integration costs arise when beat gesture falls on nonfocused information. This suggests that beat gestures fulfill a unique focusing function in multimodal discourse processing and that they have to be integrated with the information structure of the message.