
Item Details


Released

Journal Article

Social eye gaze modulates processing of speech and co-speech gesture

MPS-Authors
/persons/resource/persons4512

Holler,  Judith
Language and Cognition Department, MPI for Psycholinguistics, Max Planck Society;
INTERACT, MPI for Psycholinguistics, Max Planck Society;
Communication in Social Interaction, Radboud University Nijmegen, External Organizations;

/persons/resource/persons71746

Schubotz,  Louise
International Max Planck Research School for Language Sciences, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL;
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;

/persons/resource/persons69

Hagoort,  Peter
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;

/persons/resource/persons81089

Schuetze,  Manuela
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;
Center for Language Studies, External Organizations;

/persons/resource/persons142

Ozyurek,  Asli
Center for Language Studies, External Organizations;
Research Associates, MPI for Psycholinguistics, Max Planck Society;
Multimodal Language and Cognition, Radboud University Nijmegen, External Organizations;

Fulltext (public)

Holler et al_2014_social gaze.pdf
(publisher version), 568KB

Supplementary Material (public)
There is no public supplementary material available.
Citation

Holler, J., Schubotz, L., Kelly, S., Hagoort, P., Schuetze, M., & Ozyurek, A. (2014). Social eye gaze modulates processing of speech and co-speech gesture. Cognition, 133, 692-697. doi:10.1016/j.cognition.2014.08.008.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0023-BFCB-A
Abstract
In human face-to-face communication, language comprehension is a multi-modal, situated activity. However, little is known about how we combine information from different modalities during comprehension, and how perceived communicative intentions, often signaled through visual signals, influence this process. We explored this question by simulating a multi-party communication context in which a speaker alternated her gaze between two recipients. Participants viewed speech-only or speech + gesture object-related messages when being addressed (direct gaze) or unaddressed (gaze averted to other participant). They were then asked to choose which of two object images matched the speaker’s preceding message. Unaddressed recipients responded significantly more slowly than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped unaddressed recipients up to a level identical to that of addressees. That is, when unaddressed recipients’ speech processing suffers, gestures can enhance the comprehension of a speaker’s message. We discuss our findings with respect to two hypotheses attempting to account for how social eye gaze may modulate multi-modal language comprehension.