Item Details

  Multi-modal language comprehension as a joint activity: The influence of eye gaze on the processing of speech and co-speech gesture in multi-party communication

Holler, J., Schubotz, L., Kelly, S., Hagoort, P., & Ozyurek, A. (2013). Multi-modal language comprehension as a joint activity: The influence of eye gaze on the processing of speech and co-speech gesture in multi-party communication. Talk presented at the 5th Joint Action Meeting. Berlin. 2013-07-26 - 2013-07-29.

Basic Information

Genre: Talk


Creators

Creators:
Holler, Judith1, 2, Author
Schubotz, Louise, Author
Kelly, Spencer, Author
Hagoort, Peter3, 4, Author
Ozyurek, Asli5, Author
Affiliations:
1Language and Cognition Department, MPI for Psycholinguistics, Max Planck Society, ou_792548              
2INTERACT, MPI for Psycholinguistics, Max Planck Society, Wundtlaan 1, 6525 XD Nijmegen, NL, ou_1863331              
3Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society, ou_792551              
4Donders Institute for Brain, Cognition and Behaviour, External Organizations, ou_55236              
5Center for Language Studies, External organization, ou_55238              

Description

Keywords: -
Abstract: Traditionally, language comprehension has been studied as a solitary and unimodal activity. Here, we investigate language comprehension as a joint activity, i.e., in a dynamic social context involving multiple participants in different roles with different perspectives, while taking into account the multimodal nature of face-to-face communication. We simulated a triadic communication context involving a speaker alternating her gaze between two different recipients, conveying information not only via speech but gesture as well. Participants thus viewed video-recorded speech-only or speech+gesture utterances referencing objects (e.g., “he likes the laptop” + TYPING ON LAPTOP gesture) when being addressed (direct gaze) or unaddressed (averted gaze). The video clips were followed by two object images (laptop, towel). Participants’ task was to choose the object that matched the speaker’s message (i.e., laptop). Unaddressed recipients responded significantly slower than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped them up to levels identical to those of addressees. Thus, when speech processing suffers due to being unaddressed, gestures become more prominent and boost comprehension of a speaker’s spoken message. Our findings illuminate how participants process multimodal language and how this process is influenced by eye gaze, an important social cue facilitating coordination in the joint activity of conversation.

Details

Language: eng - English
Date: 2013
Publication status: Unknown
Pages: -
Publishing info: -
Table of Contents: -
Review: -
Identifiers (DOI, ISBN, etc.): -
Degree: -

Related Event

Event name: the 5th Joint Action Meeting
Place of Event: Berlin
Start/End Date: 2013-07-26 - 2013-07-29
