
Released

Journal Article

Early decreases in alpha and gamma band power distinguish linguistic from visual information during spoken sentence comprehension

MPS-Authors

Oostenveld, Robert
Neurobiology of Language Group, MPI for Psycholinguistics, Max Planck Society;
Unification, MPI for Psycholinguistics, Max Planck Society;


Hagoort, Peter
Neurobiology of Language Group, MPI for Psycholinguistics, Max Planck Society;
FC Donders Centre for Cognitive Neuroimaging, external;
Unification, MPI for Psycholinguistics, Max Planck Society;

Fulltext (public)

Willems_2008_early.pdf
(Publisher version), 2MB

Citation

Willems, R. M., Oostenveld, R., & Hagoort, P. (2008). Early decreases in alpha and gamma band power distinguish linguistic from visual information during spoken sentence comprehension. Brain Research, 1219, 78-90. doi:10.1016/j.brainres.2008.04.065.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-1F46-5
Abstract
Language is often perceived together with visual information. This raises the question of how the brain integrates information conveyed in visual and/or linguistic format during spoken language comprehension. In this study we investigated the dynamics of semantic integration of visual and linguistic information by means of time-frequency analysis of the EEG signal. A modified version of the N400 paradigm was employed, in which either a word or a picture of an object was semantically incongruous with respect to the preceding sentence context. Event-Related Potential (ERP) analysis showed qualitatively similar N400 effects for integration of either word or picture. Time-frequency analysis revealed early specific decreases in alpha and gamma band power for linguistic and visual information, respectively. We argue that these reflect a rapid context-based analysis of acoustic (word) or visual (picture) form information. We conclude that although full semantic integration of linguistic and visual information occurs through a common mechanism, early differences in oscillations in specific frequency bands reflect the format of the incoming information and, importantly, an early context-based detection of its congruity with respect to the preceding language context.
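The band-power measure at the heart of the abstract's time-frequency analysis can be illustrated with a minimal sketch: compute a short-time spectrogram of a signal and average power within the alpha (roughly 8-12 Hz) and gamma (roughly 30-70 Hz) bands. This uses SciPy on a synthetic signal; the sampling rate, band limits, and signal are illustrative assumptions, not the study's actual EEG data or analysis pipeline.

```python
import numpy as np
from scipy import signal

fs = 500  # sampling rate in Hz (assumed for this synthetic example)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic "EEG": a 10 Hz (alpha-band) component, a weaker 40 Hz
# (gamma-band) component, and Gaussian noise.
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * 10 * t)
       + 0.5 * np.sin(2 * np.pi * 40 * t)
       + 0.2 * rng.standard_normal(t.size))

# Short-time Fourier spectrogram: power as a function of frequency and time.
freqs, times, Sxx = signal.spectrogram(eeg, fs=fs, nperseg=256, noverlap=192)

def band_power(freqs, Sxx, lo, hi):
    """Mean spectrogram power within [lo, hi] Hz, one value per time bin."""
    mask = (freqs >= lo) & (freqs <= hi)
    return Sxx[mask].mean(axis=0)

alpha = band_power(freqs, Sxx, 8, 12)    # assumed alpha band limits
gamma = band_power(freqs, Sxx, 30, 70)   # assumed gamma band limits

# The dominant 10 Hz component yields higher mean alpha-band power here.
print(alpha.mean() > gamma.mean())
```

In the study itself, such band-power time courses would be compared between conditions (incongruous word vs. incongruous picture) to reveal the early alpha and gamma decreases the abstract describes.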