
Record


Released

Poster

Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses

MPG Authors
/persons/resource/persons83933

Giani, A
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84016

Erick O, Belardinelli P, Kleiner, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84112

Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External resources
There are no external resources available
Full texts (restricted access)
There are currently no full texts shared for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
There are no freely accessible supplementary materials available
Citation

Giani, A., Erick O, Belardinelli P, Kleiner, M., Preissl, H., & Noppeney, U. (2011). Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses. Poster presented at 12th Conference of Junior Neuroscientists of Tübingen (NeNA 2011), Heiligkreuztal, Germany.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-B9E6-7
Abstract
To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) to periodic auditory and/or visual inputs. The 3 × 3 factorial design manipulated (1) modality (auditory only, visual only and audiovisual) and (2) temporal dynamics (static, dynamic1 and dynamic2). In the static conditions, subjects were presented with (1) visual gratings, luminance modulated at 6 Hz, and/or (2) pure tones, frequency modulated at 40 Hz. To manipulate perceptual synchrony, we imposed additional slow modulations on the auditory and visual stimuli either at the same frequency (0.2 Hz = synchronous) or at different frequencies (0.2 Hz vs. 0.7 Hz = asynchronous). In the dynamic conditions, this also enabled us to investigate the integration of two dynamic features within one sensory modality (e.g. a pure tone frequency modulated at 40 Hz and amplitude modulated at 0.2 Hz). We reliably identified crossmodulation frequencies when these two stimulus features were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from the auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within but not across sensory modalities at the primary cortical level.
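The crossmodulation logic above rests on a basic signal-processing fact: when a carrier rhythm is modulated by a second, slower rhythm, the spectrum contains sideband ("crossmodulation") components at the carrier frequency ± the modulation frequency, so integration of the two features can be read off the spectrum. A minimal sketch of that principle, assuming a 40 Hz carrier amplitude-modulated at 0.2 Hz with an arbitrary modulation depth of 0.5 (these values and the plain FFT are illustrative; they are not the authors' actual stimuli or analysis pipeline):

```python
import numpy as np

fs = 1000.0          # sampling rate in Hz (illustrative choice)
T = 20.0             # duration in s; gives 0.05 Hz frequency resolution
t = np.arange(0, T, 1 / fs)

# 40 Hz carrier, amplitude-modulated at 0.2 Hz; modulation depth 0.5
carrier_f, mod_f = 40.0, 0.2
s = (1 + 0.5 * np.sin(2 * np.pi * mod_f * t)) * np.sin(2 * np.pi * carrier_f * t)

# Magnitude spectrum of the real-valued signal
spec = np.abs(np.fft.rfft(s))
freqs = np.fft.rfftfreq(len(s), 1 / fs)

def power_at(f):
    """Spectral magnitude at the bin closest to frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

# Sidebands appear at carrier ± modulation rate (39.8 and 40.2 Hz),
# flanking the carrier peak at 40 Hz
for f in (carrier_f - mod_f, carrier_f, carrier_f + mod_f):
    print(f"{f:5.1f} Hz: magnitude {power_at(f):.1f}")
```

The sidebands stand far above neighbouring bins, which is what makes a crossmodulation peak a usable signature that two modulations were processed by a common (nonlinear) stage rather than in parallel.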