Free keywords:
audition, effective connectivity, magnetoencephalography, predictive processing, source reconstruction, speech processing
Abstract:
Adaptive behavior rests on predictions based on statistical regularities in the environment. Such regularities pertain to stimulus contents (“what”) and timing (“when”), and both interactively modulate sensory processing. In speech streams, predictions can be formed at multiple hierarchical levels of contents (e.g., syllables vs words) and timing (faster vs slower time scales). Whether and how these hierarchies map onto each other remains unknown. Under one hypothesis, neural hierarchies may link “what” and “when” predictions within sensory processing areas: with lower versus higher cortical regions mediating interactions for smaller versus larger units (syllables vs words). Alternatively, interactions between “what” and “when” regularities might rest on a generic, sensory-independent mechanism. To address these questions, we manipulated “what” and “when” regularities at two levels—single syllables and disyllabic pseudowords—while recording neural activity using magnetoencephalography (MEG) in healthy volunteers (N = 22). We studied how neural responses to syllable and/or pseudoword deviants are modulated by “when” regularity. “When” regularity modulated “what” mismatch responses with hierarchical specificity, such that responses to deviant pseudowords (vs syllables) were amplified by temporal regularity at slower (vs faster) time scales. However, both these interactive effects were source-localized to the same regions, including frontal and parietal cortices. Effective connectivity analysis showed that the integration of “what” and “when” regularity selectively modulated connectivity within regions, consistent with gain effects. This suggests that the brain integrates “what” and “when” predictions that are congruent with respect to their hierarchical level, but this integration is mediated by a shared and distributed cortical network.