
Released

Poster

Recognizing facial speech in high-functioning ASD is associated with low functional connectivity in regions sensitive to facial motion

MPS-Authors
/persons/resource/persons103142

Borowiak, Kamila
Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
External Organizations;

/persons/resource/persons20071

von Kriegstein, Katharina
Max Planck Research Group Neural Mechanisms of Human Communication, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
External Organizations;

External Resource
No external resources are shared
Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Borowiak, K., & von Kriegstein, K. (2018). Recognizing facial speech in high-functioning ASD is associated with low functional connectivity in regions sensitive to facial motion. Poster presented at 11th Scientific Meeting for Autism Spectrum Conditions (WGAS), Frankfurt, Germany.


Cite as: http://hdl.handle.net/21.11116/0000-0002-063E-3
Abstract
Background: Individuals with autism spectrum disorders (ASD) have difficulties in perceiving moving faces and extracting social cues from them, such as identity, emotion, and speech (O'Brien et al., 2014; Sato et al., 2013; Foxe et al., 2015). Here, we investigated how facial motion-sensitive regions are functionally connected to facial form-sensitive regions during recognition of facial speech in ASD.

Methods: Seventeen adults with high-functioning ASD and seventeen typically developed, pair-wise matched controls participated. The study included a combined functional magnetic resonance imaging (fMRI) and eye-tracking experiment on facial-speech recognition and a functional localizer. In the facial-speech recognition experiment, participants viewed blocks of muted videos of speakers articulating syllables. We asked them to recognize either the articulated syllable (facial-speech task) or the identity of the articulating person (face-identity task). The functional localizer included viewing of moving and static faces and objects. Functional connectivity was assessed with psycho-physiological interaction (PPI) analysis based on the contrast "facial-speech task > face-identity task". Seed regions were defined in the motion-sensitive STS/STG and V5/MT. We combined the functional localizer approach with anatomical maps to define regions of interest (ROIs) in the form-sensitive bilateral fusiform face area (FFA) and the bilateral occipital face area (OFA).

Results: Compared to the control group, the ASD group had decreased functional connectivity between the motion-sensitive regions V5/MT and STS/STG, and the group differences were related to autistic traits (p < .0125, FWE-corrected for ROI). Functional connectivity between motion-sensitive and form-sensitive regions (FFA, OFA) was similar in the control and ASD groups.

Conclusions: Fast and accurate perception of moving faces is one of the prerequisites for successful face-to-face communication (O'Toole et al., 2002), and its impairments likely contribute to communication deficits typical for ASD. We provide evidence that difficulties in facial-speech recognition in ASD are related to dysfunctional mechanisms for facial-motion rather than facial-form perception.
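
For readers unfamiliar with PPI, the core idea is that the interaction regressor is the product of a seed region's time series and the task contrast regressor; its regression weight on a target region indexes task-dependent coupling. The following is a minimal illustrative sketch only, not the analysis pipeline used for the poster: it assumes pre-extracted, preprocessed ROI-averaged time series and uses plain ordinary least squares in place of a full SPM/FSL-style GLM. The function name ppi_beta and all variables are hypothetical.

```python
import numpy as np

def ppi_beta(seed_ts, target_ts, task_regressor):
    """Return the PPI (interaction) beta from an OLS fit.

    seed_ts, target_ts : 1-D arrays, ROI-averaged BOLD time series
                         (e.g. a V5/MT seed and an STS/STG target)
    task_regressor     : 1-D array coding the contrast, e.g. +1 for
                         facial-speech blocks, -1 for face-identity blocks
    """
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()   # standardize seed
    task = task_regressor - task_regressor.mean()       # mean-centre task
    ppi = seed * task                                    # interaction term
    X = np.column_stack([ppi, seed, task,                # PPI + main effects
                         np.ones_like(seed)])            # intercept
    betas, *_ = np.linalg.lstsq(X, target_ts, rcond=None)
    return betas[0]                                      # PPI coefficient

# Synthetic example (240 volumes) to show the expected behaviour:
rng = np.random.default_rng(0)
n = 240
task = np.repeat([1.0, -1.0], n // 2)
seed = rng.standard_normal(n)
target = 0.4 * seed * (task - task.mean()) + rng.standard_normal(n)
print(ppi_beta(seed, target, task))  # recovers positive task-dependent coupling
```

In a study like the one described above, such PPI estimates would be computed per participant for each seed-target pair and then compared between the ASD and control groups.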