Free keywords:
-
Abstract:
When we listen to someone in a noisy environment, such as a busy restaurant, it helps to
look at the speaker's lips to better understand what they are saying. This behavioural benefit is
thought to be based on the interaction of auditory and visual processing mechanisms in the human
brain. Surprisingly, these audio-visual interactions also occur when there is no
visual signal at all, for example when listening to someone on the phone. In this talk I will present
evidence from behavioural and neuroimaging experiments showing that this active involvement of
visual cortices during auditory-only perception improves both understanding of what is said and
recognition of who is talking.