PhD Project 8
Giving speech a hand: how functional brain networks support gestural enhancement of language
Face-to-face communication involves audiovisual binding of speech and gesture, both of which carry semantic information to varying degrees. Using MEG, we propose to investigate for the first time how oscillatory neural interactions in an extended brain network reflect the integration of gesture and speech information, and to trace the time course of this integration in two settings where gestures can have enhancement effects: (a) during comprehension of degraded speech and (b) for subsequent memory of newly learned words. The results will integrate previous findings on the role of oscillations in speech comprehension, memory, and action observation, and will provide insight into how brain networks adjust to processing audiovisual input carrying differential semantic information.
In doing so, the project promises to provide novel insight into the physiological substrate of the interaction between gesture and speech perception.
The first behavioural paper has been published. An EEG study on how native and non-native listeners of Dutch integrate speech and gestures has been conducted, and the article has been submitted. The first part of the MEG data has been collected, analysed, and written up in a paper.
Another part of the initial MEG study is currently being analysed and written up in a paper on how matching and mismatching gestures in clear and adverse listening conditions modulate oscillatory dynamics in language, motor, and visual areas. In addition, a behavioural study investigating at which noise-vocoding level German listeners benefit most from visual speech and gestural information has been conducted and analysed.