PhD Project 12
Neural processing of action, gesture and language in healthy and autistic individuals
It is often suggested that language and action on the one hand, and language and gestures on the other, share similar neurocognitive processes. We want to investigate in more detail whether this is true. We will use an fMRI repetition suppression paradigm to identify the shared neuronal substrate for processing communicative and semantic information across these modalities. We will further explore the features of cross-modal processing in autism. Finally, individual differences in the processing of the communicative component will be assessed in the general population. The project will shed light on the neural mechanisms underlying the different communicative modalities and on the relations between them.
This project is closely linked to project 9; both explore the communicative aspects of actions. This project focuses on the neural processes that allow an addressee to understand the message communicated through an action or gesture, while project 9 explores the neural basis of communicative actions from the perspective of action preparation and control.
A first paper was completed, in which the kinematics of communicative and non-communicative actions and gestures are compared and tested for their role in supporting recognition of communicative intent. Several behavioural studies were completed in which we systematically tested kinematics for their role in supporting semantic comprehension of gestures; a manuscript on these semantic studies is in preparation. Validation of the Kinect for automatic coding of kinematic features of gestures, using independent coders, has begun. Adapting the Kinect to action and gesture research will enable new research methods and applied tools in the context of this consortium. Current effort is mainly focused on completing this validation study and on developing an imaging experiment to examine how production kinematics contribute to observers' recognition of communicative intent.