Creating a shared cognitive space: How is language grounded in and shaped by communicative settings of interacting people?
Language is a key socio-cognitive human function, used predominantly in interaction. Yet linguistics and cognitive neuroscience have largely focused on how individuals encode and decode signals according to their structural dependencies. Understanding the communicative use of language requires shifting the focus of investigation to the mechanisms that interlocutors use to share a conceptual space.
This big question examines how two dimensions, namely the temporal structure of communicative interactions and the functional dynamics of real-life communicative interactions, shape multiple communicative resources (speech, gestures, gaze) and linguistic structures (from phonology to pragmatics).
There is close collaboration between all BQ3 subprojects. The qualitative results of the simulation studies will be related to the empirical findings from the other subprojects and, vice versa, the empirical observations from the other subprojects will inspire the qualitative hypotheses to be tested. The cognitive agent-based simulation studies go beyond the empirical paradigms in the BQ3 project because they allow us to test for qualitative differences in interactive behaviour by manipulating the cognitive capacities of the agents (something that is difficult to do with human participants), while simultaneously yielding explicit theories of computational mechanisms.
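To make the idea of manipulating agents' cognitive capacities concrete, the following is a minimal, purely illustrative sketch of an agent-based referential game. The `capacity` parameter, the lexicon-update rule, and all other details are hypothetical stand-ins, not the project's actual models:

```python
import random

class Agent:
    """A toy communicating agent with a limited-capacity lexicon.

    'capacity' is a hypothetical parameter capping how many
    meaning-signal associations the agent can store, standing in
    for a manipulable cognitive capacity.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.lexicon = {}  # meaning -> signal

    def produce(self, meaning, signals):
        # Use a stored convention if available, otherwise pick at random.
        return self.lexicon.get(meaning, random.choice(signals))

    def interpret(self, signal, meanings):
        # Invert the lexicon where possible; guess otherwise.
        for m, s in self.lexicon.items():
            if s == signal:
                return m
        return random.choice(meanings)

    def update(self, meaning, signal):
        # Store the pairing only if there is storage capacity left.
        if meaning in self.lexicon or len(self.lexicon) < self.capacity:
            self.lexicon[meaning] = signal

def run_dyad(capacity, rounds=2000, n_items=8, seed=0):
    """Let two agents play a referential game; return their success rate."""
    random.seed(seed)
    meanings = list(range(n_items))
    signals = [f"sig{i}" for i in range(n_items)]
    a, b = Agent(capacity), Agent(capacity)
    successes = 0
    for t in range(rounds):
        speaker, listener = (a, b) if t % 2 == 0 else (b, a)
        meaning = random.choice(meanings)
        signal = speaker.produce(meaning, signals)
        guess = listener.interpret(signal, meanings)
        if guess == meaning:
            successes += 1
            # Mutual success reinforces the convention for both agents.
            speaker.update(meaning, signal)
            listener.update(meaning, signal)
    return successes / rounds

# Manipulating the (hypothetical) capacity parameter changes how fully
# the dyad can conventionalise a shared lexicon.
low = run_dyad(capacity=2)
high = run_dyad(capacity=8)
print(low, high)
```

The point of such a sketch is that a single cognitive parameter can be varied while everything else is held fixed, yielding qualitative predictions about interactive behaviour that can then be compared against the empirical subprojects.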
Prof. dr. Mirjam Ernestus
Prof. dr. Asli Ozyurek
Prof. dr. Iris van Rooij
Dr. Jan-Mathijs Schoffelen
Dr. Sara Bögels
Dr. Marieke Woensdregt
Research Highlights (2021)
Creating shared (neural) representations
Team members: Sara Bögels, Jan-Mathijs Schoffelen, Branka Milivojevic, and Ivan Toni
This project shares multimodal data from face-to-face interactive communication in pairs of participants playing a cooperative referential communication game with novel objects. The data offer high-quality speech, video, and body-motion tracking during communication, as well as the possibility of quantifying the effects of communication on neuro-behavioural correlates of object representations. This unique combination of measurements provides new insights into human communication.
The project considers the contribution of multimodal communicative resources (speech, gestures) at different levels of linguistic structure (from phonology to pragmatics) during interactive task-based dialogue, in which each communicative pair needs to create mutually understood utterances, dependent on the situated context of the ongoing interaction. We share with the community a dataset comprising interactional and individual data. The interactional part of the dataset consists of video, audio, and body-movement recordings of face-to-face communicative interactions in Dutch between 71 pairs of participants, without restrictions on communicative means (e.g., speech, gestures), timing, turn-taking, or feedback. Participants communicate about 16 novel visual objects which lack conventional labels, called “Fribbles” (see Figure 1). The individual component of the dataset provides estimates of participants’ representations of the Fribbles using two behavioural measures and one neuroimaging (fMRI) measure.
The dataset provided in this project makes it possible to quantify the degree to which participants align their behaviour at different levels of analysis (phonetic, lexical, syntactic, semantic, pragmatic or gestural). It is also possible to examine changes in the estimated representations of the Fribbles from pre- to post-interaction, as well as potential convergence in representations between participants, at several levels (lexical, behavioural, and neural). This project is a direct result of BQ3’s multidisciplinary approach, integrating different aspects of human communication in a comprehensive dataset relevant to a wide range of disciplines.
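As an illustration of the kind of lexical-alignment analysis the dataset affords, one could compare the word choices of the two members of a pair early versus late in the interaction, for instance with a simple word-type overlap measure. The utterances and the overlap measure below are invented for illustration only; the actual dataset comes with much richer annotations:

```python
def lexical_overlap(utterances_a, utterances_b):
    """Jaccard overlap between the word types two speakers use.

    A crude stand-in for lexical alignment: 1.0 means identical
    vocabularies, 0.0 means no shared words.
    """
    types_a = {w.lower() for u in utterances_a for w in u.split()}
    types_b = {w.lower() for u in utterances_b for w in u.split()}
    if not (types_a | types_b):
        return 0.0
    return len(types_a & types_b) / len(types_a | types_b)

# Invented example: two speakers referring to the same novel object,
# before and after conventions have formed.
early_a = ["the spiky one with two arms"]
early_b = ["that round thing with bumps"]
late_a = ["the spiky two-arm one"]
late_b = ["yes the spiky two-arm one"]

print(lexical_overlap(early_a, early_b))  # low overlap early on
print(lexical_overlap(late_a, late_b))    # higher overlap after interaction
```

An increase in such an overlap score from pre- to post-interaction would be one simple operationalisation of convergence at the lexical level; analogous measures can be defined for the behavioural and neural estimates of the Fribble representations.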