Speech perception is inherently multisensory. In face-to-face conversation, we can draw on multidimensional cues encoded in facial movements to enhance perception and comprehension. In my PhD research, I investigate the cognitive and neural mechanisms underlying the successful integration of visual and acoustic speech signals. To this end, I combine large-scale behavioural experiments (conducted online), data-driven analyses of multimodal corpora, and multivariate model-based analyses of neuroimaging (M/EEG) data.