Investigating the interaction between multisensory processing and attention in time during auditory scene analysis

Jennifer Bizley (primary)
Ear Institute
UCL
Adam Tierney (secondary)
Psychological Sciences
Birkbeck

Abstract

Young, normal-hearing listeners have a remarkable ability to listen selectively to one sound in a mixture, exemplified by our ability to understand the speech of a friend in a noisy café. To achieve this, listeners must segregate competing sounds and then use attention to "select" task-relevant sounds. Seeing complementary visual information, such as mouth movements, may help listeners use temporal structure to segregate and select sound streams. This project will determine how vision augments auditory selective attention at multiple scales by combining human behavioural and EEG recordings with single-unit recordings in an animal model.


References

Laffere et al., "Effects of auditory selective attention on neural phase: individual differences and short-term training", NeuroImage, 2020.
Atilgan and Bizley, "Training enhances the ability of listeners to exploit visual information for auditory scene analysis", Cognition, 2021.
Atilgan et al., "Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding", Neuron, 2018.
Laffere et al., "Attentional modulation of neural entrainment to sound streams in children with and without ADHD", NeuroImage, 2021.
Maddox et al., "Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners", eLife, 2015.


BBSRC Area
Animal disease, health and welfare
Area of Biology
Neurobiology, Physiology
Techniques & Approaches
Mathematics / Statistics, Microscopy / Electrophysiology, Simulation / Modelling