Multisensory Perception
Prof Virginie van Wassenhove
** Job applications open until the positions are filled, ideally by the end of Feb. 2021 ** Applications are invited for two full-time post-doctoral cognitive neuroscientists in the European consortium “Extended-personal reality: augmented recording and transmission of virtual senses through artificial-intelligence” (see abstract p.2). EXPERIENCE involves eight academic and industrial partners with complementary expertise in artificial intelligence, neuroscience, psychiatry, neuroimaging, MEG/EEG/physiological recording techniques, and virtual reality. The postdoctoral positions will be fully dedicated to the scientific foundation for the Extended-Personal Reality, a work package led by the CEA (Virginie van Wassenhove) in collaboration with the Univ. of Pisa (Gaetano Valenza, Matteo Bianchi), Padova (Claudio Gentili), Roma Tor Vergata (Nicola Toschi), and others… Full information here: https://brainthemind.files.wordpress.com/2021/01/experience_postdoctoral_adds.pdf
Multisensory perception in the metaverse
Bayesian expectation in the perception of the timing of stimulus sequences
In this virtual journal club, Dr Di Luca will present findings from a series of psychophysical investigations in which he measured sensitivity and bias in the perception of the timing of stimuli. He will show how improved detection with longer sequences, and biases in reporting isochrony, can be accounted for by optimal statistical prediction. Among his findings is also that the timing of stimuli that occasionally deviate from a regularly paced sequence is perceptually distorted to appear more regular. This distortion depends on whether the context in which these sequences are presented is itself regular. Dr Di Luca will present a Bayesian model in which dynamically updated expectations, in the form of an a priori probability, are combined with incoming sensory information. These findings contribute to our understanding of how the brain processes temporal information to shape perceptual experience.
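For intuition, here is a minimal sketch of the core computation behind this class of models, assuming a Gaussian prior over the expected onset time and a Gaussian sensory likelihood. The function name and numbers are illustrative and do not reproduce Dr Di Luca's actual model:

```python
def perceived_onset(sensed_onset, sensory_sd, prior_mean, prior_sd):
    """Fuse a noisy sensed onset time with a prior expectation.

    With a Gaussian prior N(prior_mean, prior_sd**2) and a Gaussian
    likelihood N(sensed_onset, sensory_sd**2), the posterior mean is a
    reliability-weighted average (reliability = 1 / variance).
    """
    w_sense = 1.0 / sensory_sd ** 2
    w_prior = 1.0 / prior_sd ** 2
    return (w_sense * sensed_onset + w_prior * prior_mean) / (w_sense + w_prior)


# A deviant tone sensed 60 ms late in an otherwise isochronous 500 ms
# sequence: the prior expectation pulls the percept back toward
# regularity, so the deviant appears more regular than it was.
print(perceived_onset(sensed_onset=560.0, sensory_sd=30.0,
                      prior_mean=500.0, prior_sd=40.0))  # ~538 ms
```

A narrower (more confident) prior, as would build up over a longer regular sequence, pulls the percept more strongly toward isochrony, which is consistent with the context-dependence described in the abstract.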
Multisensory perception, learning, and memory
Note the later start time!
Multisensory perception with newly learned sensory skills
What happens to our ability to perceive multisensory information as we age?
Our ability to perceive the world around us can be affected by a number of factors, including the nature of the external information, prior experience of the environment, and the integrity of the underlying perceptual system. A particular challenge for the brain is to maintain a coherent percept from information encoded by peripheral sensory organs whose function is affected by typical developmental changes across the lifespan. Yet how the brain adapts to the maturation of the senses, as well as to experiential changes in the multisensory environment, is poorly understood. Over the past few years, we have used a range of multisensory tasks to investigate the effect of ageing on the brain’s ability to merge sensory inputs. In particular, we have embedded an audio-visual task based on the sound-induced flash illusion (SIFI) into a large-scale, longitudinal study of ageing. Our findings support the idea that the temporal binding window (TBW) is modulated by age and reveal important individual differences in this TBW that may have clinical implications. However, our investigations also suggest the TBW is experience-dependent, with evidence for both long- and short-term behavioural plasticity. An overview of these findings, including recent evidence on how multisensory integration may be associated with higher-order functions, will be discussed.
Conflict in Multisensory Perception
Multisensory perception is often studied through the effects of inter-sensory conflict, such as in the McGurk effect, the Ventriloquist illusion, and the Rubber Hand Illusion. Moreover, Bayesian approaches to cue fusion and causal inference overwhelmingly draw on cross-modal conflict to measure and to model multisensory perception. Given the prevalence of conflict, it is remarkable that accounts of multisensory perception have so far neglected the theory of conflict monitoring and cognitive control, established about twenty years ago. I hope to make a case for the role of conflict monitoring and resolution during multisensory perception. To this end, I will present EEG and fMRI data showing that cross-modal conflict in speech, resulting in either integration or segregation, triggers neural mechanisms of conflict detection and resolution. I will also present data supporting a role of these mechanisms during perceptual conflict in general, using Binocular Rivalry, surrealistic imagery, and cinema. Based on this preliminary evidence, I will argue that it is worth considering the potential role of conflict in multisensory perception and its incorporation in a causal inference framework. Finally, I will raise some potential problems associated with this proposal.
Development of multisensory perception and attention and their role in audiovisual speech processing
Plasticity and learning in multisensory perception for action
How multisensory perception is shaped by causal inference and serial effects
Multisensory Perception: Behaviour, Computations and Neural Mechanisms
Our senses are constantly bombarded with a myriad of diverse signals. Transforming this sensory cacophony into a coherent percept of our environment relies on solving two computational challenges. First, we need to solve the causal inference problem: deciding whether signals come from a common cause, and thus should be integrated, or come from different sources, and should be treated independently. Second, when there is a common cause, we should integrate signals across the senses weighted in proportion to their sensory reliabilities. I discuss recent research at the behavioural, computational, and neural systems levels that investigates how the brain addresses these two computational challenges in multisensory perception.
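To make these two computations concrete, here is a minimal sketch, not the speaker's model: reliability-weighted fusion of two Gaussian cues, and the posterior probability of a common cause in the standard Gaussian causal-inference setup (in the style of Körding et al., 2007). The noise levels, the zero-mean prior over source location, and the 50% common-cause prior are illustrative assumptions:

```python
import math

def fuse(x_a, x_v, sd_a, sd_v):
    """Reliability-weighted fusion: each cue weighted by its inverse variance."""
    w_a, w_v = 1.0 / sd_a ** 2, 1.0 / sd_v ** 2
    return (w_a * x_a + w_v * x_v) / (w_a + w_v)

def p_common(x_a, x_v, sd_a, sd_v, sd_p=10.0, prior_c1=0.5):
    """Posterior probability that auditory and visual cues share one cause,
    assuming Gaussian noise and a zero-mean Gaussian prior (sd_p) over
    the source location."""
    va, vv, vp = sd_a ** 2, sd_v ** 2, sd_p ** 2
    # Likelihood of the cue pair under one common source (source
    # position integrated out analytically).
    z1 = va * vv + va * vp + vv * vp
    like_c1 = math.exp(-0.5 * ((x_a - x_v) ** 2 * vp
                               + x_a ** 2 * vv + x_v ** 2 * va) / z1) \
        / (2 * math.pi * math.sqrt(z1))
    # Likelihood under two independent sources.
    like_c2 = math.exp(-0.5 * (x_a ** 2 / (va + vp) + x_v ** 2 / (vv + vp))) \
        / (2 * math.pi * math.sqrt((va + vp) * (vv + vp)))
    post = prior_c1 * like_c1
    return post / (post + (1 - prior_c1) * like_c2)

# Nearby cues are probably one source (fuse them); discrepant cues are
# probably separate sources (keep the unisensory estimates).
for x_a, x_v in [(1.0, 2.0), (1.0, 15.0)]:
    print(f"x_a={x_a}, x_v={x_v}: "
          f"P(common cause)={p_common(x_a, x_v, sd_a=4.0, sd_v=1.0):.2f}, "
          f"fused={fuse(x_a, x_v, sd_a=4.0, sd_v=1.0):.2f}")
```

Averaging over the two causal structures then yields the familiar behavioural pattern: near-mandatory integration for small cross-modal conflicts and a breakdown of integration for large ones.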