Multisensory Processing
Prof. Edmund Wascher / Dr. Laura-Isabelle Klatt
We are seeking to fill a fully funded PhD position (75%, TV-L 13 salary scale for state employees) in cognitive neuroscience. The successful applicant will contribute to a project investigating selective attention and working memory processes in a multisensory context. In particular, we are interested in how the auditory and visual systems interact during the deployment of attention in multisensory environments and how audio-visual information is integrated. To answer these research questions, we primarily use EEG in combination with cutting-edge analysis methods (e.g., multivariate pattern classification). Beyond that, eye-tracking or (functional) MRI may be applied within the project. Your responsibilities will include conducting EEG experiments, analyzing data, preparing manuscripts for publication in peer-reviewed journals, and presenting scientific results at national and international conferences. Official job ad: https://www.ifado.de/ifadoen/careers/current-job-offers/#job3
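For applicants unfamiliar with multivariate pattern classification, the following is a minimal, hypothetical sketch of EEG decoding using scikit-learn. The simulated data, shapes, condition labels, and classifier choice are illustrative assumptions only, not the project's actual pipeline:

# Minimal sketch of multivariate pattern classification on EEG-like data.
# All data are simulated; shapes, labels, and classifier are illustrative
# assumptions, not the project's actual analysis pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels = 200, 64
# Simulated single-trial EEG features (e.g., channel voltages at one time point).
X = rng.standard_normal((n_trials, n_channels))
y = rng.integers(0, 2, n_trials)  # hypothetical labels, e.g., attend-auditory vs. attend-visual
X[y == 1] += 0.3  # inject a weak class difference so decoding can succeed

# Standardize features, then classify; report 5-fold cross-validated accuracy.
clf = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(clf, X, y, cv=5)
print(f"Decoding accuracy: {scores.mean():.2f}")

In practice such a classifier is typically trained at each time point of the EEG epoch, yielding a time course of decoding accuracy; the sketch above shows only a single time point.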
Time perception in film viewing as a function of film editing
Filmmakers and editors have empirically developed techniques to ensure the spatiotemporal continuity of a film's narrative. With respect to time, editing techniques (e.g., elliptical editing, overlapping editing, or cut minimization) allow the perceived duration of events to be manipulated as they unfold on screen. More specifically, a scene can be edited so that its perceived duration is compressed, expanded, or real-time. Despite the consistent application of these techniques in filmmaking, their perceptual outcomes have not been experimentally validated. Because viewing a film is experienced as a close simulation of the physical world, cinematic material allows aspects of time perception to be examined with high ecological validity, while filmmakers gain insight into how empirically developed techniques influence viewers' time percepts. Here, we investigated how such temporal manipulation techniques affect a scene's perceived duration. Specifically, we presented videos depicting different actions (e.g., a woman talking on the phone), edited according to the techniques applied for temporal manipulation, and asked participants to give verbal estimates of the presented scenes' durations. Our analysis revealed that the duration of expanded scenes was significantly overestimated relative to that of compressed and real-time scenes, as was the duration of real-time scenes relative to that of compressed scenes. Our results therefore validate the empirical techniques applied for modulating a scene's perceived duration. We also found that the effects of scene type and editing technique on time estimates interacted with the characteristics and the action of the presented scene. These findings thus add to the evidence that a scene's content and characteristics, along with the editing technique applied, can also modulate perceived duration. We discuss our findings in relation to current timing frameworks, as well as attentional saliency algorithms that measure the visual saliency of the presented stimuli.
Using Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception
To form a coherent perception of the world around us, we constantly process and integrate sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive them as a single event, even though they occurred separately. In recent years, our lab and others have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models yield metrics consistent with those of humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable, experience-dependent shifts in perception and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model for studying the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology, and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
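To illustrate how metrics such as perceived simultaneity and temporal acuity are commonly quantified from synchrony-judgment data (a standard approach in the field, not necessarily this lab's exact analysis), here is a minimal sketch that fits a Gaussian to hypothetical response proportions to estimate the point of subjective simultaneity (PSS) and the width of the temporal binding window:

# Minimal sketch: estimating the PSS and temporal-window width from
# synchrony-judgment (SJ) data. The data below are hypothetical, and the
# Gaussian model is one common choice, not the talk's specific analysis.
import numpy as np
from scipy.optimize import curve_fit

# Stimulus onset asynchronies (ms); negative = auditory leads visual.
soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
# Proportion of "synchronous" responses at each SOA (hypothetical).
p_sync = np.array([0.05, 0.15, 0.55, 0.80, 0.95, 0.85, 0.60, 0.20, 0.05])

def gaussian(x, amp, mu, sigma):
    """Scaled Gaussian: peaks at amp when x == mu, width set by sigma."""
    return amp * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

params, _ = curve_fit(gaussian, soa, p_sync, p0=[1.0, 0.0, 100.0])
amp, pss, sigma = params
print(f"PSS = {pss:.1f} ms, window SD = {sigma:.1f} ms")

The fitted peak location (mu) estimates the PSS, while the fitted width (sigma) indexes temporal acuity: a narrower Gaussian means the animal or participant only reports synchrony for stimuli that are closely aligned in time.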
Multisensory processing of anticipatory and consummatory food cues
Multisensory influences on vision: Sounds enhance and alter visual-perceptual processing
Visual perception is traditionally studied in isolation from other sensory systems, and while this approach has been exceptionally successful, in the real world visual objects are often accompanied by sounds, smells, tactile information, or tastes. How is visual processing influenced by these other sensory inputs? In this talk, I will review studies from our lab showing that a sound can influence the perception of a visual object in multiple ways. In the first part, I will focus on spatial interactions between sound and sight, demonstrating that co-localized sounds enhance visual perception. Then, I will show that these cross-modal interactions also occur at a higher contextual and semantic level, where naturalistic sounds facilitate the processing of real-world objects that match those sounds. Throughout the talk, I will explore to what extent sounds not only improve visual processing but also alter perceptual representations of the objects we see. Most broadly, I will argue that considering multisensory influences on visual perception is essential for a more complete understanding of our visual experience.
NMC4 Short Talk: Neurocomputational mechanisms of causal inference during multisensory processing in the macaque brain
Natural perception relies inherently on inferring the causal structure of the environment. However, the neural mechanisms and functional circuits that are essential for representing and updating a hidden causal structure during multisensory processing are unknown. To address this, monkeys were trained to infer the probability of a potential common source from visual and proprioceptive signals on the basis of their spatial disparity in a virtual reality system. The proprioceptive drift reported by the monkeys demonstrated that they combined historical information and current multisensory signals to estimate the hidden common source and subsequently updated both the causal structure and the sensory representation. Single-unit recordings in premotor and parietal cortices revealed that neural activity in premotor cortex represents the core computation of causal inference, characterizing the estimation and updating of the likelihood of integrating multiple sensory inputs at a trial-by-trial level. Receiving signals from premotor cortex, neural activity in parietal cortex also represents the causal structure and dynamically updates the sensory representation to remain consistent with the inferred causal structure. Thus, our results indicate how premotor cortex integrates historical information and sensory inputs to infer hidden variables and selectively updates sensory representations in parietal cortex to support behavior. This dynamic loop of frontal-parietal interactions within the causal inference framework may provide the neural mechanism underlying long-standing questions about how neural circuits represent hidden structures for body awareness and agency.
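For reference, paradigms of this kind are typically modeled with Bayesian causal inference (Körding et al., 2007). The abstract does not spell out the specific model used, but the generic posterior over a common cause C for visual and proprioceptive measurements x_V and x_P takes the form:

% Generic Bayesian causal inference over a common cause C = 1 versus
% independent causes C = 2; priors and likelihoods are standard model
% assumptions, not quantities reported in the talk.
P(C = 1 \mid x_V, x_P) =
  \frac{p(x_V, x_P \mid C = 1)\, p(C = 1)}
       {p(x_V, x_P \mid C = 1)\, p(C = 1) + p(x_V, x_P \mid C = 2)\,\bigl(1 - p(C = 1)\bigr)}

Under C = 1 both signals arise from a single source, so greater spatial disparity lowers the common-source likelihood and hence the probability of integration; the prior p(C = 1) is one natural place where the historical information described in the abstract could enter the computation.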
Seeing with technology: Exchanging the senses with sensory substitution and augmentation
What is perception? Our sensory modalities transduce information about the external world into electrochemical signals that somehow give rise to our conscious experience of our environment. Normally there is too much information to be processed at any given moment, and the mechanisms of attention focus the mind's limited resources on some information at the expense of other information. My research has progressed from first examining visual perception and attention to now examining how multisensory processing contributes to perception and cognition. There are fundamental constraints on how much information the different senses can process on their own and in combination. Here I will explore information processing from the perspective of sensory substitution and augmentation, and how "seeing" with the ears and tongue can advance fundamental and translational research.
Science and technology to understand developmental multisensory processing
Recurrence in temporal multisensory processing
Bernstein Conference 2024