Audiovisual
Using Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception
To form a coherent perception of the world around us, we constantly process and integrate sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive them as a single event, even though they occurred separately. In recent years, our lab and others have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models yield metrics that are consistent with humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents also show the hallmarks of audiovisual temporal perception, such as predictable, experience-dependent shifts in perception and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model for studying the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
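As a point of reference for the metrics named above, TOJ data are commonly summarized by fitting a cumulative Gaussian psychometric function, whose midpoint gives the point of subjective simultaneity (PSS, perceived simultaneity) and whose slope gives a just noticeable difference (JND, temporal acuity). The sketch below illustrates that standard analysis on made-up response proportions; the data values, function names and fitting choices are assumptions for illustration, not the authors' analysis pipeline.

```python
# Illustrative only: fit a cumulative Gaussian to hypothetical TOJ data to
# recover the point of subjective simultaneity (PSS) and a temporal-acuity
# measure (JND). All numbers are made up.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus onset asynchronies in ms (negative = auditory led, positive = visual led)
soa = np.array([-200.0, -100.0, -50.0, 0.0, 50.0, 100.0, 200.0])
# Hypothetical proportion of "visual first" responses at each SOA
p_visual_first = np.array([0.05, 0.15, 0.30, 0.55, 0.75, 0.90, 0.97])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: PSS is the 50% point, sigma sets the slope."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_visual_first, p0=[0.0, 50.0])
jnd = sigma * norm.ppf(0.75)  # SOA shift moving responses from 50% to 75%
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```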
How does seeing help listening? Audiovisual integration in Auditory Cortex
Multisensory responses are ubiquitous in so-called unisensory cortex. However, despite their prevalence, we have very little understanding of what, if anything, they contribute to perception. In this talk I will focus on audiovisual integration in auditory cortex. Anatomical tracing studies highlight visual cortex as one source of visual input to auditory cortex. Using cortical cooling, we test the hypothesis that these inputs support audiovisual integration in ferret auditory cortex. Behavioural studies in humans support the idea that visual stimuli can help listeners to parse an auditory scene. This effect is paralleled in single units in auditory cortex, where responses to a sound mixture can be determined by the timing of a visual stimulus, such that sounds that are temporally coherent with the visual stimulus are preferentially represented. Our recent data therefore support the idea that one role for the early integration of auditory and visual signals in auditory cortex is to support auditory scene analysis, and that visual cortex plays a key role in this process.
Looking and listening while moving
In this talk I’ll discuss our recent work on how visual and auditory cues to space are integrated as we move. There are at least three reasons why this turns out to be a difficult problem for the brain to solve (and for us to understand!). First, vision and hearing start off in different coordinates (eye-centred vs head-centred), so they need a common reference frame in which to communicate. The literature has neatly sidestepped this problem by preventing eye and head movements, yet self-movement is the norm. Second, self-movement creates visual and auditory image motion, so correct interpretation requires some form of compensation. Third, vision and hearing encode motion in very different ways: vision contains dedicated motion detectors sensitive to speed, whereas hearing does not. We propose that some (all?) of these problems could be solved by considering the perception of audiovisual space as the integration of separate body-centred visual and auditory cues, the latter formed by integrating image motion with motor system signals and vestibular information. To test this claim, we use a classic cue integration framework, modified to account for cues that are biased and partially correlated. We find good evidence for the model based on simple judgements of audiovisual motion within a circular array of speakers and LEDs that surround the participant while they execute self-controlled head movements.
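For readers unfamiliar with the "classic cue integration framework" referred to above, the standard version for two unbiased cues with independent noise weights each cue by its reliability (inverse variance). The sketch below shows only that baseline rule; the speaker's modification for biased, partially correlated cues is not reproduced here, and the function name and example numbers are illustrative assumptions.

```python
# Illustrative only: classic maximum-likelihood integration of two unbiased,
# independent cues. The talk's extension to biased, correlated cues is not shown.

def integrate_cues(mu_v, var_v, mu_a, var_a):
    """Reliability-weighted fusion of a visual and an auditory spatial estimate."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)  # weight = relative reliability
    mu = w_v * mu_v + (1.0 - w_v) * mu_a               # fused location estimate
    var = 1.0 / (1.0 / var_v + 1.0 / var_a)            # fused variance (never larger)
    return mu, var

# Example: a reliable visual cue at 10 deg combined with a noisier auditory cue at 20 deg
mu, var = integrate_cues(mu_v=10.0, var_v=4.0, mu_a=20.0, var_a=16.0)
print(f"fused estimate = {mu:.1f} deg, variance = {var:.1f} deg^2")  # 12.0 deg, 3.2 deg^2
```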
Perceptual and neural basis of sound-symbolic crossmodal correspondences
Development of multisensory perception and attention and their role in audiovisual speech processing
Music training effects on multisensory and cross-sensory transfer processing: from cross-sectional to RCT studies
Brain dynamics underlying memory for continuous natural events
The world confronts our senses with a continuous stream of rapidly changing information. Yet we experience life as a series of episodes or events, and in memory these pieces seem to become even further organized. How do we recall and give structure to this complex information? Recent studies have begun to examine these questions using naturalistic stimuli and behavior: subjects view audiovisual movies and then freely recount aloud their memories of the events. We find brain activity patterns that are unique to individual episodes and that reappear during verbal recollection; robust generalization of these patterns across people; and memory effects driven by the structure of links between events in a narrative. Together, these findings build a picture of how we comprehend and recall real-world events that unfold continuously across time.
Bayesian integration of audiovisual speech by DNN models is similar to human observers
COSYNE 2025
Differential brain processes of newly-learned and overlearned audiovisual associations
FENS Forum 2024
Dynamic and additive audiovisual integration in mice
FENS Forum 2024
Mesoscale synergy and redundancy in ferret sensory cortices during an audiovisual task
FENS Forum 2024
The neural processing of natural audiovisual speech in noise in autism: A TRF approach
FENS Forum 2024