
Audiovisual

Topic spotlight · World Wide

Discover seminars, jobs, and research tagged with audiovisual across World Wide.
12 curated items: 7 seminars, 5 ePosters
Updated about 2 years ago
Seminar · Neuroscience · Recording

Using Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception

Ashley Schormans
BrainsCAN, Western University, Canada
Sep 26, 2023

To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive them as a single event, even though they occurred separately. In recent years, our lab and others have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models demonstrate metrics consistent with those of humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model for studying the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology, and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
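
As a point of reference for how SJ tasks like these are typically quantified, here is a minimal sketch (not the lab's actual pipeline; all data below are invented) of estimating the point of subjective simultaneity (PSS) and a temporal binding window from synchrony-judgment data:

```python
# Illustrative sketch (not the authors' pipeline): fit a Gaussian to the
# proportion of "synchronous" responses across stimulus onset asynchronies
# (SOAs) to estimate the PSS and a temporal binding window.
import numpy as np
from scipy.optimize import curve_fit

def sj_gaussian(soa, amp, pss, sigma):
    """Proportion of 'synchronous' reports as a Gaussian over SOA (ms)."""
    return amp * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

# Hypothetical SOAs (negative = auditory first) and synchronous-report rates.
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_sync = np.array([0.05, 0.15, 0.55, 0.80, 0.90, 0.85, 0.70, 0.25, 0.10])

(amp, pss, sigma), _ = curve_fit(sj_gaussian, soas, p_sync,
                                 p0=[1.0, 0.0, 100.0])
print(f"PSS ~ {pss:.0f} ms; binding window (±1 SD) ~ ±{sigma:.0f} ms")
```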

Seminar · Neuroscience · Recording

How does seeing help listening? Audiovisual integration in Auditory Cortex

Jennifer Bizley
University College London
Dec 1, 2021

Multisensory responses are ubiquitous in so-called unisensory cortex. However, despite their prevalence, we have very little understanding of what, if anything, they contribute to perception. In this talk I will focus on audiovisual integration in auditory cortex. Anatomical tracing studies highlight visual cortex as one source of visual input to auditory cortex. Using cortical cooling, we test the hypothesis that these inputs support audiovisual integration in ferret auditory cortex. Behavioural studies in humans support the idea that visual stimuli can help listeners to parse an auditory scene. This effect is paralleled in single units in auditory cortex, where responses to a sound mixture can be determined by the timing of a visual stimulus, such that sounds that are temporally coherent with a visual stimulus are preferentially represented. Our recent data therefore support the idea that one role for the early integration of auditory and visual signals in auditory cortex is to support auditory scene analysis, and that visual cortex plays a key role in this process.
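
To make the temporal-coherence idea concrete, here is a toy sketch (illustrative only; the signals below are synthetic, not the study's stimuli) in which a visual envelope correlates more strongly with the sound that shares its slow dynamics than with a competing sound:

```python
# Toy illustration of audiovisual temporal coherence: a visual luminance
# envelope should correlate more with the amplitude envelope of the coherent
# sound in a mixture than with a competing sound. All signals are simulated.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 500)                      # 5 s sampled at 100 Hz

slow_mod = np.abs(np.sin(2 * np.pi * 1.0 * t))  # shared slow (1 Hz) dynamics
visual_env = slow_mod + 0.1 * rng.standard_normal(t.size)
coherent_snd = slow_mod + 0.1 * rng.standard_normal(t.size)
competing_snd = (np.abs(np.sin(2 * np.pi * 1.7 * t + 1.3))
                 + 0.1 * rng.standard_normal(t.size))

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print(f"visual vs coherent sound:  r = {corr(visual_env, coherent_snd):.2f}")
print(f"visual vs competing sound: r = {corr(visual_env, competing_snd):.2f}")
```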

Seminar · Neuroscience

Looking and listening while moving

Tom Freeman
Cardiff University
Nov 16, 2021

In this talk I'll discuss our recent work on how visual and auditory cues to space are integrated as we move. There are at least three reasons why this turns out to be a difficult problem for the brain to solve (and for us to understand!). First, vision and hearing start off in different coordinates (eye-centred vs head-centred), so they need a common reference frame in which to communicate. The literature has neatly sidestepped this problem by preventing eye and head movements, yet self-movement is the norm. Second, self-movement creates visual and auditory image motion, so correct interpretation requires some form of compensation. Third, vision and hearing encode motion in very different ways: vision contains dedicated motion detectors sensitive to speed, whereas hearing does not. We propose that some (all?) of these problems could be solved by considering the perception of audiovisual space as the integration of separate body-centred visual and auditory cues, the latter formed by integrating image motion with motor-system signals and vestibular information. To test this claim, we use a classic cue integration framework, modified to account for cues that are biased and partially correlated. We find good evidence for the model based on simple judgements of audiovisual motion within a circular array of speakers and LEDs that surround the participant while they execute self-controlled head movements.
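
For context, the classic framework referred to here is minimum-variance (reliability-weighted) cue combination. A minimal sketch with invented numbers, and without the bias and correlation extensions the talk describes:

```python
# Minimal sketch of classic reliability-weighted cue integration (the talk's
# model additionally handles biased, partially correlated cues; this does not).
def integrate(s_vis, var_vis, s_aud, var_aud):
    """Combine visual and auditory location estimates (independent, unbiased).

    Weights are inverse variances; the fused estimate has lower variance
    than either cue alone.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_aud)
    w_aud = 1 - w_vis
    s_hat = w_vis * s_vis + w_aud * s_aud
    var_hat = 1 / (1 / var_vis + 1 / var_aud)
    return s_hat, var_hat

# Hypothetical azimuth estimates (degrees) and cue variances.
s_hat, var_hat = integrate(s_vis=10.0, var_vis=4.0, s_aud=20.0, var_aud=16.0)
print(f"fused azimuth = {s_hat:.1f} deg, variance = {var_hat:.1f}")
# With var_vis < var_aud, the fused estimate (12 deg) sits closer to vision.
```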

Seminar · Neuroscience · Recording

Perceptual and neural basis of sound-symbolic crossmodal correspondences

Krish Sathian
Penn State Health Milton S. Hershey Medical Center, Pennsylvania State University
Oct 27, 2021

Seminar · Neuroscience · Recording

Development of multisensory perception and attention and their role in audiovisual speech processing

David Lewkowicz
Haskins Labs & Yale Child Study Ctr.
Oct 20, 2021

Seminar · Neuroscience · Recording

Music training effects on multisensory and cross-sensory transfer processing: from cross-sectional to RCT studies

Karin Petrini
University of Bath
Sep 8, 2021

Seminar · Neuroscience · Recording

Brain dynamics underlying memory for continuous natural events

Janice Chen
Johns Hopkins
Aug 20, 2020

The world confronts our senses with a continuous stream of rapidly changing information. Yet, we experience life as a series of episodes or events, and in memory these pieces seem to become even further organized. How do we recall and give structure to this complex information? Recent studies have begun to examine these questions using naturalistic stimuli and behavior: subjects view audiovisual movies and then freely recount aloud their memories of the events. We find brain activity patterns that are unique to individual episodes, and which reappear during verbal recollection; robust generalization of these patterns across people; and memory effects driven by the structure of links between events in a narrative. These findings construct a picture of how we comprehend and recall real-world events that unfold continuously across time.
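
A schematic of the encoding-recall pattern-similarity logic described here (an illustrative sketch with simulated data, not the authors' analysis code): reinstatement appears as higher same-event than different-event correlations.

```python
# Illustrative sketch of encoding-recall pattern similarity: each event gets
# one multi-voxel pattern at movie viewing and one at spoken recall;
# reinstatement shows up as high on-diagonal correlations.
import numpy as np

rng = np.random.default_rng(1)
n_events, n_voxels = 5, 200

encoding = rng.standard_normal((n_events, n_voxels))
# Simulate recall patterns as noisy reinstatements of the encoding patterns.
recall = encoding + 0.8 * rng.standard_normal((n_events, n_voxels))

# Event-by-event correlation matrix (rows: encoding, columns: recall).
sim = np.corrcoef(encoding, recall)[:n_events, n_events:]
matched = np.diag(sim).mean()                       # same-event similarity
mismatched = sim[~np.eye(n_events, dtype=bool)].mean()
print(f"same event r = {matched:.2f}; different events r = {mismatched:.2f}")
```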

ePoster

Bayesian integration of audiovisual speech by DNN models is similar to human observers

Haotian Ma, Xiang Zhang, Zhengjia Wang, John F. Magnotti, Michael S. Beauchamp

COSYNE 2025

ePoster

Differential brain processes of newly-learned and overlearned audiovisual associations

Weiyong Xu, Xueqiao Li, Orsolya Kolozsvari, Aino Sorsa, Miriam Nokia, Jarmo Hämäläinen

FENS Forum 2024

ePoster

Dynamic and additive audiovisual integration in mice

George Booth, Timothy Sit, Célian Bimbard, Flóra Takács, Philip Coen, Kenneth Harris, Matteo Carandini

FENS Forum 2024

ePoster

Mesoscale synergy and redundancy in ferret sensory cortices during an audiovisual task

Loren Kocillari, Edgar Galindo-Leon, Florian Pieper, Stefano Panzeri, Andreas Engel

FENS Forum 2024

ePoster

The neural processing of natural audiovisual speech in noise in autism: A TRF approach

Theo Vanneau, Michael Crosse, John Foxe, Sophie Molholm

FENS Forum 2024