Auditory Stimuli
Multisensory influences on vision: Sounds enhance and alter visual-perceptual processing
Visual perception is traditionally studied in isolation from other sensory systems. While this approach has been exceptionally successful, in the real world visual objects are often accompanied by sounds, smells, tactile sensations, or tastes. How is visual processing influenced by these other sensory inputs? In this talk, I will review studies from our lab showing that a sound can influence the perception of a visual object in multiple ways. In the first part, I will focus on spatial interactions between sound and sight, demonstrating that co-localized sounds enhance visual perception. Then, I will show that these cross-modal interactions also occur at a higher contextual and semantic level, where naturalistic sounds facilitate the processing of real-world objects that match these sounds. Throughout my talk, I will explore to what extent sounds not only improve visual processing but also alter perceptual representations of the objects we see. Most broadly, I will argue for the importance of considering multisensory influences on visual perception for a more complete understanding of our visual experience.
It’s All About Motion: Functional organization of the multisensory motion system at 7T
The human middle temporal complex (hMT+) plays a crucial role in detecting and processing the direction and speed of visual motion. In both humans and monkeys, it has been extensively investigated in terms of its retinotopic properties and its selectivity for the direction of moving stimuli; however, only in recent years has there been increasing interest in how neurons in MT encode the speed of motion. In this talk, I will explore the proposed mechanism of speed encoding, asking whether hMT+ neuronal populations encode stimulus speed directly or whether they separate motion into its spatial and temporal components. I will characterize how neuronal populations in hMT+ encode the speed of moving visual stimuli using electrocorticography (ECoG) and 7T fMRI. I will show that the neuronal populations measured in hMT+ are not directly tuned to stimulus speed, but instead encode speed through separate and independent spatial and temporal frequency tuning. Finally, I will suggest that this mechanism may play a role in evaluating multisensory responses to visual, tactile, and auditory stimuli in hMT+.
Feeding Experimentation Device ver3 (FED3)
FED3 is a device for behavioral training of mice in vivarium home-cages. Mice interact with FED3 through two nose-pokes, and FED3 responds with visual stimuli, auditory stimuli, and by dispensing pellets. Because it operates in the home-cage, FED3 enables around-the-clock training of mice over several weeks. FED3 is open-source and can be built by users for roughly 10-20x less than commercial solutions for training mice. The control code is also open-source and was designed to be easily modified by users.
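The poke-stimulus-pellet loop described above can be illustrated with a short sketch. The Python snippet below is a hypothetical, simplified simulation of a fixed-ratio-1 (FR1) session; the function name, event format, and reward rule are illustrative assumptions, not the actual FED3 control code or its Arduino library API.

```python
# Hypothetical simulation of a fixed-ratio-1 (FR1) session on a
# two-nose-poke device like FED3. In FR1, each poke on the "active"
# side triggers the conditioned stimuli and dispenses one pellet;
# pokes on the inactive side are logged but unrewarded.

def run_fr1_session(pokes, active_side="left"):
    """Tally per-side pokes and pellets for a sequence of poke events.

    pokes: iterable of "left"/"right" strings, one per nose-poke.
    Returns a dict with poke counts and pellets dispensed.
    """
    counts = {"left": 0, "right": 0, "pellets": 0}
    for side in pokes:
        if side not in ("left", "right"):
            raise ValueError(f"unknown poke side: {side!r}")
        counts[side] += 1
        if side == active_side:
            # On the real device, this is where it would present the
            # visual/auditory stimuli and dispense a pellet.
            counts["pellets"] += 1
    return counts


# Example: two left (active) pokes and one right poke.
print(run_fr1_session(["left", "right", "left"]))
# → {'left': 2, 'right': 1, 'pellets': 2}
```

Because the device runs unattended in the home-cage, the real control code also timestamps and logs every event; this sketch omits logging to keep the reward contingency visible.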
Metastable circuit dynamics explains optimal coding of auditory stimuli at moderate arousals
COSYNE 2022
Auditory stimuli reduce fear responses in a safety learning protocol independent of a possible learning process
FENS Forum 2024
A set of rhythmic features determines the neuronal representation and perception of pulsed auditory stimuli
FENS Forum 2024