
Auditory System


Discover seminars, jobs, and research tagged with auditory system across World Wide.
13 curated items · 9 Seminars · 4 ePosters
Updated about 2 months ago
Seminar · Neuroscience

Competing Rhythms: Understanding and Modulating Auditory Neural Entrainment

Dr. Yuranny Cabral-Calderin
Freie Universität Berlin, Germany
Oct 7, 2025
Seminar · Neuroscience · Recording

Space and its computational challenges

Jennifer Groh
Duke University
Nov 17, 2021

How our senses work both separately and together involves rich computational problems. I will discuss the spatial and representational problems faced by the visual and auditory systems, focusing on two issues. (1) How does the brain correct for discrepancies between the visual and auditory spatial reference frames? I will describe our recent discovery of a novel type of otoacoustic emission, the eye-movement-related eardrum oscillation, or EMREO (Gruters et al., PNAS 2018). (2) How does the brain encode more than one stimulus at a time? I will discuss evidence for neural time-division multiplexing, in which neural activity fluctuates across time to allow representations to encode more than one simultaneous stimulus (Caruso et al., Nat Comm 2018). These findings all emerged from experimentally testing computational models of spatial representations and their transformations within and across sensory pathways. Further, they speak to several general problems confronting modern neuroscience, such as the hierarchical organization of brain pathways and limits on perceptual/cognitive processing.
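
For a concrete picture of what time-division multiplexing predicts, the toy simulation below (not the authors' code; the firing rates and switching probability are made-up values) draws Poisson spike counts from a rate that switches between the two single-stimulus rates, and contrasts the result with simply averaging the rates.

```python
"""Minimal sketch of neural time-division multiplexing: with two
simultaneous stimuli, the firing rate is assumed to switch between
the two single-stimulus rates rather than settle on their average."""
import numpy as np

rng = np.random.default_rng(0)

rate_a, rate_b = 40.0, 10.0   # spikes/s for stimulus A alone and B alone (assumed)
bin_dur = 0.05                # 50 ms counting bins
n_bins = 2000

# Slow switching signal: in each bin the neuron "attends" A or B,
# with some persistence so switches are slower than the bin rate.
state = np.zeros(n_bins, dtype=int)
for t in range(1, n_bins):
    state[t] = state[t - 1] if rng.random() < 0.9 else 1 - state[t - 1]

rates = np.where(state == 0, rate_a, rate_b)
counts_ab = rng.poisson(rates * bin_dur)                 # dual-stimulus response
counts_avg = rng.poisson(((rate_a + rate_b) / 2) * bin_dur, size=n_bins)  # rate averaging

# Multiplexing predicts extra variance (a mixture of two Poisson
# distributions), not just the variance of one intermediate rate.
print("dual-stimulus mean/var:", counts_ab.mean(), counts_ab.var())
print("rate-averaging mean/var:", counts_avg.mean(), counts_avg.var())
```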

Seminar · Neuroscience · Recording

Encoding and perceiving the texture of sounds: auditory midbrain codes for recognizing and categorizing auditory texture and for listening in noise

Monty Escabi
University of Connecticut
Sep 30, 2021

Natural soundscapes, such as those of a forest, a busy restaurant, or a busy intersection, are generally composed of a cacophony of sounds that the brain needs to interpret either independently or collectively. In certain instances, sounds - such as those from moving cars, sirens, and people talking - are perceived in unison and recognized collectively as a single sound (e.g., city noise). In other instances, such as in the cocktail party problem, multiple sounds compete for attention, so that the surrounding background noise (e.g., speech babble) interferes with the perception of a single sound source (e.g., a single talker). I will describe results from my lab on the perception and neural representation of auditory textures. Textures, such as those of a babbling brook, restaurant noise, or speech babble, are stationary sounds consisting of multiple independent sound sources that can be quantitatively defined by the summary statistics of an auditory model (McDermott & Simoncelli 2011). How and where in the auditory system summary statistics are represented, and which neural codes potentially contribute to their perception, however, remain largely unknown.

Using high-density multi-channel recordings from the auditory midbrain of unanesthetized rabbits and complementary perceptual studies in human listeners, I will first describe neural and perceptual strategies for encoding and perceiving auditory textures. I will demonstrate how distinct sound statistics, including the sound spectrum and higher-order statistics related to the temporal and spectral correlation structure of sounds, contribute to texture perception and are reflected in neural activity. Using decoding methods, I will then demonstrate how various low- and high-order neural response statistics differentially contribute to a variety of auditory tasks, including texture recognition, discrimination, and categorization. Finally, I will show examples from our recent studies of how high-order sound statistics and the accompanying neural activity underlie difficulties in recognizing speech in background noise.
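
As a concrete illustration of what such summary statistics look like, here is a minimal sketch in the spirit of McDermott & Simoncelli (2011); the filterbank, band spacing, and choice of moments are simplified assumptions, not the speaker's implementation, which uses a much denser, cochlea-like filterbank and a richer statistic set.

```python
"""Sketch of auditory texture summary statistics: bandpass the sound
into subbands, take their envelopes, and summarize each subband by
its marginal moments plus the cross-band envelope correlations."""
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def texture_statistics(x, fs, centers=(200, 400, 800, 1600, 3200)):
    """Return per-band envelope moments and the band-by-band correlation matrix.
    `centers` is an arbitrary choice of one-octave band centre frequencies."""
    envs = []
    for fc in centers:
        lo, hi = fc / 2 ** 0.5, fc * 2 ** 0.5
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        sub = sosfiltfilt(sos, x)
        envs.append(np.abs(hilbert(sub)))       # subband envelope
    envs = np.array(envs)
    moments = {
        "mean": envs.mean(axis=1),
        "var": envs.var(axis=1),
        "skew": ((envs - envs.mean(axis=1, keepdims=True)) ** 3).mean(axis=1)
                / envs.std(axis=1) ** 3,
    }
    corr = np.corrcoef(envs)                    # cross-band envelope correlations
    return moments, corr

# Toy usage: 2 s of Brownian-type noise stands in for a recorded texture.
fs = 16000
rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(fs * 2))
x -= x.mean()
moments, corr = texture_statistics(x, fs)
print(moments["mean"].round(2))
print(corr.round(2))
```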

Seminar · Neuroscience · Recording

Direction selectivity in hearing: monaural phase sensitivity in octopus neurons

Philip Joris
KU Leuven
May 16, 2021

The processing of temporal sound features is fundamental to hearing, and the auditory system displays a plethora of specializations, at many levels, to enable such processing. Octopus neurons are the most extreme temporally specialized cells in the auditory (and perhaps the entire) brain, which makes them intriguing but also difficult to study. Notwithstanding the scant physiological data, these neurons have been a favorite cell type of modeling studies, which have proposed that octopus cells play critical roles in pitch and speech perception. We used a range of in vivo recording and labeling methods to examine the hypothesis that tonotopic ordering of cochlear afferents combines with dendritic delays to compensate for cochlear delay - which would explain the highly entrained responses of octopus cells to sound transients. Unexpectedly, the experiments revealed that these neurons show marked selectivity for the direction of fast frequency glides, which is tied in a surprising way to intrinsic membrane properties and subthreshold events. The data suggest that octopus cells perform temporal comparisons across frequency and may contribute to auditory scene analysis.
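
To make the delay-compensation idea concrete, the toy model below (my own illustration; the delays, channel count, and kernel width are assumed values, and no claim is made about which glide direction real octopus cells prefer) sums frequency-channel inputs that carry channel-dependent delays and compares the summed peak for upward versus downward glides: the direction whose timing lines up with the delays produces the more coincident, larger peak.

```python
"""Toy coincidence-detector sketch: frequency channels carry
channel-dependent delays (standing in for cochlear travel time), and
an octopus-like unit simply sums its inputs."""
import numpy as np

fs = 100_000                                 # 10 us resolution
n_chan = 20
cf = np.geomspace(500, 8000, n_chan)         # channel centre frequencies (Hz)
coch_delay = 0.004 * (500 / cf) ** 0.5       # assumed: longer delay toward low CFs (s)

def summed_peak(direction, glide_dur=0.002):
    """Peak of the summed input for a glide sweeping all channels in `glide_dur` s."""
    order = np.argsort(cf) if direction == "up" else np.argsort(cf)[::-1]
    onset = np.linspace(0.0, glide_dur, n_chan)      # when each channel is hit
    arrival = onset + coch_delay[order]              # plus that channel's delay
    t = np.arange(0, 0.02, 1 / fs)
    drive = np.zeros_like(t)
    for a in arrival:                                # 0.2 ms EPSP-like bump per channel
        drive += np.exp(-0.5 * ((t - a) / 0.0002) ** 2)
    return drive.max()

print("upward glide peak:  ", round(summed_peak("up"), 2))
print("downward glide peak:", round(summed_peak("down"), 2))
```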

Seminar · Neuroscience · Recording

"Distinct forms of cortical plasticity underlie difficulties to reliably detect sounds in noisy environments" / "Acoustic context modulates natural sound discrimination in auditory cortex through frequency specific adaptation"

Dr. Jennifer Resnik; Dr. Julio Hechavarria
Ben-Gurion University; Goethe University
Feb 22, 2021
Seminar · Neuroscience

Critical periods for plasticity in the developing auditory system

Tania Barkat
Basel University, Switzerland
Jan 17, 2021
Seminar · Neuroscience

Recurrent corticothalamic feedback in the auditory system: perceptual salience and dopaminergic modulation

Max Happel
Leibniz Institute for Neurobiology, Magdeburg, Germany
Oct 4, 2020
Seminar · Neuroscience

Neural coding in the auditory cortex - Emergent Scientists Seminar Series

Dr Jennifer Lawlor & Mr Aleksandar Ivanov
Johns Hopkins University / University of Oxford
Jul 16, 2020

Dr Jennifer Lawlor
Title: Tracking changes in complex auditory scenes along the cortical pathway

Complex acoustic environments, such as a busy street, are characterised by their ever-changing dynamics. Despite this complexity, listeners can readily tease apart relevant changes from irrelevant variations. This requires continuously tracking the appropriate sensory evidence while discarding noisy acoustic variation. Despite the apparent simplicity of this perceptual phenomenon, the neural basis of extracting relevant information from complex continuous streams for goal-directed behavior is currently not well understood. As a minimalistic model of change detection in complex auditory environments, we designed broad-range tone clouds whose first-order statistics change at a random time. Subjects (humans or ferrets) were trained to detect these changes. They faced the dual task of estimating the baseline statistics and detecting a potential change in those statistics at any moment.

To characterize the extraction and encoding of relevant sensory information along the cortical hierarchy, we first recorded the brain electrical activity of human subjects engaged in this task using electroencephalography. Human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of the baseline statistics. Change-locked and decision-related EEG responses were found at a centro-parietal scalp location, with a slope that depended on change size, consistent with sensory evidence accumulation. To further this investigation, we performed a series of electrophysiological recordings in the primary auditory cortex (A1), secondary auditory cortex (PEG), and frontal cortex (FC) of fully trained, behaving ferrets. A1 neurons exhibited strong onset responses and change-related discharges specific to their tuning. The PEG population showed reduced onset responses but more categorical change-related modulation. Finally, a subset of FC neurons (dlPFC/premotor) showed a generalized response to all change-related events only during behavior. Using a generalized linear model (GLM), we show that the same FC subpopulation encodes both sensory and decision signals, suggesting that FC neurons could convert sensory evidence into a perceptual decision. Altogether, these area-specific responses suggest a behavior-dependent mechanism for the extraction and generalization of task-relevant events.

Aleksandar Ivanov
Title: How does the auditory system adapt to different environments? A song of echoes and adaptation
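
As a rough illustration of the kind of GLM analysis mentioned above (the regressor names and the simulated data are mine, not the authors'), the sketch below fits a Poisson GLM of single-neuron spike counts on a sensory regressor (change size) and a decision regressor (the subject's choice):

```python
"""Sketch: Poisson GLM testing whether one neuron's spike counts
carry both a sensory signal (change size) and a decision signal (choice)."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_trials = 400

change_size = rng.uniform(0, 1, n_trials)    # sensory regressor (a.u.)
choice = (change_size + rng.normal(0, 0.3, n_trials) > 0.5).astype(float)  # decision regressor

# Simulated ground truth: this cell carries both signals.
log_rate = 0.8 + 1.2 * change_size + 0.7 * choice
counts = rng.poisson(np.exp(log_rate))

X = sm.add_constant(np.column_stack([change_size, choice]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.summary(xname=["intercept", "change_size", "choice"]))
```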

ePoster

Transformation of population representations of sounds throughout the auditory system

COSYNE 2022

ePoster

Predictive dynamics improve noise robustness in a deep network model of the human auditory system

Ching Fang, Erica Shook, Justin Buck, Guillermo Horga

COSYNE 2023

ePoster

Describing neural encoding from large-scale brain recordings: A deep learning model of the central auditory system

Fotios Drakopoulos, Yiqing Xia, Andreas Fragner, Nicholas A Lesica

FENS Forum 2024