auditory cortex
Dr. Michele Insanally
The Insanally Lab is hiring postdocs to study the neural basis of auditory perception and learning. We use a wide range of techniques, including behavioral paradigms, in vivo multi-region neural recordings, optogenetics, chemogenetics, fiber photometry, and novel computational methods. Our lab is super supportive and collaborative, and we take mentoring seriously! Located at Pitt, our lab is part of a large systems neuroscience community that includes the CNBC and CMU. For inquiries, feel free to reach out to me at mni@pitt.edu. To find out more about our work, visit Insanallylab.com.
Neural mechanisms of optimal performance
When we attend to a demanding task, our performance is poor at low arousal (when drowsy) or high arousal (when anxious), but optimal at intermediate arousal. This celebrated inverted-U relationship between arousal and performance, the Yerkes-Dodson law, is colloquially captured by the phrase "in the zone." In this talk, I will elucidate the behavioral and neural mechanisms linking arousal and performance under the Yerkes-Dodson law in a mouse model. During decision-making tasks, mice express an array of discrete strategies, whereby the optimal strategy occurs at intermediate arousal, measured by pupil size, consistent with the inverted-U law. Population recordings from the primary auditory cortex (A1) further revealed that sound encoding is optimal at intermediate arousal. To explain the computational principle underlying this inverted-U law, we modeled the A1 circuit as a spiking network with excitatory/inhibitory clusters, based on the functional clusters observed in A1. Arousal induced a transition from a multi-attractor phase (low arousal) to a single-attractor phase (high arousal), and performance was optimized at the transition point. The model also predicts stimulus- and arousal-induced modulations of neural variability, which we confirmed in the data. Our theory suggests that a single unifying dynamical principle, phase transitions in metastable dynamics, underlies both the inverted-U law of optimal performance and state-dependent modulations of neural variability.
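To make the attractor-phase intuition concrete, here is a minimal rate-network toy (my own illustrative construction, not the authors' spiking model; the cluster sizes, coupling strengths, and "arousal" drive are all assumed parameters). Random initial conditions settle into many cluster attractors at low arousal but collapse onto a single attractor when the arousal drive is large:

```python
# Toy clustered network: tau dr/dt = -r + tanh(J r + arousal).
# Activity is a signed deviation from baseline, so negative values are allowed.
import numpy as np

rng = np.random.default_rng(0)
n_clusters, per_cluster = 4, 10
N = n_clusters * per_cluster
labels = np.repeat(np.arange(n_clusters), per_cluster)

# Strong excitation within a cluster, uniform global inhibition across units.
J = np.where(labels[:, None] == labels[None, :], 1.8, 0.0) / per_cluster
J -= 1.5 / N

def settle(arousal, r0, steps=2000, dt=0.05):
    """Euler-integrate the rate equation and return the final state."""
    r = r0.copy()
    for _ in range(steps):
        r += dt * (-r + np.tanh(J @ r + arousal))
    return r

for arousal in (0.0, 0.3, 0.8):
    ends = [settle(arousal, rng.uniform(0, 1, N)) for _ in range(20)]
    # Fingerprint each endpoint by its rounded cluster-mean activity.
    distinct = {tuple(np.round([e[labels == c].mean() for c in range(n_clusters)], 1))
                for e in ends}
    print(f"arousal={arousal}: {len(distinct)} distinct attractor(s) found")
```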
The representation of speech conversations in the human auditory cortex
Internal representation of musical rhythm: transformation from sound to periodic beat
When listening to music, humans readily perceive and move along with a periodic beat. Critically, perception of a periodic beat is commonly elicited by rhythmic stimuli with physical features arranged in a way that is not strictly periodic. Hence, beat perception must capitalize on mechanisms that transform stimulus features into a temporally recurrent format with emphasized beat periodicity. Here, I will present a line of work that aims to clarify the nature and neural basis of this transformation. In these studies, electrophysiological activity was recorded as participants listened to rhythms known to induce perception of a consistent beat across healthy Western adults. The results show that the human brain selectively emphasizes beat representation when it is not acoustically prominent in the stimulus, and this transformation (i) can be captured non-invasively using surface EEG in adult participants, (ii) is already in place in 5- to 6-month-old infants, and (iii) cannot be fully explained by subcortical auditory nonlinearities. Moreover, as revealed by human intracerebral recordings, a prominent beat representation emerges already in the primary auditory cortex. Finally, electrophysiological recordings from the auditory cortex of a rhesus monkey show a significant enhancement of beat periodicities in this area, similar to humans. Taken together, these findings indicate an early, general auditory cortical stage of processing by which rhythmic inputs are rendered more temporally recurrent than they are in reality. Already present in non-human primates and human infants, this "periodized" default format could then be shaped by higher-level associative sensory-motor areas and guide movement in individuals with strongly coupled auditory and motor systems. Together, this highlights the multiplicity of neural processes supporting coordinated musical behaviors widely observed across human cultures.

The experiments herein include: a motor timing task comparing the effects of movement vs. non-movement with and without feedback (Exp. 1A & 1B), a transcranial magnetic stimulation (TMS) study on the role of the supplementary motor area (SMA) in transforming temporal information (Exp. 2), and a perceptual timing task investigating the effect of noisy movement on time perception with both visual and auditory modalities (Exp. 3A & 3B). Together, the results of these studies support the Bayesian cue combination framework, in that: movement improves the precision of time perception not only in perceptual timing tasks but also in motor timing tasks (Exp. 1A & 1B), stimulating the SMA appears to disrupt the transformation of temporal information (Exp. 2), and when movement becomes unreliable or noisy there is no longer an improvement in the precision of time perception (Exp. 3A & 3B). Although there is support for the proposed framework, more studies (e.g., fMRI, TMS, EEG) need to be conducted in order to better understand where and how this may be instantiated in the brain; however, this work provides a starting point for better understanding the intrinsic connection between time and movement.
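The Bayesian cue combination framework referenced above can be summarized by the standard reliability-weighted fusion equations (a textbook result, stated here for orientation rather than taken from the talk). With a sensory time estimate $t_s$ (variance $\sigma_s^2$) and a movement-based estimate $t_m$ (variance $\sigma_m^2$), the optimal combined estimate is

\[
\hat{t} = \frac{t_s/\sigma_s^2 + t_m/\sigma_m^2}{1/\sigma_s^2 + 1/\sigma_m^2},
\qquad
\sigma_{\hat{t}}^2 = \left(\frac{1}{\sigma_s^2} + \frac{1}{\sigma_m^2}\right)^{-1} \le \min(\sigma_s^2, \sigma_m^2).
\]

Adding a reliable movement cue therefore always improves precision, while a noisy movement cue ($\sigma_m^2 \to \infty$) yields no improvement, consistent with the pattern reported in Exp. 3A & 3B.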
Unravelling bistable perception from human intracranial recordings
Discovering dynamical patterns in high-fidelity time series is typically challenging. In this talk, the time series data consist of neural recordings taken from the auditory cortex of human subjects who listened to sequences of repeated triplets of tones and reported their perception by pressing a button. Subjects reported spontaneous alternations between two auditory perceptual states (1-stream and 2-streams). We discuss a data-driven method, which leverages time-delayed coordinates, diffusion maps, and dynamic mode decomposition, to identify neural features that correlate with subject-reported switching between perceptual states.
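As a concrete illustration of this pipeline, the sketch below applies the same three ingredients (time-delayed coordinates, a diffusion map, and dynamic mode decomposition) to a synthetic switching signal; the data, embedding dimension, and kernel bandwidth are all stand-ins, not the recordings or parameters from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1200) * 0.02
state = (np.sin(0.2 * t) > 0).astype(float)      # hidden "percept" that switches
x = np.sin(2 * np.pi * (1 + state) * t) + 0.1 * rng.standard_normal(t.size)

def delay_embed(sig, n_delays):
    """Stack lagged copies of sig into rows of a trajectory matrix."""
    T = sig.size - n_delays + 1
    return np.column_stack([sig[i:i + T] for i in range(n_delays)])

X = delay_embed(x, 40)                           # 1) time-delayed coordinates

# 2) Diffusion map: Gaussian affinities, row normalization, leading eigenvectors.
sq = (X ** 2).sum(1)
d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
K = np.exp(-d2 / np.median(d2))
P = K / K.sum(1, keepdims=True)
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
psi = evecs[:, order[1:3]].real                  # two leading nontrivial coordinates

# 3) DMD on the diffusion coordinates: fit psi[t+1] ~ psi[t] @ A.
A, *_ = np.linalg.lstsq(psi[:-1], psi[1:], rcond=None)
print("DMD eigenvalues (moduli near 1 = slow modes):", np.linalg.eigvals(A.T))

# A coordinate that tracks the hidden state is the kind of feature sought above.
s = state[:psi.shape[0]]
print("corr(coords, state):",
      [round(np.corrcoef(psi[:, i], s)[0, 1], 2) for i in range(2)])
```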
Hearing in an acoustically varied world
In order for animals to thrive in their complex environments, their sensory systems must form representations of objects that are invariant to changes in some dimensions of their physical cues. For example, we can recognize a friend's speech in a forest, a small office, and a cathedral, even though the sound reaching our ears is very different in these three environments. I will discuss our recent experiments on how neurons in auditory cortex can form stable representations of sounds in this acoustically varied world. We began by using a normative computational model of hearing to examine how the brain may recognize a sound source across rooms with different levels of reverberation. The model predicted that reverberation can be removed from the incoming sound by delaying the inhibitory component of spectrotemporal receptive fields more when reverberation is stronger. Our electrophysiological recordings then confirmed that neurons in ferret auditory cortex apply this algorithm to adapt to different room sizes. Our results demonstrate that this neural process is dynamic and adaptive. These studies provide new insights into how we can recognize auditory objects even in highly reverberant environments, and motivate further questions about how reverb adaptation is implemented in the cortical circuit.
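The dereverberation idea has a simple signal-processing core, sketched below with made-up numbers (this is not the lab's normative model or its fitted receptive fields): an excitatory tap followed by a delayed, gain-matched inhibitory tap cancels the exponential tail a room adds, and stronger reverberation is paired with a longer inhibitory delay:

```python
import numpy as np

fs = 1000                                        # Hz, assumed sampling rate

def room_tail(tau, dur=1.0):
    """Exponentially decaying room impulse response, decay constant tau (s)."""
    return np.exp(-np.arange(int(fs * dur)) / (fs * tau))

def ei_kernel(delay, tau):
    """Excitation at t=0, inhibition at t=delay with gain matched to the decay."""
    k = np.zeros(int(fs * delay) + 1)
    k[0], k[-1] = 1.0, -np.exp(-delay / tau)
    return k

snd = np.zeros(fs); snd[50] = 1.0                # impulsive "dry" sound
for tau, delay in [(0.05, 0.01), (0.4, 0.04)]:   # stronger reverb -> longer delay
    wet = np.convolve(snd, room_tail(tau))
    out = np.convolve(wet, ei_kernel(delay, tau))
    cut = 50 + int(fs * delay) + 1               # skip the kernel's transient
    print(f"tau={tau}s: tail energy "
          f"{np.sum(wet[cut:]**2):.3g} -> {np.sum(out[cut:]**2):.3g}")
```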
How does seeing help listening? Audiovisual integration in Auditory Cortex
Multisensory responses are ubiquitous in so-called unisensory cortex. However, despite their prevalence, we have very little understanding of what, if anything, they contribute to perception. In this talk I will focus on audio-visual integration in auditory cortex. Anatomical tracing studies highlight visual cortex as one source of visual input to auditory cortex. Using cortical cooling, we test the hypothesis that these inputs support audiovisual integration in ferret auditory cortex. Behavioural studies in humans support the idea that visual stimuli can help listeners parse an auditory scene. This effect is paralleled in single units in auditory cortex, where responses to a sound mixture can be determined by the timing of a visual stimulus, such that sounds that are temporally coherent with a visual stimulus are preferentially represented. Our recent data therefore support the idea that one role for the early integration of auditory and visual signals in auditory cortex is to support auditory scene analysis, and that visual cortex plays a key role in this process.
Through the bottleneck: my adventures with the 'Tishby program'
One of Tali's cherished goals was to transform biology into physics. In his view, biologists were far too enamored of the details of the specific models they studied, losing sight of the big principles that may govern the behavior of these models. One such big principle that he suggested was the information bottleneck (IB) principle: an information-theoretic approach for extracting the relevant information that one random variable carries about another. Tali applied the IB principle to numerous problems in biology, gaining important insights in the process. Here I will describe two applications of the IB principle to neurobiological data. The first is the formalization of the notion of surprise that allowed us to rigorously estimate the memory duration and content of neuronal responses in auditory cortex, and the second is an application to behavior, allowing us to estimate 'optimal policies under information constraints' that shed interesting light on rat behavior.
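For orientation, the IB principle itself is compact enough to state in one line (standard formulation, not specific to this talk): given a Markov chain $Y - X - T$, one seeks the stochastic encoder $p(t|x)$ minimizing

\[
\mathcal{L}\big[p(t|x)\big] = I(X;T) - \beta\, I(T;Y),
\]

so that $T$ compresses $X$ (small $I(X;T)$) while preserving what $X$ says about the relevant variable $Y$ (large $I(T;Y)$); the multiplier $\beta$ sets the trade-off between compression and relevance.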
What is the function of auditory cortex when it develops in the absence of acoustic input?
Cortical plasticity is the neural mechanism by which the cerebrum adapts itself to its environment, while at the same time making it vulnerable to impoverished sensory or developmental experiences. Like the visual system, auditory development passes through a series of sensitive periods in which circuits and connections are established and then refined by experience. Current research is expanding our understanding of cerebral processing and organization in the deaf. In the congenitally deaf, higher-order areas of "deaf" auditory cortex demonstrate significant crossmodal plasticity with neurons responding to visual and somatosensory stimuli. This crucial cerebral function results in compensatory plasticity. Not only can the remaining inputs reorganize to substitute for those lost, but this additional circuitry also confers enhanced abilities to the remaining systems. In this presentation we will review our present understanding of the structure and function of “deaf” auditory cortex using psychophysical, electrophysiological, and connectional anatomy approaches and consider how this knowledge informs our expectations of the capabilities of cochlear implants in the developing brain.
Encoding and perceiving the texture of sounds: auditory midbrain codes for recognizing and categorizing auditory texture and for listening in noise
Natural soundscapes, such as those of a forest, a busy restaurant, or a busy intersection, are generally composed of a cacophony of sounds that the brain needs to interpret either independently or collectively. In certain instances, sounds - such as those from moving cars, sirens, and people talking - are perceived in unison and recognized collectively as a single sound (e.g., city noise). In other instances, such as the cocktail party problem, multiple sounds compete for attention, so that the surrounding background noise (e.g., speech babble) interferes with the perception of a single sound source (e.g., a single talker). I will describe results from my lab on the perception and neural representation of auditory textures. Textures, such as a babbling brook, restaurant noise, or speech babble, are stationary sounds consisting of multiple independent sound sources that can be quantitatively defined by summary statistics of an auditory model (McDermott & Simoncelli 2011). However, how and where summary statistics are represented in the auditory system, and which neural codes potentially contribute to their perception, remain largely unknown. Using high-density multi-channel recordings from the auditory midbrain of unanesthetized rabbits and complementary perceptual studies on human listeners, I will first describe neural and perceptual strategies for encoding and perceiving auditory textures. I will demonstrate how distinct statistics of sounds, including the sound spectrum and high-order statistics related to the temporal and spectral correlation structure of sounds, contribute to texture perception and are reflected in neural activity. Using decoding methods, I will then demonstrate how various low- and high-order neural response statistics can differentially contribute to a variety of auditory tasks, including texture recognition, discrimination, and categorization. Finally, I will show examples from our recent studies of how high-order sound statistics and the accompanying neural activity underlie difficulties in recognizing speech in background noise.
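To make "summary statistics of an auditory model" concrete, here is a minimal sketch in the spirit of McDermott & Simoncelli (2011): split a sound into frequency bands, extract amplitude envelopes, then measure envelope marginals and cross-band correlations. The brick-wall FFT filterbank and the white-noise stand-in are simplifying assumptions, not the cochlear model or stimuli from the work described:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import skew, kurtosis

fs = 16000
rng = np.random.default_rng(2)
x = rng.standard_normal(fs * 2)                   # stand-in "texture" (white noise)

def bandpass(sig, lo, hi):
    """Crude brick-wall band-pass filter via the FFT."""
    X = np.fft.rfft(sig)
    f = np.fft.rfftfreq(sig.size, 1 / fs)
    X[(f < lo) | (f >= hi)] = 0
    return np.fft.irfft(X, n=sig.size)

edges = np.geomspace(100, 8000, 9)                # 8 log-spaced bands
envs = np.array([np.abs(hilbert(bandpass(x, lo, hi)))
                 for lo, hi in zip(edges[:-1], edges[1:])])

stats = {
    "env_mean": envs.mean(1),                     # marginal statistics per band
    "env_cv":   envs.std(1) / envs.mean(1),
    "env_skew": skew(envs, axis=1),
    "env_kurt": kurtosis(envs, axis=1),
    "xband_corr": np.corrcoef(envs),              # spectral correlation structure
}
for name, v in stats.items():
    print(name, np.round(np.atleast_1d(v).ravel()[:4], 2))
```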
Expectation of self-generated sounds drives predictive processing in mouse auditory cortex
Sensory stimuli are often predictable consequences of one’s actions, and behavior exerts a correspondingly strong influence over sensory responses in the brain. Closed-loop experiments with the ability to control the sensory outcomes of specific animal behaviors have revealed that neural responses to self-generated sounds are suppressed in the auditory cortex, suggesting a role for prediction in local sensory processing. However, it is unclear whether this phenomenon derives from a precise movement-based prediction or how it affects the neural representation of incoming stimuli. We address these questions by designing a behavioral paradigm where mice learn to expect the predictable acoustic consequences of a simple forelimb movement. Neuronal recordings from auditory cortex revealed suppression of neural responses that was strongest for the expected tone and specific to the time of the sound-associated movement. Predictive suppression in the auditory cortex was layer-specific, preceded by the arrival of movement information, and unaffected by behavioral relevance or reward association. These findings illustrate that expectation, learned through motor-sensory experience, drives layer-specific predictive processing in the mouse auditory cortex.
Dynamics of the mouse auditory cortex and the perception of sound
Reflections of action, expectation, and experience in mouse auditory cortex
A Cortical Circuit for Audio-Visual Predictions
Teamwork makes sensory streams work: our senses work together, learn from each other, and stand in for one another, and the result is perception and understanding. Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex is thought to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Mishra et al., 2007; Zangenehpour and Zatorre, 2010). However, it is not well understood how processing in visual cortex is altered by an auditory stimulus that is predictive of a visual stimulus, or what mechanisms mediate such experience-dependent audio-visual associations in sensory cortex. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to an experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be stored in long-range cortical connections and that, with learning, these cross-modal connections function to suppress responses to predictable input.
Distinct forms of cortical plasticity underlie difficulties to reliably detect sounds in noisy environments
Acoustic context modulates natural sound discrimination in auditory cortex through frequency-specific adaptation
The precision of prediction errors in the auditory cortex
Distinct synaptic plasticity mechanisms determine the diversity of cortical responses during behavior
Spike trains recorded from the cortex of behaving animals can be complex, highly variable from trial to trial, and therefore challenging to interpret. A fraction of cells exhibit trial-averaged responses with obvious task-related features, such as pure-tone frequency tuning in auditory cortex. However, a substantial number of cells (including cells in primary sensory cortex) do not appear to fire in a task-related manner and are often excluded from analysis. We recently used a novel single-trial, spike-timing-based analysis to show that both classically responsive and non-classically responsive cortical neurons contain significant information about sensory stimuli and behavioral decisions, suggesting that non-classically responsive cells may play an underappreciated role in perception and behavior. We now expand this investigation to explore the synaptic origins of these cells and their potential contribution to network function. To do so, we trained a novel spiking recurrent neural network model that incorporates spike-timing-dependent plasticity (STDP) mechanisms to perform the same task as behaving animals. By leveraging excitatory and inhibitory plasticity rules, this model reproduces neurons with response profiles consistent with previously published experimental data, including classically responsive and non-classically responsive neurons. We found that both classically responsive and non-classically responsive neurons encode behavioral variables in their spike times, as seen in vivo. Interestingly, plasticity in excitatory-to-excitatory synapses increased the proportion of non-classically responsive neurons and may play a significant role in determining response profiles. Finally, our model also makes predictions about the synaptic origins of classically and non-classically responsive neurons, which we can compare to in vivo whole-cell recordings taken from the auditory cortex of behaving animals. This approach successfully recapitulates the heterogeneous response profiles measured in behaving animals and provides a powerful lens for exploring large-scale neuronal dynamics and the plasticity rules that shape them.
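For readers unfamiliar with STDP, the pair-based rule below shows the general form such models use: pre-before-post spike pairings potentiate a synapse and post-before-pre pairings depress it, with exponentially decaying efficacy in the timing difference. The amplitudes and time constant are illustrative; the talk's network, task, and separate excitatory/inhibitory rules are not reproduced here:

```python
import numpy as np

def stdp_dw(pre_times, post_times, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Net weight change for one synapse from pre/post spike times (ms)."""
    dt = np.subtract.outer(post_times, pre_times)   # t_post - t_pre, all pairings
    ltp = np.where(dt > 0, a_plus * np.exp(-dt / tau), 0.0).sum()   # potentiation
    ltd = np.where(dt < 0, -a_minus * np.exp(dt / tau), 0.0).sum()  # depression
    return ltp + ltd

pre = np.array([10.0, 50.0, 90.0])
post = np.array([15.0, 45.0, 95.0])                 # a mix of pairings
print("net weight change:", stdp_dw(pre, post))
```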
Targeting Neural Plasticity by Optogenetic Silencing in the Auditory Cortex
Plasticity in hypothalamic circuits for oxytocin release
Mammalian babies are "sensory traps" for parents: various sensory cues from the newborn are tremendously efficient at triggering parental responses in caregivers. We recently showed that core aspects of maternal behavior, such as pup retrieval in response to infant vocalizations, rely on active learning of auditory cues from pups facilitated by the neurohormone oxytocin (OT). Release of OT from the hypothalamus might thus help induce recognition of different infant cues, but it is unknown what sensory stimuli can activate OT neurons. I performed unprecedented in vivo whole-cell and cell-attached recordings from optically identified OT neurons in awake dams. I found that OT neurons, but not other hypothalamic cells, increased their firing rate after playback of pup distress vocalizations. Using anatomical tracing approaches and channelrhodopsin-assisted circuit mapping, I identified the projections and brain areas (including inferior colliculus, auditory cortex, and posterior intralaminar thalamus) relaying auditory information about social sounds to OT neurons. In hypothalamic brain slices, when optogenetically stimulating thalamic afferents to mimic the high-frequency thalamic discharge observed in vivo during pup call playback, I found that thalamic activity led to long-term depression of synaptic inhibition onto OT neurons. This was mediated by postsynaptic NMDAR-induced internalization of GABAA receptors. Therefore, persistent activation of OT neurons following pup calls in vivo is likely mediated by disinhibition. This gain modulation of OT neurons by infant cries may be important for sustaining motivation. Using a genetically encoded OT sensor, I demonstrated that pup calls were efficient at triggering OT release in downstream motivational areas. When thalamic projections to the hypothalamus were inhibited with chemogenetics, dams exhibited longer latencies to retrieve crying pups, suggesting that this noncanonical thalamus-hypothalamus auditory pathway may be a specific circuit for detecting social sounds, important for disinhibiting OT neurons, gating OT release in downstream brain areas, and speeding up maternal behavior.
Neural coding in the auditory cortex - Emergent Scientists Seminar Series
Dr Jennifer Lawlor
Title: Tracking changes in complex auditory scenes along the cortical pathway
Complex acoustic environments, such as a busy street, are characterised by their ever-changing dynamics. Despite this complexity, listeners can readily tease apart relevant changes from irrelevant variations. This requires continuously tracking the appropriate sensory evidence while discarding noisy acoustic variations. Despite the apparent simplicity of this perceptual phenomenon, the neural basis of the extraction of relevant information from complex continuous streams for goal-directed behavior is currently not well understood. As a minimalistic model of change detection in complex auditory environments, we designed broad-range tone clouds whose first-order statistics change at a random time (a stimulus sketch follows after this listing). Subjects (humans or ferrets) were trained to detect these changes. They faced the dual task of estimating the baseline statistics and detecting a potential change in those statistics at any moment. To characterize the extraction and encoding of relevant sensory information along the cortical hierarchy, we first recorded the brain electrical activity of human subjects engaged in this task using electroencephalography. Human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found at a centro-parietal scalp location, with a slope that depended on change size, consistent with sensory evidence accumulation. To further this investigation, we performed a series of electrophysiological recordings in the primary auditory cortex (A1), secondary auditory cortex (PEG), and frontal cortex (FC) of fully trained, behaving ferrets. A1 neurons exhibited strong onset responses and change-related discharges specific to neuronal tuning. The PEG population showed reduced onset-related responses but more categorical change-related modulations. Finally, a subset of FC neurons (dlPFC/premotor) presented a generalized response to all change-related events only during behavior. Using a generalized linear model (GLM), we show that the same subpopulation in FC encodes sensory and decision signals, suggesting that FC neurons could convert sensory evidence into perceptual decisions. Altogether, these area-specific responses suggest a behavior-dependent mechanism for extracting and generalizing task-relevant events.
Aleksandar Ivanov
Title: How does the auditory system adapt to different environments? A song of echoes and adaptation
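As a sketch of the tone-cloud stimulus referenced in the first abstract above, the snippet below draws short tones i.i.d. from a marginal frequency distribution that shifts at a random change point; the tone bank, durations, and distribution shift are assumed values, not the study's actual parameters:

```python
import numpy as np

fs = 20000
rng = np.random.default_rng(3)
freqs = np.geomspace(300, 15000, 24)              # assumed log-spaced tone bank

def tone_cloud(n_tones, probs, tone_dur=0.03):
    """Concatenate short ramped tones with frequencies drawn i.i.d. from probs."""
    t = np.arange(int(fs * tone_dur)) / fs
    ramp = np.minimum(1, np.minimum(t, t[::-1]) / 0.005)   # 5 ms on/off ramps
    picks = rng.choice(freqs, size=n_tones, p=probs)
    return np.concatenate([ramp * np.sin(2 * np.pi * f * t) for f in picks])

p_base = np.ones(24) / 24                         # flat baseline statistics
p_post = p_base.copy()
p_post[:8] *= 3                                   # boost low bands after the change
p_post /= p_post.sum()

change_tone = rng.integers(20, 60)                # change at a random tone index
stim = np.concatenate([tone_cloud(change_tone, p_base),
                       tone_cloud(40, p_post)])
print(f"change after tone {change_tone}, stimulus {stim.size / fs:.2f} s")
```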
Auditory cortex represents an abstract sensorimotor rule
COSYNE 2022
Clear evidence in favor of adaptation and against temporally specific predictive suppression in monkey primary auditory cortex
COSYNE 2022
Many, but not all, deep neural network audio models predict auditory cortex responses and exhibit hierarchical layer-region correspondence
COSYNE 2022
Mechanisms of plasticity for pup call sounds in the maternal auditory cortex
COSYNE 2022
Predictive coding of global sequence violation in the mouse auditory cortex
COSYNE 2022
Simultaneous mnemonic and predictive representations in the auditory cortex
COSYNE 2022
Synaptic and mesoscale plasticity in auditory cortex of rats with cochlear implants
COSYNE 2022
Neural mechanisms of stream formation during active listening in the ferret auditory cortex
COSYNE 2023
Understanding Auditory Cortex with Deep Neural Networks
COSYNE 2023
Convolutional neural networks describe encoding subspaces of local circuits in auditory cortex
COSYNE 2025
Envelope representations substantially enhance the predictive power of spectrotemporal receptive field models in the human auditory cortex
COSYNE 2025
Instinct vs Insight: Neural Competition Between Prefrontal and Auditory Cortex Constrains Sound Strategy Learning
COSYNE 2025
Time-yoked integration throughout human auditory cortex
COSYNE 2025
Auditory cortex control of vocalization
FENS Forum 2024
Auditory cortex activity during sound memory retention in an auditory working memory task
FENS Forum 2024
Conservation of sensory coding in the auditory cortex of mice between wakefulness and sleep
FENS Forum 2024
Deciphering internal processing states in the auditory cortex through dynamic interplay of evoked and spontaneous population activity
FENS Forum 2024
Estrogenic regulation of the mouse auditory cortex
FENS Forum 2024
High resolution auditory percept through soft auditory cortex implant in macaques
FENS Forum 2024
Homeostasis of a representational map in the mouse auditory cortex
FENS Forum 2024
Imaging collective synaptic dynamics in the mouse auditory cortex during learning
FENS Forum 2024
Multimodal activity of mouse auditory cortex during audio-visual-motor virtual reality
FENS Forum 2024
Multisession electric stimulation of the auditory cortex prevents cortical aging in an age-related hearing loss Wistar rat model
FENS Forum 2024
Neural dynamics and representational drift of inhibitory neurons in mouse auditory cortex
FENS Forum 2024
Replay of letter strings by single neurons in medial temporal lobe and auditory cortex EEG during verbal working memory maintenance
FENS Forum 2024
Spatial coding plasticity in the auditory cortex during sound localization behavior
FENS Forum 2024
Statistical learning in auditory cortex and hippocampus
FENS Forum 2024
Population encoding of meaning in the avian auditory cortex
Neuromatch 5