Auditory Cues
Looking and listening while moving
In this talk I’ll discuss our recent work on how visual and auditory cues to space are integrated as we move. There are at least three reasons why this turns out to be a difficult problem for the brain to solve (and for us to understand!). First, vision and hearing start off in different coordinates (eye-centred vs head-centred), so they need a common reference frame in which to communicate. The literature has neatly sidestepped this problem by preventing eye and head movements, yet self-movement is the norm. Second, self-movement creates visual and auditory image motion, so correct interpretation requires some form of compensation. Third, vision and hearing encode motion in very different ways: vision contains dedicated motion detectors sensitive to speed, whereas hearing does not. We propose that some (perhaps all) of these problems could be solved by treating the perception of audiovisual space as the integration of separate body-centred visual and auditory cues, the latter formed by integrating image motion with motor-system signals and vestibular information. To test this claim, we use a classic cue-integration framework, modified to account for cues that are biased and partially correlated. We find good evidence for the model based on simple judgements of audiovisual motion within a circular array of speakers and LEDs that surround the participant while they execute self-controlled head movements.
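As a concrete illustration of this kind of cue-integration model, the Python sketch below shows the standard minimum-variance combination of two estimates, extended to cues that are correlated and biased. It is a minimal sketch under assumed values, not the authors’ actual model; the function name, parameters, and numbers are all hypothetical.

    def integrate_cues(x_v, x_a, sigma_v, sigma_a, rho, bias_a=0.0):
        """Combine a visual estimate x_v with an auditory estimate x_a.

        Uses the minimum-variance weights for two correlated cues;
        a known auditory bias is subtracted before combination.
        (Illustrative only; all values are hypothetical.)
        """
        x_a = x_a - bias_a                        # correct the biased cue first
        cov = rho * sigma_v * sigma_a             # covariance between the cues
        denom = sigma_v**2 + sigma_a**2 - 2 * cov
        w_v = (sigma_a**2 - cov) / denom          # weight on the visual cue
        w_a = 1.0 - w_v                           # weights sum to one
        estimate = w_v * x_v + w_a * x_a
        variance = sigma_v**2 * sigma_a**2 * (1 - rho**2) / denom
        return estimate, variance

    # Example: a reliable visual cue, a noisier auditory cue, modest correlation
    print(integrate_cues(x_v=10.0, x_a=14.0, sigma_v=1.0, sigma_a=2.0, rho=0.3))

With uncorrelated cues (rho = 0) this reduces to the familiar reliability-weighted average; the correlation term shifts weight away from whichever cue shares noise with the other.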
Neural Mechanisms of Coordination in Duetting Wrens
To communicate effectively, two individuals must take turns to prevent overlap in their signals. How does the nervous system coordinate vocalizations between two individuals? Female and male plain-tailed wrens sing a duet in which they alternate syllable production so rapidly and precisely that it sounds as if a single bird is singing. I will discuss experiments that examine the interaction between sensory cues and motor activity, using behavioral manipulations and neurophysiological recordings from pairs of awake, duetting wrens. I will show evidence that auditory cues link the brains of the wrens by modulating motor circuits.
Plasticity in hypothalamic circuits for oxytocin release
Mammalian babies are “sensory traps” for parents. Various sensory cues from the newborn are tremendously effective at triggering parental responses in caregivers. We recently showed that core aspects of maternal behavior, such as pup retrieval in response to infant vocalizations, rely on active learning of auditory cues from pups, facilitated by the neurohormone oxytocin (OT). Release of OT from the hypothalamus might thus help induce recognition of different infant cues, but it is unknown what sensory stimuli can activate OT neurons. I performed unprecedented in vivo whole-cell and cell-attached recordings from optically identified OT neurons in awake dams. I found that OT neurons, but not other hypothalamic cells, increased their firing rate after playback of pup distress vocalizations. Using anatomical tracing and channelrhodopsin-assisted circuit mapping, I identified the projections and brain areas (including the inferior colliculus, auditory cortex, and posterior intralaminar thalamus) relaying auditory information about social sounds to OT neurons. In hypothalamic brain slices, optogenetic stimulation of thalamic afferents, mimicking the high-frequency thalamic discharge observed in vivo during pup-call playback, led to long-term depression of synaptic inhibition onto OT neurons. This was mediated by postsynaptic NMDAR-induced internalization of GABAARs. Therefore, persistent activation of OT neurons following pup calls in vivo is likely mediated by disinhibition. This gain modulation of OT neurons by infant cries may be important for sustaining motivation. Using a genetically encoded OT sensor, I demonstrated that pup calls efficiently triggered OT release in downstream motivational areas. When thalamic projections to the hypothalamus were chemogenetically inhibited, dams took longer to retrieve crying pups. This suggests that the noncanonical thalamo-hypothalamic auditory pathway is a dedicated circuit for detecting social sounds, one that disinhibits OT neurons, gates OT release in downstream brain areas, and speeds up maternal behavior.
Motor Cortical Control of Vocal Interactions in a Neotropical Singing Mouse
Using sounds for social interactions is common across many taxa. Humans engaged in conversation, for example, take rapid turns to go back and forth. This ability to act upon sensory information to generate a desired motor output is a fundamental feature of animal behavior. How the brain enables such flexible sensorimotor transformations, for example during vocal interactions, is a central question in neuroscience. Seeking a rodent model to fill this niche, we are investigating the neural mechanisms of vocal interaction in Alston’s singing mouse (Scotinomys teguina), a neotropical rodent native to the cloud forests of Central America. We discovered sub-second temporal coordination of advertisement songs (counter-singing) between males of this species, a behavior that requires the rapid modification of motor output in response to auditory cues. We leveraged this natural behavior to probe the neural mechanisms that enable fast and flexible vocal communication. Using causal manipulations, we recently showed that an orofacial motor cortical area (OMC) in this rodent is required for vocal interactions (Okobi*, Banerjee*, et al., 2019). Subsequently, in electrophysiological recordings, I find OMC neurons that track the initiation, termination, and relative timing of songs. Interestingly, persistent neural dynamics during song progression stretch or compress on every trial to match the total song duration (Banerjee et al., in preparation). These results demonstrate robust cortical control of vocal timing in a rodent and upend the current dogma that motor cortical control of vocal output is evolutionarily restricted to the primate lineage.
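To make the temporal-scaling result concrete, the Python sketch below resamples each trial’s firing-rate trace onto a common song-fraction axis: if OMC dynamics stretch or compress with song duration, traces from short and long songs should overlap after this normalization. This is an assumed analysis, not the authors’ code; the function name and bin count are hypothetical.

    import numpy as np

    def scale_to_song_fraction(rate, n_bins=100):
        """Resample one trial's firing-rate trace onto a 0-1 song-fraction axis.
        (Illustrative sketch; name and binning are hypothetical.)
        """
        frac = np.linspace(0.0, 1.0, num=len(rate))   # sample times as fraction of song
        grid = np.linspace(0.0, 1.0, num=n_bins)      # common normalized grid
        return np.interp(grid, frac, rate)

    # Example: a short (80-sample) and a long (200-sample) trial on one axis
    short_trial = scale_to_song_fraction(np.random.rand(80))
    long_trial = scale_to_song_fraction(np.random.rand(200))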
Hippocampal place cells can map space using distal auditory cues
FENS Forum 2024