Authors & Affiliations
Alessandro La Chioma, David Schneider
Abstract
Auditory perception relies on predicting the acoustic consequences of our actions. The auditory cortex (AC) responds differently to expected versus unexpected self-generated sounds, yet it remains untested whether AC dynamically updates predictions about self-generated sounds in a context-dependent manner.

We developed a naturalistic audio-visual-motor virtual reality (VR) system for head-fixed mice. Locomotion was tracked in real time to deliver artificial footstep sounds yoked to a precise phase of the step cycle, creating an ethologically relevant and experimentally manipulable form of auditory reafference. While running on a treadmill, mice repeatedly traversed two different contextual environments, each consisting of a distinct visual corridor accompanied by distinct footstep sounds. Using this system, we asked whether AC neural activity reflects predictions about the sound that footsteps are expected to produce in a given context, and to what extent contextual and motor signals integrate with auditory information.

Following behavioral acclimation, we made high-density neuronal recordings from primary AC as mice traversed the two VR contexts and experienced expected or deviant footstep sounds. We observed overall suppression of neural responses to self-generated sounds compared to the same sounds heard passively. Subsets of neurons responded differently to the same sound heard in the expected versus the unexpected context. These expectation violation-like signals emerged immediately after mice entered a new context, suggesting rapid updating of predictions. Population-level analyses indicate that contextual information is embedded in AC population activity. Our results suggest that AC combines auditory and motor signals with visual cues for context-dependent processing of self-generated sounds.
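To illustrate the closed-loop idea described above, the following minimal Python sketch (not the authors' implementation) estimates where the animal is in its step cycle from a real-time limb signal and plays a context-specific footstep sound at a fixed point of that cycle. The functions read_paw_position and play_footstep_sound, as well as all rate and threshold values, are hypothetical stand-ins for the rig's actual tracking and audio hardware.

# Illustrative sketch only: closed-loop footstep-sound triggering at a fixed
# phase of the step cycle. All names and parameters are assumptions, not the
# authors' code.

import numpy as np

SAMPLE_RATE_HZ = 200        # assumed update rate of the tracking loop
TRIGGER_LEVEL = 0.0         # assumed limb position marking the "footfall" phase
REFRACTORY_S = 0.15         # ignore re-crossings for this long after a trigger

def read_paw_position(t):
    """Stand-in for real-time limb tracking: a noisy sinusoidal step cycle."""
    step_freq_hz = 3.0      # assumed stride frequency while running
    return np.sin(2 * np.pi * step_freq_hz * t) + 0.02 * np.random.randn()

def play_footstep_sound(context):
    """Stand-in for audio playback; each VR context has its own sound."""
    print(f"footstep sound for context {context!r}")

def run_closed_loop(duration_s=2.0, context="A"):
    """Trigger one footstep sound per stride at a fixed step-cycle phase."""
    prev_pos = read_paw_position(0.0)
    last_trigger_t = -np.inf
    for i in range(1, int(duration_s * SAMPLE_RATE_HZ)):
        t = i / SAMPLE_RATE_HZ
        pos = read_paw_position(t)
        crossed = prev_pos > TRIGGER_LEVEL >= pos   # downward crossing of the level
        if crossed and (t - last_trigger_t) > REFRACTORY_S:
            play_footstep_sound(context)
            last_trigger_t = t
        prev_pos = pos

if __name__ == "__main__":
    run_closed_loop()

In this sketch, switching the context argument selects which footstep sound is played, so the same stride-phase trigger can deliver either the expected or a deviant sound for a given corridor.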