Authors & Affiliations
Loredana Stoica Ghita, Lukas Hillisch, Jonathan R. Whitlock
Abstract
The mammalian brain continuously integrates sensory and proprioceptive information. This dynamic process enables adaptation to rapidly changing environments and is critical for perception and action selection. However, the exact underlying neuronal computations remain poorly understood, particularly in freely moving animals, where behavioral and sensory transitions are rapid. Relating neural activity to natural behavior has been technically challenging due to a lack of tools for quantifying postural dynamics at ethologically relevant timescales. Furthermore, sensory systems have traditionally been studied in anesthetized or head-fixed subjects performing passive perceptual tasks. We will overcome these limitations using a multidisciplinary approach combining newly developed facial tracking, 3D motion capture and Neuropixels recordings in freely behaving rats. By synchronizing visual stimuli, momentary kinematics and neuronal activity, we will investigate how a primary sensory cortical area (V1) represents visual and postural information against the backdrop of natural behavior. In this paradigm, rats explore an enclosed dome in which incoming visual signals are tightly controlled via 360° projection, including both open-loop stimulation and closed-loop stimulation coupled to the animal's own movement. In preliminary experiments, we measure pupil dynamics across different visual stimulation conditions, using pupil fluctuations at transitions between conditions as a psychometric proxy for surprise. We will next integrate interleaved visual stimulation sequences with electrophysiological recordings to identify neuronal signatures of visual transition states. These results will ultimately offer insights into how dynamic behavior modulates sensory processing in unpredictable circumstances, and identify basic principles by which cognitive processes and behavior are implemented at the circuit level.
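To make the pupil-based surprise measure concrete, the sketch below shows one plausible way to quantify it: align a pupil-diameter trace to the times of visual-condition transitions and compute the baseline-corrected pupil change in a short post-transition window. This is an illustrative assumption, not the authors' actual pipeline; the function name, window lengths (1 s baseline, 2 s response) and sampling rate are hypothetical placeholders.

```python
# Hypothetical sketch (not the authors' pipeline): quantify pupil fluctuations at
# transitions between visual stimulation conditions as a simple "surprise" proxy.
# Window lengths, sampling rate and variable names are illustrative assumptions.
import numpy as np

def transition_pupil_response(pupil, fs, transition_times,
                              baseline_s=1.0, response_s=2.0):
    """Baseline-corrected pupil change around each condition transition.

    pupil            : 1-D array of pupil-diameter samples
    fs               : sampling rate of the pupil trace (Hz)
    transition_times : transition onsets in seconds (e.g. open- to closed-loop)
    """
    responses = []
    for t in transition_times:
        onset = int(round(t * fs))
        pre = pupil[max(0, onset - int(baseline_s * fs)):onset]
        post = pupil[onset:onset + int(response_s * fs)]
        if len(pre) == 0 or len(post) == 0:
            continue  # skip transitions too close to the edges of the recording
        # mean post-transition dilation relative to the pre-transition baseline
        responses.append(post.mean() - pre.mean())
    return np.asarray(responses)

# Usage with synthetic data: 10 min of pupil samples at 30 Hz, three transitions.
rng = np.random.default_rng(0)
pupil = rng.normal(3.0, 0.1, size=30 * 600)  # arbitrary units
resp = transition_pupil_response(pupil, fs=30,
                                 transition_times=[120.0, 300.0, 480.0])
print(resp)
```

A per-transition scalar of this kind could then be compared across transition types (e.g. open-loop to closed-loop versus the reverse) or related to simultaneously recorded V1 activity; those downstream analyses are left unspecified here.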