ePoster

Joint timescale neural embeddings capture multi-timescale behavioral dynamics

Alexander Ladd and 2 co-authors
COSYNE 2025
Montreal, Canada

Presentation

Date TBA


Event Information

Abstract

The simple decision to seek and consume food requires integrating slowly changing information about internal state with rapid sensory signals indicating food availability. Whether fast and slow components unite in a single integrating location [2], or whether they are also integrated in a more distributed fashion, remains unknown. Can neural population dynamics simultaneously encode multiple timescales? Could modeling such temporal multiplexing help us understand how the brain integrates temporal information to guide appetitive behavior? To examine these questions, we recorded neural population activity from 16,340 neurons across 214 brain regions in 134 sessions from 19 mice trained to perform a novel Pavlovian reward association task that includes multiple timescale components influencing licking behavior. We considered that task information across timescales may not be linearly encoded in a single population. Thus, we used contrastive learning (CEBRA [1]) to ask whether neural dynamics with different temporal resolutions can be distilled from activity within a population of neurons in the basal ganglia. We found that short and long timescale representations were jointly embedded in a low-dimensional space that captures changes in licking behavior across fast and slow timescales. Behaviors could be decoded more accurately from this non-linear embedding than from the original data. This result confirms heterogeneous temporal structure in neural population activity and suggests that, in this task, information across multiple timescales can be integrated in a low-dimensional subspace.
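The pipeline the abstract describes, embedding population activity in a low-dimensional space and then decoding behavior from that embedding versus from the raw activity, can be sketched as follows. This is a minimal, dependency-light stand-in: synthetic data replaces the recordings, plain PCA stands in for the CEBRA contrastive embedding (CEBRA exposes the same sklearn-style fit/transform pattern), and a kNN classifier serves as the decoder. All array names, sizes, and parameters here are illustrative assumptions, not the authors' actual analysis.

```python
import numpy as np
from sklearn.decomposition import PCA  # stand-in for the contrastive embedding
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for binned population activity: a slow internal-state
# latent plus a fast cue-locked latent, mixed into many "neurons" with noise.
T, n_neurons = 2000, 100
slow = np.sin(np.linspace(0, 4 * np.pi, T))   # slow-timescale latent
fast = rng.standard_normal(T)                 # fast-timescale latent
latents = np.stack([slow, fast], axis=1)
mixing = rng.standard_normal((2, n_neurons))
X = latents @ mixing + 0.5 * rng.standard_normal((T, n_neurons))

# Toy behavioral label: lick when both slow state and fast cue are favorable.
y = ((slow > 0) & (fast > 0)).astype(int)

# Low-dimensional embedding of population activity (CEBRA would be fit the
# same way: model.fit(X) then model.transform(X)).
Z = PCA(n_components=3).fit_transform(X)

# Decode licking from the embedding and from the raw population activity.
Xtr, Xte, Ztr, Zte, ytr, yte = train_test_split(X, Z, y, random_state=0)
acc_raw = KNeighborsClassifier().fit(Xtr, ytr).score(Xte, yte)
acc_emb = KNeighborsClassifier().fit(Ztr, ytr).score(Zte, yte)
print(f"decoding accuracy  raw: {acc_raw:.2f}  embedding: {acc_emb:.2f}")
```

In the actual study the embedding is learned contrastively with CEBRA rather than linearly, which is what allows jointly represented fast and slow structure to be distilled into one subspace; the linear stand-in here only illustrates the fit/transform/decode workflow.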
