Authors & Affiliations
Daniel Levenstein, Aleksei Efremov, Roy Pavel Samuel Henha Eyono, Blake Richards, Adrien Peyrache
Abstract
The hippocampus is widely known for spatially selective responses in individual cells. However, many of its cells lack easily interpretable spatial tuning curves or are selective for other task-relevant variables. To better understand the computations performed by this structure, it may be necessary to look to population-level features of neural activity and the distributed computations they enable. Here, we focus on two such features: 1) a continuous neural manifold that maps spatial and task structure, and 2) spatial representations that “sweep” ahead of the animal as it moves through the environment. While recent results indicate that learning to predict sensory inputs produces spatially tuned cells in artificial networks, there are multiple approaches to predictive learning, and it remains unknown which, if any, can account for a continuous manifold or sweeping representations.
In this work, we train a recurrent neural network to predict egocentric sensory input while navigating a virtual environment. We find that, unlike simple next-step prediction, predicting sequences of input produces a population-level manifold with spatially tuned cells, and that a rollout-based approach forms this cognitive map with less sensory data and produces representational dynamics that sweep ahead of the agent during movement. Further, we find that the emergence of a continuous manifold is critical for the network’s ability to produce simulated “replays” in the absence of input, which, like those in the hippocampus, range in their similarity to experienced trajectories in the environment. These (online and offline) simulations can be directed by a fictitious action signal, suggesting a means by which upstream regions can use the hippocampal map for learning effective behavioral strategies or for planning. These results suggest that sequential predictive learning underlies hippocampal representation and replay, and that representational sweeps reflect a data-efficient algorithm for sequential predictive learning in the brain.
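The contrast between simple next-step prediction and a rollout-based approach can be sketched as follows. This is an illustrative toy, not the paper's actual architecture: all weights, dimensions, and function names are hypothetical placeholders, and the weights are random rather than trained. The key mechanical difference is that, during a rollout, the network feeds its own predicted observation back in as the next input, conditioned on a sequence of actions; the same closed loop is what would let a fictitious action signal direct simulated trajectories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the abstract does not specify the architecture)
obs_dim, hid_dim, act_dim = 8, 16, 2

# Random fixed weights stand in for a trained recurrent predictive network
W_h = rng.normal(scale=0.1, size=(hid_dim, hid_dim))  # recurrent weights
W_o = rng.normal(scale=0.1, size=(hid_dim, obs_dim))  # observation -> hidden
W_a = rng.normal(scale=0.1, size=(hid_dim, act_dim))  # action -> hidden
W_p = rng.normal(scale=0.1, size=(obs_dim, hid_dim))  # hidden -> predicted obs

def step(h, obs, act):
    """One recurrent update combining hidden state, observation, and action."""
    return np.tanh(W_h @ h + W_o @ obs + W_a @ act)

def predict(h):
    """Decode the predicted next observation from the hidden state."""
    return W_p @ h

def next_step_prediction(h, obs, act):
    """Next-step prediction: predict obs[t+1] from the real obs[t] and act[t]."""
    h = step(h, obs, act)
    return predict(h), h

def rollout_prediction(h, obs, actions):
    """Rollout-based prediction: after the first real observation, the network's
    own predictions are fed back as input, unrolling k steps into the future."""
    preds = []
    for act in actions:
        h = step(h, obs, act)
        obs = predict(h)  # closed loop: the prediction becomes the next input
        preds.append(obs)
    return np.stack(preds), h

h0 = np.zeros(hid_dim)
obs0 = rng.normal(size=obs_dim)
actions = rng.normal(size=(5, act_dim))  # a 5-step (possibly fictitious) action sequence

one_step, _ = next_step_prediction(h0, obs0, actions[0])
rollout, _ = rollout_prediction(h0, obs0, actions)
print(one_step.shape, rollout.shape)  # (8,) (5, 8)
```

With zero input (e.g., `obs0 = np.zeros(obs_dim)`), the same rollout loop generates activity driven only by recurrence and actions, loosely analogous to the offline "replays" described above.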