Authors & Affiliations
Alan Lai, Shuqi Wang, Xiao-Jing Wang
Abstract
A line or a ring attractor stores an active memory of an analog quantity that outlasts a transient stimulus. Such continuous attractor states can be realized by a recurrent neural network when the connection weights are finely tuned. However, a continuum of attractor states is vulnerable to parameter variations, so how it can be realized robustly remains a long-standing problem (Renart et al., 2003; Sagodi et al., 2024). In this work, we empirically characterize the properties that enhance the robustness of ring attractors in neural networks despite weight perturbations. To do so, we use a recently developed optimization technique, Sharpness-Aware Minimization (SAM), to find solutions that are robust to parameter changes. In the optimization literature, such solutions are known as flat minima: regions of parameter space where the loss does not change significantly. Networks trained with SAM are encouraged to find flat minima by explicitly penalizing sharp solutions (the opposite of flat ones). In our experiments, we train a recurrent neural network (RNN) with SAM to perform a task known a priori to give rise to ring attractor dynamics. Simulations indicate that robust networks trained with SAM are characterized by a large set of discrete fixed points lying on a highly nonlinear manifold that traces out a convoluted ring. Networks subsequently regularized to realize rings with low-dimensional linear embeddings did not achieve the same degree of robustness, suggesting that nonlinear structure is needed. Notably, we find that single-neuron tuning curves with multiple peaks emerge in RNNs trained to be sufficiently robust. Our results provide a potential normative explanation for previous experimental observations of multi-peak single-neuron tuning curves alongside nonlinear ring manifolds: rings that are geometrically nontrivial may be the brain’s solution to promoting structurally stable representations of continuous variables.
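To make the SAM mechanism described above concrete, the sketch below shows a single SAM update in JAX: take a gradient-ascent step of size rho to a nearby worst-case point, then update the original parameters with the gradient computed there. The toy loss, rho, and learning rate are illustrative placeholders, not the RNN task or hyperparameters used in this work.

```python
import jax
import jax.numpy as jnp

def loss_fn(w):
    # Toy non-convex loss standing in for the RNN task loss (illustrative only).
    return jnp.sum((w - 1.0) ** 2) + 0.1 * jnp.sum(jnp.sin(5.0 * w) ** 2)

@jax.jit
def sam_step(w, rho=0.05, lr=1e-2):
    """One Sharpness-Aware Minimization update on a flat parameter vector w."""
    # 1) Gradient at the current parameters.
    g = jax.grad(loss_fn)(w)
    # 2) Adversarial perturbation: step of size rho along the normalized gradient.
    eps = rho * g / (jnp.linalg.norm(g) + 1e-12)
    # 3) Gradient at the perturbed (worst-case) parameters.
    g_sam = jax.grad(loss_fn)(w + eps)
    # 4) Descend from the ORIGINAL parameters using the perturbed gradient.
    return w - lr * g_sam

w = jnp.zeros(8)
for _ in range(500):
    w = sam_step(w)
print("final loss:", loss_fn(w))
```

Because the descent direction is evaluated at the worst-case neighbor rather than at the current parameters, minima reached this way tend to sit in regions where the loss stays low under small weight perturbations, which is the notion of flatness the abstract appeals to.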