computational strategies
Latest
Flexible multitask computation in recurrent networks utilizes shared dynamical motifs
Flexible computation is a hallmark of intelligent behavior. Yet, little is known about how neural networks contextually reconfigure for different computations. Humans are able to perform a new task without extensive training, presumably through the composition of elementary processes that were previously learned. Cognitive scientists have long hypothesized the possibility of a compositional neural code, where complex neural computations are made up of constituent components; however, the neural substrate underlying this structure remains elusive in biological and artificial neural networks. Here we identified an algorithmic neural substrate for compositional computation through the study of multitasking artificial recurrent neural networks. Dynamical systems analyses of networks revealed learned computational strategies that mirrored the modular subtask structure of the task set used for training. Dynamical motifs such as attractors, decision boundaries and rotations were reused across different task computations. For example, tasks that required memory of a continuous circular variable repurposed the same ring attractor. We show that dynamical motifs are implemented by clusters of units and are reused across different contexts, allowing for flexibility and generalization of previously learned computation. Lesioning these clusters resulted in modular effects on network performance: a lesion that destroyed one dynamical motif only minimally perturbed the structure of other dynamical motifs. Finally, modular dynamical motifs could be reconfigured for fast transfer learning. After slow initial learning of dynamical motifs, a subsequent faster stage of learning reconfigured motifs to perform novel tasks. This work contributes to a more fundamental understanding of compositional computation underlying flexible general intelligence in neural systems. We present a conceptual framework that establishes dynamical motifs as a fundamental unit of computation, intermediate between the neuron and the network. As more whole-brain imaging studies record neural activity from multiple specialized systems simultaneously, the framework of dynamical motifs will guide questions about specialization and generalization across brain regions.
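The dynamical systems analysis the abstract refers to is commonly done by numerically searching for fixed points of the recurrent dynamics and linearizing around them. Below is a minimal sketch of that idea, not the authors' code: it assumes a vanilla tanh RNN with random stand-in weights in place of a trained multitask network, and all names and sizes (`speed`, `find_fixed_points`, N = 64) are illustrative.

```python
# Sketch: locating approximate fixed points of an RNN, h_{t+1} = tanh(W_rec h + W_in x + b),
# the kind of analysis used to identify dynamical motifs such as attractors and saddles.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, I = 64, 3                                       # hidden units, input dimension (illustrative)
W_rec = rng.normal(0, 1.0 / np.sqrt(N), (N, N))    # stand-in for trained recurrent weights
W_in = rng.normal(0, 0.1, (N, I))
b = np.zeros(N)

def speed(h, x):
    """Squared speed q(h) = 0.5 * ||F(h) - h||^2; minima near zero are fixed points."""
    dh = np.tanh(W_rec @ h + W_in @ x + b) - h
    return 0.5 * dh @ dh

def find_fixed_points(x, n_starts=20, tol=1e-8):
    """Minimize q(h) from many random initial states under a constant input x."""
    fps = []
    for _ in range(n_starts):
        h0 = rng.normal(0, 0.5, N)
        res = minimize(speed, h0, args=(x,), method="L-BFGS-B")
        # keep converged, non-duplicate solutions
        if res.fun < tol and not any(np.linalg.norm(res.x - f) < 1e-3 for f in fps):
            fps.append(res.x)
    return fps

# A constant "context" input selects which dynamical regime the network settles into.
context = np.array([1.0, 0.0, 0.0])
fixed_points = find_fixed_points(context)
print(f"found {len(fixed_points)} distinct approximate fixed points")

# Linearizing around each fixed point (Jacobian eigenvalues of the update map)
# distinguishes stable attractors (all |eig| < 1) from saddles that can act as
# decision boundaries.
for h_star in fixed_points:
    J = (1 - np.tanh(W_rec @ h_star + W_in @ context + b) ** 2)[:, None] * W_rec
    print("max |eigenvalue|:", np.abs(np.linalg.eigvals(J)).max())
```

With a trained multitask network in place of the random weights, repeating this search under each task's context input would show whether different tasks land on the same fixed-point structure, which is how reuse of a motif such as a shared ring attractor can be diagnosed.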
Evolving Neural Networks
Evolution has shaped neural circuits in a very specific manner, slowly and aimlessly incorporating computational innovations that increased a newly emerged species' chances of surviving and reproducing. The discoveries made by the field of Evolutionary Developmental (Evo-Devo) biology over the last decades have been crucial to our understanding of the gradual emergence of such innovations. In turn, Computational Neuroscience practitioners modeling the brain are becoming increasingly aware of the need to build models that incorporate these innovations in order to replicate the computational strategies the brain uses to solve a given task. The goal of this workshop is to bring together experts from Systems and Computational Neuroscience, Machine Learning and the Evo-Devo field to discuss whether and how knowing the evolutionary history of neural circuits can help us understand how the brain works, as well as the relative importance of learned versus innate neural mechanisms.
Computational strategies and neural correlates of probabilistic reversal learning in mice
COSYNE 2022