Authors & Affiliations
Rich Pang, Jonathan Pillow
Abstract
A central goal of modern neuroscience is to relate structure, dynamics, and computation in recurrent neural networks (RNNs), in particular those that can be created via biological learning rules. To date, most work on RNN training has focused on either efficient but non-biological rules (e.g., backprop, FORCE) or biologically plausible rules with poor learning efficiency (e.g., Hebbian learning). However, some of the fastest, strongest learning rules thought to act in vivo are not Hebbian, for instance in the fly mushroom body, where plasticity depends on high-dimensional presynaptic activity and a low-dimensional dopamine signal. Yet despite their clear relevance to rapid learning, how such rules influence RNN dynamics remains poorly understood. Here we introduce a novel RNN architecture equipped with a fast, biologically plausible plasticity rule based exclusively on presynaptic activity and dopamine. We show that this rule can rapidly train an RNN to produce target nonlinear flow fields---including one-shot creation of fixed points supporting working memory---which we characterize via a new theory of biological RNN dynamics. Specifically, we show mathematically how dopamine in our model builds flexible flow fields from "support states"---network states where dopamine impulses occurred, in analogy to support vectors in support vector machines (SVMs) in machine learning---which are bound to low-dimensional feedback signals as a result of learning. Different dopamine schedules produce different flow fields, suggesting a source of individual variability in learning, and the generalization of learned dynamics beyond the training regime is determined by a kernel function.
Thus, in contrast to traditional RNNs, whose links to biological learning remain unclear and whose dynamics can be challenging to analyze theoretically, our model is ideally suited to learn flexible computations via a rapid biological plasticity rule, admits a clear relationship between structure (the weight matrix) and emergent network function, and sheds new light on the role of dopamine in neural computation through dynamics.
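The abstract does not spell out the update rule, but a minimal sketch of the kind of plasticity it describes---a weight update gated by a scalar dopamine signal and depending only on presynaptic rates, with no postsynaptic term---can be written as follows. All names, the `tanh` rate nonlinearity, and the choice to bind the feedback vector `z` via a rank-one outer product are illustrative assumptions, not the authors' published model.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50     # number of recurrent units
dt = 0.1   # Euler integration step
eta = 1.0  # learning rate for dopamine-gated updates

# Random recurrent weights, scaled so the network is near stability
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))
x = rng.normal(0, 1, N)  # network state

def step(x, W, u=0.0):
    """One Euler step of rate-RNN dynamics: dx/dt = -x + W @ tanh(x) + u."""
    return x + dt * (-x + W @ np.tanh(x) + u)

def dopamine_update(W, x, d, z):
    """Hypothetical presynaptic/dopamine rule: when a dopamine impulse d
    arrives, every synapse updates in proportion to its presynaptic rate
    and a feedback vector z bound to the current "support state".
    There is no postsynaptic rate term, so the update is rank one:
    dW = eta * d * outer(z, r_pre)."""
    r_pre = np.tanh(x)
    return W + eta * d * np.outer(z, r_pre)

# Let the network evolve, then deliver a single dopamine impulse,
# binding the current state (a "support state") to a feedback vector z.
for _ in range(100):
    x = step(x, W)
z = rng.normal(0, 1, N)  # feedback signal (illustrative)
W_new = dopamine_update(W, x, d=1.0, z=z)

# Because the rule lacks a postsynaptic term, the weight change is rank one
delta = W_new - W
s = np.linalg.svd(delta, compute_uv=False)
print(int(np.sum(s > 1e-10)))  # prints 1
```

The rank-one structure is what makes the rule cheap and fast: a single dopamine impulse writes one support state into the weights, and repeated impulses at different states accumulate a low-rank modification that shapes the flow field.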