Resources
Authors & Affiliations
Basile Confavreux, Friedemann Zenke, Everton J. Agnes, Timothy Lillicrap, Tim Vogels
Abstract
Synaptic plasticity is known to be a key player in the brain’s life-long learning abilities. However, due to experimental limitations, the nature of the local changes at individual synapses and their link to emerging network-level computations remain unclear. In theoretical work, synaptic plasticity is often modelled with unsupervised local plasticity rules, but deriving a complete set of functional plasticity rules analytically would require extraordinary intuition and numerous assumptions. Here, we approach the problem numerically. We show how to deduce plasticity rules in silico through supervised (meta-)learning of rules that act on large spiking neural networks solving complex tasks. We discuss how to parameterize, learn, and interpret plasticity rules. Using a rich search space encompassing most rules described in the literature, we employ an evolutionary strategy (CMA-ES) to recover rules that reliably solve the task in a biologically plausible way. We discuss the challenges of designing loss functions that combine performance and biological realism. Once candidate rules are obtained with this framework, we propose to interpret these high-dimensional rules by analyzing the covariance matrix learned by CMA-ES during optimization. We demonstrate our approach on a memory formation and recall task, for which no robust and biologically plausible solutions are known to date. Preliminary analysis revealed that the learned rules used inhibitory plasticity for both stability and computation, and operated mainly via codependent terms, corroborating recent theoretical work.
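The sketch below illustrates the kind of meta-learning loop described in the abstract: a plasticity rule reduced to a parameter vector, an outer CMA-ES search over rule parameters (here via the pycma library), a loss combining task performance with biological-realism penalties, and a post-hoc look at the covariance matrix adapted by CMA-ES. This is not the authors' code; the simulator `run_spiking_network`, the penalty terms, and the parameter count are hypothetical stand-ins.

```python
# Minimal sketch (not the authors' code) of meta-learning a parameterized
# plasticity rule with CMA-ES, using the pycma library.
import numpy as np
import cma  # pip install cma

N_PARAMS = 20  # assumed dimensionality of the plasticity-rule parameterization


def run_spiking_network(theta):
    """Hypothetical placeholder for the inner-loop spiking simulation.
    Returns (task_performance, rate_penalty, weight_penalty); here a toy
    quadratic so the sketch runs end to end."""
    return -float(np.sum(theta ** 2)), 0.0, 0.0


def loss(theta):
    """Combine task performance with biological-realism penalties (assumed terms)."""
    performance, rate_penalty, weight_penalty = run_spiking_network(theta)
    return -performance + rate_penalty + weight_penalty


# Outer loop: CMA-ES searches the space of plasticity rules.
es = cma.CMAEvolutionStrategy(np.zeros(N_PARAMS), 0.5, {"maxiter": 100})
while not es.stop():
    candidates = es.ask()                               # sample candidate rules
    es.tell(candidates, [loss(c) for c in candidates])  # update mean and covariance
    es.disp()

best_rule = es.result.xbest

# Interpreting the result: the covariance matrix adapted by CMA-ES indicates
# which combinations of rule terms the search constrained tightly (small
# eigenvalues) versus left unconstrained (large eigenvalues).
C = es.C  # covariance matrix; attribute name may differ across pycma versions
eigvals, eigvecs = np.linalg.eigh(C)
print("most constrained parameter combination:", eigvecs[:, 0])
```

In practice, the inner evaluation would run a full spiking-network simulation of the memory formation and recall task for each candidate rule, so each outer-loop generation is embarrassingly parallel across candidates.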