ePoster

How cerebellar architecture facilitates rapid online learning

Adriana Perez Rotondo, Dhruva Raman, Timothy O'Leary
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 19, 2022

Abstract

The cerebellum is critically involved in motor control, refining trajectories as movements are being executed. This requires fast, online learning. What features of cerebellar circuit structure make it particularly suited to online learning? The cerebellum has a distinctive circuit architecture in which each mossy fibre input typically projects to roughly 250 granule cells, a population that comprises more than half of the neurons in the brain. Each granule cell forms ~4 synapses with mossy fibres. The main hypotheses for this sparse input expansion are that it facilitates pattern separation and smooth function approximation. However, we currently lack a theory that explains why this architecture is suited to online motor learning. We show that the large input expansion effectively trades time for space, allowing rapid and accurate learning in an online context. We consider a cerebellar-like network tasked with simultaneously learning an internal model of a motor system, and using this model to better control motor output. Learning online introduces a narrow time window that severely limits the information available for synaptic plasticity mechanisms to appropriately adjust synaptic weights. We find that the effect of having limited information depends on the spread of the eigenvalues of the Hessian of the task error. As the input expansion increases, the geometry of the error surface becomes more favourable for online learning, diminishing the effect of this information limitation and allowing for faster learning. This suggests that the large energy cost associated with maintaining the majority of the brain's neurons might be an inevitable cost of precise, fast motor learning. In contrast to existing theories that argue for a role of dimensionality expansions in pattern separation, we account for a role in online learning. We provide a new framework for computing the algorithmic error introduced in online learning and show how it can be mitigated by redundant connectivity.
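The link between input expansion and error-surface geometry can be illustrated with a toy simulation (a sketch under our own assumptions, not the authors' model): each "granule cell" samples ~4 of the mossy fibre inputs through random weights with a rectifying nonlinearity, and the nonzero Hessian eigenvalues of a squared-error readout loss coincide with the eigenvalues of the granule-layer Gram matrix. As the number of granule cells grows, the spread of those eigenvalues typically shrinks, making the error surface more isotropic and gradient-based learning faster. The sampling scheme, sizes, and ReLU nonlinearity below are illustrative choices, not taken from the poster.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 50    # mossy fibre inputs
n_samples = 40   # input patterns (e.g. time points in a movement)
X = rng.standard_normal((n_samples, n_inputs))

def granule_layer(X, n_granule, fan_in=4, rng=rng):
    """Sparse random expansion: each granule cell reads `fan_in` mossy fibres."""
    n_samples, n_inputs = X.shape
    Phi = np.zeros((n_samples, n_granule))
    for j in range(n_granule):
        idx = rng.choice(n_inputs, size=fan_in, replace=False)
        w = rng.standard_normal(fan_in)
        Phi[:, j] = np.maximum(X[:, idx] @ w, 0.0)  # rectified granule response
    return Phi

def eig_spread(Phi):
    """Ratio of largest to smallest nonzero Hessian eigenvalue of the
    squared-error loss for a linear readout; these equal the eigenvalues
    of the (normalised) Gram matrix of granule activity."""
    K = Phi @ Phi.T / Phi.shape[1]
    ev = np.linalg.eigvalsh(K)
    ev = ev[ev > 1e-10 * ev.max()]
    return ev.max() / ev.min()

# Eigenvalue spread tends to shrink as the expansion grows
for n_granule in [100, 400, 1600, 6400]:
    spread = eig_spread(granule_layer(X, n_granule))
    print(f"granule cells: {n_granule:5d}  eigenvalue spread: {spread:8.1f}")
```

In this sketch, small expansions yield a noisy, ill-conditioned Gram matrix, while large expansions concentrate it around its limiting kernel, narrowing the Hessian spectrum; this is one concrete sense in which expansion "trades space" for a better-conditioned learning problem.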

Unique ID: cosyne-22/cerebellar-architecture-facilitates-fb1dcdea