
Wide Neural Networks

Discover seminars, jobs, and research tagged with wide neural networks across World Wide.
2 curated items · 1 Seminar · 1 ePoster
Updated over 4 years ago
Seminar · Neuroscience

Generalizing theories of cerebellum-like learning

Ashok Litwin-Kumar
Columbia University
Mar 18, 2021

Since the theories of Marr, Ito, and Albus, the cerebellum has provided an attractive, well-characterized model system for investigating biological mechanisms of learning. In recent years, theories have been developed that provide a normative account of many features of the anatomy and function of the cerebellar cortex and cerebellum-like systems, including the distribution of parallel fiber-Purkinje cell synaptic weights, the expansion in the number of granule cells and their synaptic in-degree, and sparse coding by granule cells. Typically, these theories focus on the learning of random mappings between uncorrelated inputs and binary outputs, an assumption that may be reasonable for certain forms of associative conditioning but that falls far short of accounting for the important role the cerebellum plays in the control of smooth movements. I will discuss in-progress work with Marjorie Xie, Samuel Muscinelli, and Kameron Decker Harris that generalizes these learning theories to correlated inputs and general classes of smooth input-output mappings. Our studies build on earlier work in theoretical neuroscience as well as recent advances in the kernel theory of wide neural networks. They illuminate the role of pre-expansion structures in processing input stimuli and the significance of sparse granule cell activity. If time permits, I will also discuss preliminary work with Jack Lindsey extending these theories beyond cerebellum-like structures to recurrent networks.
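The "kernel theory of wide neural networks" mentioned in the abstract can be made concrete with a minimal sketch (an illustration, not material from the talk): in the infinite-width limit, a one-hidden-layer ReLU network corresponds to the arc-cosine kernel of Cho & Saul (2009), so learning a smooth input-output mapping reduces to kernel regression. The toy target function, the ridge parameter, and all names below are hypothetical choices for the example.

```python
# Minimal sketch: kernel regression with the infinite-width ReLU (NNGP) kernel.
# Assumptions: standard Gaussian weights, one hidden layer, toy smooth target.
import numpy as np

def relu_nngp_kernel(X, Y):
    """Arc-cosine kernel of order 1: E[relu(w.x) relu(w.y)] over w ~ N(0, I)."""
    norms_x = np.linalg.norm(X, axis=1)                # (n,)
    norms_y = np.linalg.norm(Y, axis=1)                # (m,)
    cos = np.clip((X @ Y.T) / np.outer(norms_x, norms_y), -1.0, 1.0)
    theta = np.arccos(cos)
    return (np.outer(norms_x, norms_y) / (2 * np.pi)) * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta)
    )

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))                    # toy input patterns
f_target = lambda X: np.sin(X[:, 0]) * X[:, 1]         # a smooth target mapping
y_train = f_target(X_train)

# Kernel ridge regression: the infinite-width analogue of training the network.
K = relu_nngp_kernel(X_train, X_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), y_train)

X_test = rng.normal(size=(50, 3))
y_pred = relu_nngp_kernel(X_test, X_train) @ alpha
print("test MSE:", np.mean((y_pred - f_target(X_test)) ** 2))
```

In this framing, properties of the kernel (rather than individual weights) determine which smooth mappings are easy or hard to learn, which is the style of analysis the abstract applies to granule-cell expansions.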

ePoster

Influence of Learning Rules on Representation Dynamics in Wide Neural Networks

Blake Bordelon & Cengiz Pehlevan

COSYNE 2023