ePoster

Sharing weights with noise-canceling anti-Hebbian plasticity

Roman Pogodin, Peter Latham
COSYNE 2022 (2022)
Lisbon, Portugal


Abstract

Weight sharing among neurons is widely used in deep learning: convolutional networks are an obvious example, and transformers rely on it as well, applying the same weight matrices at every sequence position via matrix-matrix multiplications. Without weight sharing, deep networks perform poorly on hard tasks. This is potentially problematic for the brain, since weight sharing is biologically implausible, a fact that is ignored in deep learning models of brain activity, especially of visual processing, which are becoming widely used. Recently it was shown that partial weight sharing can be implemented with a "sleep phase", in which plasticity is anti-Hebbian. While the sleep phase significantly increases the performance of networks without explicit weight sharing, it has to be done often, and it requires precise lateral connectivity in every layer. In this work, we propose a method for inducing weight sharing continuously during training, through noise-canceling anti-Hebbian plasticity. We find that for a common type of deep learning architecture, it is enough to share weights in a subset of the layers. Our model, which implements a form of homeostatic plasticity, makes several experimentally testable predictions.
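The abstract does not spell out the plasticity rule, so the following toy NumPy sketch is only an illustration of the general mechanism, not the authors' method. It assumes the noise-driven anti-Hebbian update described for the earlier sleep-phase approach: each neuron i in a would-be weight-sharing group responds y_i = w_i . x to noise input x, and updates dw_i = eta * (y_bar - y_i) * x, where y_bar is the group-average response. For x ~ N(0, I), E[y_i x] = w_i, so in expectation dw_i = eta * (w_bar - w_i): every weight vector decays toward the group mean, i.e. toward shared weights, without any explicit copying. All names and hyperparameters here are hypothetical.

```python
import numpy as np

# Hypothetical sketch: noise-driven anti-Hebbian plasticity pulling a group
# of initially unshared weight vectors toward a common (shared) value.
rng = np.random.default_rng(0)
K, D = 8, 32                   # neurons in the sharing group, input dimension
eta = 0.01                     # learning rate (assumed, not from the paper)
W = rng.normal(size=(K, D))    # initially unshared weights, one row per neuron

for step in range(20_001):
    x = rng.normal(size=D)     # i.i.d. Gaussian noise input; no training data
    y = W @ x                  # each neuron's response to the noise
    # Anti-Hebbian term -y_i * x plus a lateral averaging term y_bar * x;
    # in expectation this moves each w_i toward the group mean w_bar.
    W += eta * np.outer(y.mean() - y, x)

    if step % 5_000 == 0:
        spread = np.linalg.norm(W - W.mean(axis=0), axis=1).mean()
        print(f"step {step:6d}  mean distance to shared weight: {spread:.4f}")
```

Running the sketch shows the spread around the group mean shrinking to a small noise floor set by eta, which is the sense in which the noise both drives and is canceled by the anti-Hebbian update.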
