Authors & Affiliations
Li Ji-An, Marcus Benna
Abstract
Backpropagation has been highly successful in training artificial neural networks for deep learning. In contrast, the brain's credit assignment mechanism remains elusive, and implementing effective credit assignment algorithms for multilayer networks in a biologically plausible manner has been challenging. Due to the locality constraint of synaptic plasticity, synaptic weights can only be updated using signals available at the presynaptic and postsynaptic neurons, without access to information from unconnected neurons. How a biological neuron simultaneously encodes and transmits feedforward predictions and error signals remains unclear. Here we propose a neuronal frequency-multiplexing framework. In our model, a neuron has a basal dendritic compartment encoding feedforward (bottom-up) prediction signals and an apical dendritic compartment encoding feedback (top-down) error signals. Predictions are represented as low-frequency, direct-current (DC) components of neuronal activations; the membrane potential of the basal dendrites acts as a low-pass filter that extracts these predictions from the feedforward inputs. Errors are represented as high-frequency, oscillatory components of neuronal activations. The combination of low-frequency predictions and high-frequency errors reaches the soma, from which it can be propagated as a neuronal activation to both upstream and downstream neurons. Thus, a single neuron can simultaneously encode and transmit prediction and error signals by multiplexing them in the frequency domain. We demonstrate that this approach closely approximates backpropagation in fully connected neural networks trained on the MNIST dataset and in convolutional neural networks trained on the CIFAR-10 dataset. Overall, we present a new perspective on the role of oscillatory signals in the brain by showing that they offer a potential mechanism for biologically plausible credit assignment.
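To make the multiplexing mechanism concrete, the following minimal sketch (an illustration under assumed parameters, not the paper's implementation) shows the core idea: a single somatic signal sums a slowly varying prediction with a fast oscillatory error, and an exponential low-pass filter, standing in for the dendritic membrane potential, recovers the prediction, while the residual recovers the error. The filter form, time constant, frequencies, and amplitudes are all arbitrary choices for demonstration.

import numpy as np

def low_pass(signal, tau=20.0, dt=1.0):
    """Exponential (leaky-integrator) low-pass filter with time constant tau."""
    alpha = dt / tau
    out = np.empty_like(signal)
    v = signal[0]
    for i, s in enumerate(signal):
        v += alpha * (s - v)  # membrane-potential-like leaky integration
        out[i] = v
    return out

rng = np.random.default_rng(0)
t = np.arange(0.0, 500.0, 1.0)                    # discrete time steps

prediction = 0.8 * np.ones_like(t)                # low-frequency (DC) prediction
error = 0.2 * np.sin(2 * np.pi * 0.1 * t)         # high-frequency oscillatory error
soma = prediction + error + 0.01 * rng.standard_normal(t.size)  # multiplexed activation

# Demultiplex: the low-pass output approximates the prediction,
# and the residual (the high-frequency component) approximates the error.
pred_hat = low_pass(soma)
err_hat = soma - pred_hat

print("prediction recovery error:", np.max(np.abs(pred_hat[100:] - prediction[100:])))
print("error-signal correlation:", np.corrcoef(err_hat[100:], error[100:])[0, 1])

With tau = 20 steps, the filter's cutoff lies well below the 0.1-cycles-per-step oscillation, so the DC prediction passes through nearly unattenuated while most of the error signal is rejected into the residual; the first ~100 steps are discarded to skip the filter's initial transient.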