ePoster

Learning Hebbian/Anti-Hebbian networks in continuous time

Henrique Reis Aguiar, Matthias Hennig
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany

Abstract

The brain computes internal representations by applying highly recurrent dynamics to feed-forward input. Such dynamics may be viewed as analogous to the inference step in latent variable models, where one typically follows the gradient of the latent posterior distribution until a stable state is reached. At the stable state, when this gradient has approximately zero norm, a parameter optimization step can be applied to slightly increase the likelihood of the current sample under the model. Here we suggest that this procedure, which closely matches the expectation-maximization (EM) algorithm, can be implemented in a network with lateral inhibition and Hebbian plasticity, leading it to learn factorized representations of the input data. For instance, when trained on natural images this network learns an encoding in terms of edges, similar to the one found in the visual cortex. However, in its pure form, this neural EM-like procedure requires carefully timing the plasticity so that it is applied only at the stable state, which precludes the use of ongoing Hebbian synaptic plasticity. We present a modification of Hebbian plasticity that allows factorized representation learning without waiting for the recurrent dynamics to settle. We achieve this by adding separate dynamics that restrict plasticity events to periods of high postsynaptic activity. Overall, we show how fundamental representation learning capabilities can be achieved in recurrent neuronal networks through biologically plausible mechanisms.
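
As a rough illustration of the scheme the abstract outlines (the poster's actual equations are not given here), the following NumPy sketch contrasts the two regimes: a two-phase EM-like update applied only after the recurrent dynamics have settled, and a continuous-time variant in which plasticity runs alongside the dynamics but is gated by high postsynaptic activity. The network form (rectified linear units with lateral inhibition), the Oja-style weight decay, the threshold gate, and all parameter values are assumptions made for this sketch, not the authors' method.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants, not taken from the poster.
d, k = 64, 16              # input and latent dimensionality
dt, tau = 0.1, 1.0         # Euler step size and neural time constant
eta_w, eta_m = 1e-2, 1e-2  # Hebbian / anti-Hebbian learning rates
theta = 0.5                # postsynaptic gate threshold (hypothetical)

W = 0.1 * rng.standard_normal((k, d))  # feed-forward weights (Hebbian)
M = np.zeros((k, k))                   # lateral inhibition (anti-Hebbian)


def settle_then_learn(x, n_steps=200):
    """Two-phase EM-like scheme: run the recurrent dynamics
    tau * dy/dt = -y + relu(W x - M y) to a fixed point (inference),
    then apply a single plasticity step (parameter update)."""
    global W, M
    y = np.zeros(k)
    for _ in range(n_steps):
        y += (dt / tau) * (-y + np.maximum(0.0, W @ x - M @ y))
    # Hebbian update with an Oja-style decay term to keep weights bounded.
    W += eta_w * (np.outer(y, x) - (y ** 2)[:, None] * W)
    # Anti-Hebbian update: co-active units come to inhibit each other more.
    M += eta_m * np.outer(y, y)
    np.fill_diagonal(M, 0.0)
    np.clip(M, 0.0, None, out=M)
    return y


def learn_in_continuous_time(x, n_steps=200):
    """Ongoing plasticity: weights change at every integration step, but a
    postsynaptic-activity gate suppresses updates far from the attractor."""
    global W, M
    y = np.zeros(k)
    for _ in range(n_steps):
        y += (dt / tau) * (-y + np.maximum(0.0, W @ x - M @ y))
        gate = (y > theta).astype(float)  # plasticity only for highly active units
        gy = gate * y
        W += dt * eta_w * (np.outer(gy, x) - (gy * y)[:, None] * W)
        M += dt * eta_m * np.outer(gy, y)
        np.fill_diagonal(M, 0.0)
        np.clip(M, 0.0, None, out=M)
    return y

In this sketch the threshold gate stands in for the poster's separate gating dynamics: the intent is that early in the trajectory most units sit below threshold, so weight changes are negligible, and effective learning concentrates near the attractor, approximating the settle-then-update schedule without explicit timing.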

Unique ID: bernstein-24/learning-hebbiananti-hebbian-networks-7f69684f