ePoster

Synaptic Plasticity Mechanisms Enable Incremental Learning of Spatio-Temporal Activity Patterns

Mohammad Habibabadi, Lenny Müller, Klaus Pawelzik
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany


Abstract

Perception relies on spatio-temporal activity patterns, and computations within the brain plausibly generate and exploit temporal structure. It is, however, unknown how neuronal systems develop selectivity for spatio-temporal patterns. Furthermore, learning should preserve already acquired contents while becoming selective to new ones, and it is not known by what mechanisms synapses could stabilize previously stored memories of spatio-temporal patterns while remaining plastic and contributing to further learning.

Here, the mechanisms by which neurons can learn such patterns are first identified. Subsequently, the conditions for maintaining stable learned weight distributions during ongoing plasticity, even without the learned patterns in the input, are established. Under these conditions, new patterns can be learned while preserving previously learned weight vectors, allowing incremental learning. Finally, the algorithm is demonstrated to be capable of detecting patterns in both artificial and real data, including speech data.

In particular, a one-layer self-supervised neural network is presented that incrementally self-organizes its synaptic efficacies such that detection of new spatio-temporal spike patterns becomes possible while the existing selectivity for previously learned patterns is preserved. A plausible combination of Hebbian mechanisms, hetero-synaptic plasticity, and synaptic scaling is demonstrated to enable unsupervised learning of spatio-temporal input patterns by single neurons. In one-layer networks, acquisition of different patterns one after the other is achieved by including pre-synaptic hetero-synaptic plasticity, which enforces differentiation of the output neurons' selectivities. This alone, however, turned out not to be fully sufficient for incremental acquisition of patterns. Only if the spike patterns were not 'frozen' but stochastic (i.e., both jittered in time and generated by Poisson processes) did past memories persist despite ongoing learning of new patterns. This input variability is shown to select subsets of orthogonal weight vectors and to drive synaptic efficacies into a regime where synaptic scaling induces self-stabilization.

Thereby, this novel model provides an explanation for the stability of synapses related to pre-existing contents despite ongoing plasticity, and suggests how nervous systems could incrementally learn and exploit temporally modulated spatio-temporal Poisson rate codes.

Unique ID: bernstein-24/synaptic-plasticity-mechanisms-enable-0f9c2fa2