ePoster

Synaptic Upscaling Amplifies Chaotic Dynamics in Recurrent Networks of Rate Neurons

Farhad Razi, Fleur Zeldenrust
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany

Abstract

Increasing the strength of synaptic connections (synaptic upscaling) in networks of randomly connected neurons drives the internal dynamics from a stationary to a chaotic regime [1, 2]. By boosting chaotic activity, synaptic upscaling improves learning speed, accuracy, and robustness [3] and enhances the dimensionality of the internal dynamics in response to external stimuli [2]. Nonetheless, the direct link between synaptic upscaling, the dimensionality of network activity, and the enhanced computational capacity of neural networks in learning tasks remains unclear. We hypothesize that synaptic upscaling enriches the internal dynamics of recurrent networks by amplifying chaotic activity. The amplified chaotic activity enables the system to traverse a wider region of phase space, increasing the dimensionality of the network activity, as measured by a higher participation ratio. This enhancement expands the neural representational space available for encoding information about external inputs. An enlarged representational space reduces overlap among the neural subspaces dedicated to encoding different inputs, especially inputs with shared attributes. Consequently, the richer representation in the recurrent network facilitates decoding by the readout networks during learning tasks, ultimately yielding higher learning efficiency.

As a first step, we investigate the impact of synaptic upscaling on chaotic dynamics. Prior work by Ostojic [2] used autocorrelation functions of firing rates as a proxy measure for chaotic activity. In this study, we employ the Lyapunov exponent, a direct measure of chaotic dynamics that quantifies the divergence of nearby trajectories in phase space. Using a recurrent network of rate neurons (Fig. 1a), we demonstrate that as synaptic upscaling increases, the network transitions from a stable attractor to a strange attractor (Fig. 1b). We find that increasing synaptic upscaling enhances the divergence of neighboring trajectories (Fig. 1c), resulting in a larger Lyapunov exponent (Fig. 1d). Next, we will examine the relationship between chaotic activity, the network's dimensionality, information encoding, and learning tasks.
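The abstract does not specify the exact network equations. A minimal sketch, assuming the classic random rate model of the Sompolinsky-Crisanti-Sommers type used in this literature, where the gain g plays the role of the synaptic upscaling factor; the network size N, the tanh transfer function, and the Euler integration step are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def simulate_rate_network(g, N=1000, T=50.0, dt=0.1, seed=0):
    """Euler-integrate dx/dt = -x + g * J @ tanh(x) and return the trajectory."""
    rng = np.random.default_rng(seed)
    # Random coupling with variance 1/N; g is the synaptic upscaling factor.
    # In this model the origin is a stable fixed point for g < 1, and the
    # dynamics become chaotic for g > 1.
    J = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)
    x = rng.normal(0.0, 1.0, size=N)
    steps = int(T / dt)
    traj = np.empty((steps, N))
    for t in range(steps):
        x = x + dt * (-x + g * (J @ np.tanh(x)))
        traj[t] = x
    return traj
```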
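One standard way to estimate the largest Lyapunov exponent from the divergence of nearby trajectories, as described above, is Benettin's two-trajectory method with periodic renormalization. The sketch below applies it to the same toy model; the perturbation size d0, the renormalization interval, and the burn-in time are illustrative choices:

```python
import numpy as np

def largest_lyapunov_exponent(g, N=1000, T=200.0, dt=0.1,
                              d0=1e-8, renorm_every=10, seed=0):
    """Benettin-style estimate of the largest Lyapunov exponent: evolve a
    reference and a perturbed trajectory, periodically rescale their
    separation back to d0, and average the accumulated log stretching."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)

    def step(x):
        return x + dt * (-x + g * (J @ np.tanh(x)))

    x = rng.normal(size=N)
    for _ in range(int(50.0 / dt)):      # burn-in onto the attractor
        x = step(x)
    v = rng.normal(size=N)
    y = x + d0 * v / np.linalg.norm(v)   # perturbed twin at distance d0
    log_sum, steps = 0.0, int(T / dt)
    for t in range(1, steps + 1):
        x, y = step(x), step(y)
        if t % renorm_every == 0:
            d = np.linalg.norm(y - x)
            log_sum += np.log(d / d0)
            y = x + (y - x) * (d0 / d)   # renormalize the separation
    return log_sum / (steps * dt)
```

Under these assumptions, sweeping g and plotting the estimate should show the exponent becoming positive near g ≈ 1, consistent with the stationary-to-chaotic transition described above.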
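The participation ratio mentioned above is commonly defined from the eigenvalues λ_i of the activity covariance matrix as PR = (Σ_i λ_i)² / Σ_i λ_i². A sketch under that standard definition:

```python
import numpy as np

def participation_ratio(traj):
    """PR = (sum_i lam_i)^2 / sum_i lam_i^2, where lam_i are eigenvalues of
    the covariance of the network activity (traj has shape time x neurons)."""
    centered = traj - traj.mean(axis=0)
    # Squared singular values of the centered trajectory are proportional to
    # the covariance eigenvalues; the proportionality constant cancels in PR.
    s = np.linalg.svd(centered, compute_uv=False)
    lam = s ** 2
    return lam.sum() ** 2 / (lam ** 2).sum()
```

Comparing participation_ratio(simulate_rate_network(g)) below and above the transition is one way to probe the hypothesized link between chaotic activity and dimensionality.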

Unique ID: bernstein-24/synaptic-upscaling-amplifies-chaotic-3707018f