ePoster

Slow transition to chaos and robust reservoir computing in recurrent neural networks with heavy-tailed distributed synaptic weights

Yi Xie, Stefan Mihalas, Lukasz Kusmierz
COSYNE 2025 (2025)
Montreal, Canada

Abstract

The synaptic connection strengths in the brain largely follow a heavy-tailed distribution, whereas most theoretical frameworks assume a Gaussian distribution for mathematical tractability. Understanding how heavy-tailed weight distributions shape neural dynamics is crucial for developing more biologically realistic models and theories that better capture the brain's properties. To bridge this gap, we provide both theoretical and computational insights into the dynamics induced by heavy-tailed synaptic weights, through mathematical derivations and case studies of recurrent neural networks (RNNs) in a reservoir computing setting. We leverage the Lévy $\alpha$-stable distribution, a generalized distribution whose stability parameter $\alpha$ controls the heaviness of the tails, with $\alpha = 2$ corresponding to the Gaussian. We mathematically demonstrate that a well-defined edge-of-chaos transition exists in any finite-size network with Lévy $\alpha$-stable weights; this is not the case in an infinite-size network, where any perturbation ultimately expands chaotically. Furthermore, we derive a general estimate of the mean location of the transition in terms of the distribution width, applicable to any finite network size. Our simulations of autonomous RNNs further show that heavier-tailed distributions exhibit a wider range of distribution widths near the edge of chaos, implying enhanced learning robustness with respect to exploding or vanishing gradients. As a proof of concept, we train the RNNs on an XOR task in a reservoir setup, demonstrating that heavy-tailed distributions induce performance that is robust to parameter changes such as the distribution width. Our preliminary results show a similar but more nuanced robustness phenomenon when the RNNs are trained and tested on low-dimensional tasks commonly studied in neuroscience. These findings suggest that heavy-tailed synaptic weight distributions may contribute to the brain's robustness and adaptability in learning, particularly in the face of synaptic changes and parameter variability.
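
The following Python sketch illustrates one way to locate the finite-size edge-of-chaos transition the abstract describes: sample i.i.d. Lévy $\alpha$-stable weights (via scipy.stats.levy_stable) and estimate the largest Lyapunov exponent of an autonomous RNN from the growth rate of a small perturbation. The update rule $x_{t+1} = \tanh(Wx_t)$, the $N^{-1/\alpha}$ weight scaling, and all parameter values are illustrative assumptions, not the authors' exact setup.

# Minimal sketch (not from the poster); names and values are illustrative.
import numpy as np
from scipy.stats import levy_stable

def lyapunov_estimate(alpha, width, N=200, T=500, eps=1e-8, seed=0):
    """Largest-Lyapunov-exponent estimate for an autonomous RNN whose
    weights are i.i.d. symmetric Levy alpha-stable (alpha=2 -> Gaussian)."""
    rng = np.random.default_rng(seed)
    # N**(-1/alpha) scaling keeps the dynamics comparable across network sizes.
    W = levy_stable.rvs(alpha, 0.0, scale=width * N ** (-1.0 / alpha),
                        size=(N, N), random_state=rng)
    x = rng.standard_normal(N)
    u = rng.standard_normal(N)
    y = x + eps * u / np.linalg.norm(u)   # perturbed copy at distance eps
    log_growth = 0.0
    for _ in range(T):
        x = np.tanh(W @ x)
        y = np.tanh(W @ y)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / eps)
        y = x + (eps / d) * (y - x)       # renormalize the perturbation
    return log_growth / T                 # > 0: chaotic; < 0: stable

# Sweep the distribution width; the sign change of the estimate marks the
# finite-size edge of chaos for each tail index alpha.
for alpha in (2.0, 1.5):
    for width in (0.5, 1.0, 1.5, 2.0):
        print(alpha, width, round(lyapunov_estimate(alpha, width), 3))

Comparing how slowly the estimate crosses zero as the width grows for $\alpha = 1.5$ versus $\alpha = 2.0$ gives a rough, hands-on analogue of the robustness comparison reported in the abstract.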

Unique ID: cosyne-25/slow-transition-chaos-robust-reservoir-966b8c43