ePoster

Input correlations impede suppression of chaos and learning in balanced rate networks

Rainer Engelken and 4 co-authors
COSYNE 2022
Mar 17, 2022
Lisbon, Portugal

Abstract

Cortical circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally generated chaotic variability, strongly depends on correlations in the input. One might expect that driving all neurons with a common input helps to control the network dynamics; surprisingly, we find that the network is far easier to control with independent inputs to each neuron. We discover that this discrepancy is explained by the dynamic cancellation of a common external input by recurrent feedback, an effect that is absent when inputs vary independently across neurons. We present a nonstationary dynamic mean-field theory that explains how autocorrelations and the largest Lyapunov exponent depend on input frequency, recurrent coupling strength, and network size, demonstrating that the discrepancy between common and independent input grows with network size and in the vicinity of the chaotic transition. Furthermore, we show that uncorrelated inputs facilitate learning in balanced networks.
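To make the setup concrete, here is a minimal sketch, not the authors' code, of the kind of experiment the abstract describes: a random rate network dx/dt = -x + g J tanh(x) + I(t) driven by a sinusoidal input that is either common to all neurons or given an independent random phase per neuron, with the largest Lyapunov exponent estimated by the standard Benettin two-trajectory method. All parameter values (N, g, input amplitude and frequency, integration step) are illustrative assumptions, not taken from the poster.

import numpy as np

rng = np.random.default_rng(0)
N, g = 200, 2.0                     # network size and coupling gain (illustrative)
dt, T_warm, T = 0.01, 50.0, 200.0   # Euler step, transient, measurement window
amp, freq = 1.0, 0.5                # input amplitude and frequency (illustrative)
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random recurrent coupling

def largest_lyapunov(common: bool) -> float:
    """Benettin estimate of the largest Lyapunov exponent of the driven network."""
    # Common input: identical phase for all neurons; independent: random phases.
    phases = np.zeros(N) if common else rng.uniform(0.0, 2.0 * np.pi, N)
    x = rng.normal(0.0, 1.0, N)

    def step(state, t):
        I = amp * np.sin(2.0 * np.pi * freq * t + phases)
        return state + dt * (-state + g * (J @ np.tanh(state)) + I)

    t = 0.0
    for _ in range(int(T_warm / dt)):           # discard the initial transient
        x = step(x, t); t += dt

    d0 = 1e-8
    v = rng.normal(0.0, 1.0, N); v /= np.linalg.norm(v)
    y = x + d0 * v                              # nearby companion trajectory
    log_growth = 0.0
    for _ in range(int(T / dt)):
        x, y = step(x, t), step(y, t); t += dt
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (y - x) * (d0 / d)              # renormalize the separation
    return log_growth / T

print("common input:     ", largest_lyapunov(common=True))
print("independent input:", largest_lyapunov(common=False))

Under the abstract's claim, independent input at matched drive should reduce the exponent more strongly than common input in the chaotic regime (g above the transition); the specific values returned by this sketch depend entirely on the illustrative parameters above.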
