ePoster

Input correlations impede suppression of chaos and learning in balanced rate networks

Rainer Engelken, Alessandro Ingrosso, Ramin Khajeh, Sven Goedeke, Larry Abbott
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 17, 2022

Abstract

Cortical circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control network activity. We show that in firing-rate networks in the balanced state, external control of the recurrent dynamics, i.e., the suppression of internally generated chaotic variability, depends strongly on correlations in the input. One might expect that driving all neurons with a common input would help control the network dynamics; surprisingly, we find that the network is far easier to control with independent inputs to each neuron. This discrepancy is explained by the dynamic cancellation of a common external input by recurrent feedback, an effect that is absent when inputs vary independently across neurons. We present a nonstationary dynamic mean-field theory that describes how autocorrelations and the largest Lyapunov exponent depend on input frequency, recurrent coupling strength, and network size, demonstrating that the discrepancy between common and independent inputs grows with network size and in the vicinity of the chaotic transition. Furthermore, we show that uncorrelated inputs facilitate learning in balanced networks.
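The input-correlation effect described in the abstract can be probed numerically. The sketch below is not the authors' code or model: it simulates a generic random rate network dx/dt = -x + J tanh(x) + I(t) (no excitatory/inhibitory balanced architecture) and estimates the largest Lyapunov exponent with a Benettin-style two-trajectory method, comparing a sinusoid delivered identically to all neurons against the same sinusoid with a random phase per neuron as a stand-in for the independent-input condition. All parameter values (N, g, A, omega, step sizes) are illustrative assumptions.

import numpy as np

def lyapunov_exponent(N=200, g=2.0, A=1.0, omega=2.0, common=True,
                      T=500.0, dt=0.05, warmup=50.0, seed=0):
    """Benettin two-trajectory estimate of the largest Lyapunov exponent
    for a sinusoidally driven random rate network
    dx/dt = -x + J tanh(x) + I(t)."""
    rng = np.random.default_rng(seed)
    # Random coupling with variance g^2/N; g > 1 makes the undriven
    # tanh network chaotic.
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    # Common drive: identical sinusoid for every neuron. "Independent"
    # drive: same amplitude and frequency, random phase per neuron
    # (an assumption standing in for the paper's independent inputs).
    phases = np.zeros(N) if common else rng.uniform(0.0, 2.0 * np.pi, N)

    def f(x, t):
        return -x + J @ np.tanh(x) + A * np.sin(omega * t + phases)

    # Discard the initial transient.
    x = rng.normal(0.0, 1.0, N)
    t = 0.0
    for _ in range(int(warmup / dt)):
        x = x + dt * f(x, t)
        t += dt

    # Companion trajectory at a tiny fixed separation d0.
    d0 = 1e-7
    v = rng.normal(0.0, 1.0, N)
    y = x + d0 * v / np.linalg.norm(v)

    n_steps = int(T / dt)
    log_growth = 0.0
    for _ in range(n_steps):
        x = x + dt * f(x, t)  # forward-Euler step, both trajectories
        y = y + dt * f(y, t)
        t += dt
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (d0 / d) * (y - x)  # renormalize the separation to d0
    return log_growth / (n_steps * dt)

if __name__ == "__main__":
    for common in (True, False):
        lam = lyapunov_exponent(common=common)
        kind = "common" if common else "independent"
        print(f"{kind:11s} input: largest Lyapunov exponent ~ {lam:+.3f}")

Under the abstract's result, the independent-input condition should push the exponent negative (chaos suppressed) at drive amplitudes where the common-input condition remains chaotic; in this simplified model the sign and size of that gap will depend on the assumed parameters.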

Unique ID: cosyne-22/input-correlations-impede-suppression-27c833da